Thursday, December 27, 2012

World Cup 2010 Celebration Inspiration (Marco Tardelli)

Check out this video on YouTube:

Sent from my iPhone

Joyce Lee Malcolm: Two Cautionary Tales of Gun Control


Americans are determined that massacres such as happened in Newtown, Conn., never happen again. But how? Many advocate more effective treatment of mentally ill people or armed protection in so-called gun-free zones. Many others demand stricter control of firearms.

We aren't alone in facing this problem. Great Britain and Australia, for example, suffered mass shootings in the 1980s and 1990s. Both countries had very stringent gun laws when they occurred. Nevertheless, both decided that even stricter control of guns was the answer. Their experiences can be instructive.

In 1987, Michael Ryan went on a shooting spree in his small town of Hungerford, England, killing 16 people (including his mother) and wounding another 14 before shooting himself. Since the public was unarmed—as were the police—Ryan wandered the streets for eight hours with two semiautomatic rifles and a handgun before anyone with a firearm was able to come to the rescue.

Nine years later, in March 1996, Thomas Hamilton, a man known to be mentally unstable, walked into a primary school in the Scottish town of Dunblane and shot 16 young children and their teacher. He wounded 10 other children and three other teachers before taking his own life.


Since 1920, anyone in Britain wanting a handgun had to obtain a certificate from his local police stating he was fit to own a weapon and had good reason to have one. Over the years, the definition of "good reason" gradually narrowed. By 1969, self-defense was never a good reason for a permit.

After Hungerford, the British government banned semiautomatic rifles and brought shotguns—the last type of firearm that could be purchased with a simple show of fitness—under controls similar to those in place for pistols and rifles. Magazines were limited to two shells with a third in the chamber.

Dunblane had a more dramatic impact. Hamilton had a firearm certificate, although according to the rules he should not have been granted one. A media frenzy coupled with an emotional campaign by parents of Dunblane resulted in the Firearms (Amendment) Acts of 1997, which instituted a nearly complete ban on handguns. Owners of pistols were required to turn them in. The penalty for illegal possession of a pistol is up to 10 years in prison.

The results have not been what proponents of the act wanted. Within a decade of the handgun ban and the confiscation of handguns from registered owners, crime with handguns had doubled according to British government crime reports. Gun crime, not a serious problem in the past, now is. Armed street gangs have some British police carrying guns for the first time. Moreover, another massacre occurred in June 2010. Derrick Bird, a taxi driver in Cumbria, shot his brother and a colleague then drove off through rural villages killing 12 people and injuring 11 more before killing himself.

Meanwhile, law-abiding citizens who have come into the possession of a firearm, even accidentally, have been harshly treated. In 2009 a former soldier, Paul Clarke, found a bag in his garden containing a shotgun. He brought it to the police station and was immediately handcuffed and charged with possession of the gun. At his trial the judge noted: "In law there is no dispute that Mr. Clarke has no defence to this charge. The intention of anybody possessing a firearm is irrelevant." Mr. Clarke was sentenced to five years in prison. A public outcry eventually won his release.

In November of this year, Danny Nightingale, a member of a British special forces unit that served in Iraq and Afghanistan, was sentenced to 18 months in military prison for possession of a pistol and ammunition. Sgt. Nightingale was given the Glock pistol as a gift by Iraqi forces he had been training. It was packed up with his possessions and returned to him by colleagues in Iraq after he left the country to organize a funeral for two close friends killed in action. Sgt. Nightingale pleaded guilty to avoid a five-year sentence and was in prison until an appeal and public outcry freed him on Nov. 29.


Six weeks after the Dunblane massacre in 1996, Martin Bryant, an Australian with a lifelong history of violence, attacked tourists at a Port Arthur prison site in Tasmania with two semiautomatic rifles. He killed 35 people and wounded 21 others.

At the time, Australia's guns laws were stricter than the United Kingdom's. In lieu of the requirement in Britain that an applicant for permission to purchase a gun have a "good reason," Australia required a "genuine reason." Hunting and protecting crops from feral animals were genuine reasons—personal protection wasn't.

With new Prime Minister John Howard in the lead, Australia passed the National Firearms Agreement, banning all semiautomatic rifles and semiautomatic and pump-action shotguns and imposing a more restrictive licensing system on other firearms. The government also launched a forced buyback scheme to remove thousands of firearms from private hands. Between Oct. 1, 1996, and Sept. 30, 1997, the government purchased and destroyed more than 631,000 of the banned guns at a cost of $500 million.

To what end? While there has been much controversy over the result of the law and buyback, Peter Reuter and Jenny Mouzos, in a 2003 study published by the Brookings Institution, found that homicides "continued a modest decline" since 1997. They concluded that the impact of the National Firearms Agreement was "relatively small," with the daily rate of firearms homicides declining 3.2%.

According to their study, the use of handguns rather than long guns (rifles and shotguns) went up sharply, but only one out of 117 gun homicides in the two years following the 1996 National Firearms Agreement used a registered gun. Suicides with firearms went down but suicides by other means went up. They reported "a modest reduction in the severity" of massacres (four or more indiscriminate homicides) in the five years since the government weapons buyback. These involved knives, gas and arson rather than firearms.

In 2008, the Australian Institute of Criminology reported a decrease of 9% in homicides and a one-third decrease in armed robbery since the 1990s, but an increase of over 40% in assaults and 20% in sexual assaults.

What to conclude? Strict gun laws in Great Britain and Australia haven't made their people noticeably safer, nor have they prevented massacres. The two major countries held up as models for the U.S. don't provide much evidence that strict gun laws will solve our problems.

Ms. Malcolm, a professor of law at George Mason University Law School, is the author of several books, including "Guns and Violence: The English Experience" (Harvard, 2002).

A version of this article appeared December 27, 2012, on page A13 in the U.S. edition of The Wall Street Journal, with the headline: Two Cautionary Tales of Gun Control.


Thursday, December 20, 2012

Bookworm Room » Supporting gun-control is racist


In America today, especially in America's media, the worst thing you can call someone is "racist." In our Obama era, people who oppose Obama are racist; people who support the Constitution are racist; people who use the word "Chicago" are racist; people who comment about the president's lean physique (unless they're drooling female reporters) are racist; and — here's the kicker — people who oppose gun control are racist.

Why is opposing gun control racist?  Because blacks are proportionately much more likely to find themselves at the wrong end of the gun in America than are whites.  Even though blacks comprise only 13% of the population, in 2007 alone black death rates due to guns were more than twice white death rates.  Put another way:


Young black males die from gun violence at a rate 2.5 times higher than Latino males, and eight times higher than white males. Gun injuries are suffered by black teens at a rate ten times higher than white teens.

Guns are an extraordinary scourge within the black community — a fact that also explains why America's gun homicides are (a) high and (b) unequally spread geographically.  A vast proportion of gun crime is inner city crime.

Given these appalling statistics — black men being mowed down by a plague of bullets — liberals say that anyone who doesn't want to remove guns from the street is a fortiori a racist.  The logic is simple:  guns kill black people; Republicans, who are disproportionately white, resist any form of gun control; therefore Republicans hate black people.

The problem with the above liberal analysis is that it's an entirely false syllogism.  Here's the truth:  what's killing black men is gun control.  Black men live in dangerous areas and we have disarmed them.

Don't believe me?  The numbers back me up.


Chicago is now, and has long been, a deadly city for blacks. The sweetness and light of non-stop Democrat rule has done nothing to make it safer. At a certain point in this deadly trajectory, Chicago Progressives made an announcement: a lot of the people who died in Chicago died from gunshots. Thinking simplistically, they decided that the next step was to get rid of the guns. Chicago therefore enacted some of the most repressive gun-control laws in the nation. Had the Democrat logic been correct, the "homicide by gun" rate in Chicago should have plummeted in the wake of this legislation. As John Lott explains, the opposite was true:

Since late 1982, Chicago has banned the private ownership of handguns. Over the next 19 years, there were only three years where the murder rate was as low as when the ban started.

As shown in the forthcoming third edition of my book "More Guns, Less Crime," before the ban, Chicago's murder rate was falling relative to the nine other largest cities, the 50 largest cities, the five counties that border Cook county, as well as the U.S. as a whole. After the ban, Chicago's murder rate rose relative to all these other places.

In other words, banning guns killed black men.

Chicago is not anomalous.  Washington, D.C., showed precisely the same pattern.  Here's John Lott again, looking at the way the numbers are, rather than the way Progressives think they ought to be.  When Washington banned legal guns, murder rates (and that would be murder rates of black men) shot up:

Washington's murder rate soared after its handgun ban went into effect in early 1977 (there is only one year while the ban was in effect that the murder rate fell below the 1976 number and that happened many years later — in 1985). Its murder rate also rose relative to other cities. Washington's murder rate rose from 12 percent above the average for the 50 most populous cities in 1976 to 35 percent above the average in 1986.

In 2008, fed-up citizens, aware of their rights under the Second Amendment, sued.  A fairly conservative Supreme Court looked at the Washington, D.C., law, and concluded that it did indeed violate the Constitution.  At this point, with the gun-control ban lifted, if the liberals had been right, even more young black men should have died.  But, says Lott, the numbers showed that the opposite happened:

When the Heller case was decided, Washington's Mayor Adrian Fenty warned: "More handguns in the District of Columbia will only lead to more handgun violence." Knowing that Chicago's gun laws would soon face a similar legal challenge, Mayor Richard Daley was particularly vocal. The day that the Heller decision was handed down, Daley said that he and other mayors across the country were "outraged" by the decision and he predicted more deaths along with Wild West-style shootouts. Daley warned that people "are going to take a gun and they are going to end their lives in a family dispute."

But Armageddon never arrived. Quite the contrary, murders in Washington plummeted by an astounding 25 percent in 2009, dropping from 186 murders in 2008 to 140. That translates to a murder rate that is now down to 23.5 per 100,000 people, Washington's lowest since 1967. While other cities have also fared well over the last year, D.C.'s drop was several times greater than that for other similar sized cities. According to preliminary estimates by the FBI, nationwide murders fell by a relatively more modest 10 percent last year and by about 8 percent in other similarly sized cities of half a million to one million people (D.C.'s population count is at about 590,000).

In words of one syllable (for any Progressives reading this): If you take guns from the good guys, bad guys kill them.

In Washington, D.C., and in Chicago, both good guys and bad guys were black. When the Progressives told law-abiding black citizens that they would have to disarm, these same citizens died in greater numbers than before. Keep in mind, too, that the gun ban also affected the bad guys. Bans create black-market arms races, with the bad guys working hard to corner the market and, because they are working outside the law, having no compunction about killing their competition.

The negative effects of gun control transcend race, of course.  In England, the BBC expressed genuine surprise when statistics showed that, despite a repressive ban on all types of guns, gun deaths increased:

A new study suggests the use of handguns in crime rose by 40% in the two years after the weapons were banned.

The research, commissioned by the Countryside Alliance's Campaign for Shooting, has concluded that existing laws are targeting legitimate users of firearms rather than criminals.

The ban on ownership of handguns was introduced in 1997 as a result of the Dunblane massacre, when Thomas Hamilton opened fire at a primary school leaving 16 children and their teacher dead.

But the report suggests that despite the restrictions on ownership the use of handguns in crime is rising.

At this point, people who are not blinkered by statist ideology and an irrational fear of guns are saying "Well, duh!" But the Leftists, God bless 'em, are always driven by an unreasoning optimism that says, "If at first you don't succeed, try imposing even more state control and spending more money."

It doesn't seem to occur to these Ivy League geniuses that exerting more control and spending more money make sense only if the gun ban sort of succeeded in the first place by slightly lessening gun-related homicides.  In that case, maybe trying harder might create greater benefits. However, if your actual outcomes are the exact opposite of the intended outcomes, it might occur to any rational person that you're on the wrong track and should make a sharp, fast u-turn.


By the way, it's reasonable to believe that, in America, the increase in black deaths isn't a bug, it's a feature. Progressives have a long and ugly history of racism. In the years leading up to the Civil War, the Democrats were the party of slavery. In the years after the Civil War, right up until the Civil Rights Movement, they were the party of the KKK, lynchings, and Jim Crow. In the North, the Progressives weren't crude enough to agitate for lynching. They aimed for scientific eugenics, with Margaret Sanger's primary goal when it came to birth control being the eventual elimination of the black population. Abortion, which is one of Planned Parenthood's primary services, also has a disproportionate effect on blacks.

And that brings us to gun control, the genesis of which wasn't to prevent crime but was, instead, to disarm the black man:

The historical record provides compelling evidence that racism underlies gun control laws — and not in any subtle way. Throughout much of American history, gun control was openly stated as a method for keeping blacks and Hispanics "in their place," and to quiet the racial fears of whites. This paper is intended to provide a brief summary of this unholy alliance of gun control and racism, and to suggest that gun control laws should be regarded as "suspect ideas," analogous to the "suspect classifications" theory of discrimination already part of the American legal system.

Nowadays, I acquit modern Progressives of active genocide, but there's no doubt that, by following in the footsteps of their racist forebears, they are having equally racist outcomes.

It turns out that there's one way to stop gun-crime:  more legal guns that are in the hands of law-abiding, rather than law-breaking, citizens:

Luckily, some years ago, two famed economists, William Landes at the University of Chicago and John Lott at Yale, conducted a massive study of multiple victim public shootings in the United States between 1977 and 1995 to see how various legal changes affected their frequency and death toll.

Landes and Lott examined many of the very policies being proposed right now in response to the Connecticut massacre: waiting periods and background checks for guns, the death penalty and increased penalties for committing a crime with a gun.

None of these policies had any effect on the frequency of, or carnage from, multiple-victim shootings. (I note that they did not look at reforming our lax mental health laws, presumably because the ACLU is working to keep dangerous nuts on the street in all 50 states.)

Only one public policy has ever been shown to reduce the death rate from such crimes: concealed-carry laws.

Their study controlled for age, sex, race, unemployment, retirement, poverty rates, state population, murder arrest rates, violent crime rates, and on and on.

The effect of concealed-carry laws in deterring mass public shootings was even greater than the impact of such laws on the murder rate generally.

Concealed-carry works best in limiting the number of deaths in a mass murder scenario, but it also is very helpful in preventing crimes generally. A bad guy who thinks that the owner of House A might be armed is more likely to go to House B, as long as House B has this sign in the window:

[Image: "No guns" sign]

Of course, if both Houses A and B have this sign in the window, I bet the bad guy will go away altogether:

[Image: NRA sign: "Never mind dog, beware of owner"]

So, the next time some condescending or faux-outraged Progressive asks you how in the world you can oppose gun control, tell him that it's because, unlike him, you're not a racist.




Thoughts On Gun Control From The Late Paul Harvey

Wolf Howling
From Paul Harvey, written in 2000:

Are you considering backing gun control laws? Do you think that because you may not own a gun, the rights guaranteed by the Second Amendment don't matter?


- In 1929 the Soviet Union established gun control. From 1929 to 1953, approximately 20 million dissidents, unable to defend themselves, were rounded up and exterminated.

- In 1911, Turkey established gun control. From 1915-1917, 1.5 million Armenians, unable to defend themselves, were rounded up and exterminated.

- Germany established gun control in 1938 and from 1939 to 1945, 13 million Jews, gypsies, homosexuals, the mentally ill, and others, who were unable to defend themselves, were rounded up and exterminated.

- China established gun control in 1935. From 1948 to 1952, 20 million political dissidents, unable to defend themselves, were rounded up and exterminated.

- Guatemala established gun control in 1964. From 1964 to 1981, 100,000 Mayan Indians, unable to defend themselves, were rounded up and exterminated.

- Uganda established gun control in 1970. From 1971 to 1979, 300,000 Christians, unable to defend themselves, were rounded up and exterminated.

- Cambodia established gun control in 1956. From 1975 to 1977, one million "educated" people, unable to defend themselves, were rounded up and exterminated.

That places total victims who lost their lives because of gun control at approximately 56 million in the last century. Since we should learn from the mistakes of history, the next time someone talks in favor of gun control, find out which group of citizens they wish to have exterminated. . . .
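As a quick check on the arithmetic only, the figures quoted in the list above do sum to roughly 56 million. This is a minimal sketch; the individual counts are simply the claims made in the passage, reproduced here without independent verification:

```python
# Victim counts in millions, as claimed in the passage above
# (reproduced only to verify that they total roughly 56 million).
claimed_victims_millions = [
    20,    # Soviet Union, 1929-1953
    1.5,   # Turkey (Armenians), 1915-1917
    13,    # Germany, 1939-1945
    20,    # China, 1948-1952
    0.1,   # Guatemala, 1964-1981
    0.3,   # Uganda, 1971-1979
    1,     # Cambodia, 1975-1977
]
total = sum(claimed_victims_millions)
print(f"Total: {total:.1f} million")  # prints: Total: 55.9 million
```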

Put simply, gun control is a means of ensuring that targeted populations cannot defend themselves against government oppression. Indeed, in our nation, gun control started in states controlled by Democrats as a means of ensuring that the black population would not be armed.



Tuesday, December 18, 2012

RealClearMarkets - Enlisting EA Sports To Save Us From the Federal Reserve

Enlisting EA Sports To Save Us From the Federal Reserve

As most readers know, EA Sports is the wildly successful creator of home video games that allow the sports fan to "coach" his or her team of choice. Along somewhat similar lines, Fantasy Football leagues offer the same fan the chance to act as general manager.

The beauty of both is that any damage is limited to the individual. Whatever the vanity or ego of the participant, bold ideas about how to win at Madden 2012 or in the office fantasy pool don't harm the innocent.

The same can't be said for the vain actions of the Ben Bernanke-led Federal Reserve. Bernanke combines a stunning lack of self-awareness with a tragically arrogant sense of genius that has him performing myriad experiments backed by the deep-pocketed Fed; the difference here being that Bernanke's medicinals make us all ill.

Regarding last week's Fed meeting and the subsequent floating of the Fed's latest plan of action, frightened readers can at least be confident that it won't work. Implicit in Bernanke's promise that the Fed will cease buying bonds when the unemployment rate reaches 6.5% is the not-so-hidden truth that so long as Bernanke's at the Fed, quantitative easing will be the rule.

That's the case given the inescapable reality that there are no jobs without investment first. Just as capital gains taxes are a barrier to investment because they reduce any potential returns, the devaluation of the dollar that is the goal of quantitative easing is a similar tax on the investment that leads to company formation and hiring. Job-creating investors are buying future dollar income streams, and devaluation is a deterrent.

Put plainly, quantitative easing will continue because it's anti-job creation. At best, and this assumes the previously made predictions are wrong, 6.5% unemployment will be "achieved" by driving down the real cost of labor so much that hiring becomes incredibly cheap. In short, QE will only work insofar as it makes the paychecks earned from labor extraordinarily unattractive.

Considering the alleged wrinkle whereby Bernanke says he will tie QE to the rate of unemployment, here he's simply acknowledging what's always been true: he is wedded to the Phillips Curve. The latter model suggests that economic growth is the driver of inflation; Bernanke's point with 6.5% unemployment is that anything below it will stoke the flames of inflation.

The problem there is that Phillips Curve thinking is completely bogus; so bogus that any reader could with a few minutes of thought explain why low unemployment would in no way drive up inflation. If any reader is stuck, below is a quick explanation:

Whether I'm buying a movie ticket or a plane ticket, filling my car with gasoline made expensive by policies in favor of devaluation, or depositing an increasingly devalued paycheck at the bank, I no longer deal with a live human being. Thanks to innovations of the free-market variety, the wage pressures presumed by Bernanke's Phillips Curve model have been rendered meaningless.

Assuming no innovation, and the Fed's devaluations tautologically signal less innovation (remember, entrepreneurs are reliant on investment), there would still be no wage pressures wrought by low unemployment in the States. That is so precisely because the United States is not an island. Instead, producers here interact with the global labor force, and any labor shortages stateside will be erased by eager workers who don't reside in these fifty states. To offer up but one of countless examples, Boeing's 787 Dreamliner is presently being manufactured in seven different countries around the world.

The Bernanke Fed falsely believes that inflation is caused by too much growth, but the reality is the opposite. A strong dollar is the antithesis of inflation; it's what attracts investment, and investment is what brings unemployment down. One reason unemployment is high today has to do with the fact that the dollar is very weak (think inflation: gasoline and food costs are spiking for a reason), and the weak dollar once again is a repellent to job-creating investment.

Back to a non-sports way for EA Sports to save us from the Fed, how about manufacturing a video game for the central bankers and their enablers in the economics profession whose machinations so cruelly weigh on economic growth? Anything to occupy their minds so that they stop causing so much damage.

To keep them engaged, EA's designers could even tweak the game so that, quite unlike the real world where adolescent attempts to stimulate growth through the buying of bonds never work, in the EA competition they would. EA Sports is ultimately its own fantasy whereby someone good with a joystick can achieve outcomes that never reveal themselves on the field (think the Detroit Lions winning the Super Bowl), so why not craft EA Quantitative Easing in which the winners print money? Quantitative easing is surely the policy of the adolescent, adolescents reside in an "Everyone Gets a Trophy" world, so why not give the perennial losers at the Fed a chance to actually "win" something?

If so, the mad scientists of the economics profession who foist on us their juvenile musings about "credit creation" and growth will do so in a controlled setting whereby their immaturity will only harm them. Just as the unskilled Madden 2012 player will make the New England Patriots look bad without harming the actual Patriots, delusional economists who regularly make the simple (economics) difficult will no longer victimize all of us with their confusion.

Sadly, what's proposed here is a fantasy. Back in reality, the individuals who comprise the U.S. and global economy will continue to suffer Ben Bernanke's totally discredited solutions to what ails us. But let's not fret. Bernanke's failures are his undoing. From his naivete will eventually emerge monetary normalcy.


Monday, December 17, 2012




Why do people who favor gun-control call people who disagree with them murderers or accomplices to murder? Is that constructive?

Would any of the various proposals have actually prevented the tragedy that is the supposed reason for them?

When you say you hope that this event will finally change the debate, do you really mean that you hope you can use emotionalism and blood-libel-bullying to get your way on political issues that were losers in the past?

If you're a media member or politician, do you have armed security? Do you have a permit for a gun yourself? (I'm asking you, Dianne Feinstein!) If so, what makes your life more valuable than other people's?

Do you know the difference between an automatic weapon and a semi-automatic weapon? Do your public statements reflect that difference?

If guns cause murder, why have murder rates fallen as gun sales have skyrocketed?

Have you talked about "Fast and Furious?" Do you even know what it is? Do you care less when brown people die?

When you say that "we" need to change, how are you planning to change? Does your change involve any actual sacrifice on your part?

Let me know when you're ready to talk about these things. We'll have a conversation.

UPDATE: John Lucas emails:

Joe Scarborough, who claims to be a "proud NRA member," just said there is no reason to allow someone to have an "assault weapon" that shoots "30 rounds a second."

The ignorance is appalling.

Well, yes. It's MSNBC. But it is interesting that Scarborough — like Mark Shields and Rupert Murdoch — seems entirely ignorant of actual gun law. But to be fair, the National Firearms Act has only been around since 1934.


Sunday, December 16, 2012

The End of the University as We Know It - Nathan Harden - The American Interest Magazine


In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor's degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.

We've all heard plenty about the "college bubble" in recent years. Student loan debt is at an all-time high—an average of more than $23,000 per graduate by some counts—and tuition costs continue to rise at a rate far outpacing inflation, as they have for decades. Credential inflation is devaluing the college degree, making graduate degrees, and the greater debt required to pay for them, increasingly necessary for many people to maintain the standard of living they experienced growing up in their parents' homes. Students are defaulting on their loans at an unprecedented rate, too, partly a function of an economy short on entry-level professional positions. Yet, as with all bubbles, there's a persistent public belief in the value of something, and that faith in the college degree has kept demand high.

The figures are alarming, the anecdotes downright depressing. But the real story of the American higher-education bubble has little to do with individual students and their debts or employment problems. The most important part of the college bubble story—the one we will soon be hearing much more about—concerns the impending financial collapse of numerous private colleges and universities and the likely shrinkage of many public ones. And when that bubble bursts, it will end a system of higher education that, for all of its history, has been steeped in a culture of exclusivity. Then we'll see the birth of something entirely new as we accept one central and unavoidable fact: The college classroom is about to go virtual. 


We are all aware that the IT revolution is having an impact on education, but we tend to appreciate the changes in isolation, and at the margins. Very few have been able to exercise their imaginations to the point that they can perceive the systemic and structural changes ahead, and what they portend for the business models and social scripts that sustain the status quo. That is partly because the changes are threatening to many vested interests, but also partly because the human mind resists surrender to upheaval and the anxiety that tends to go with it. But resist or not, major change is coming. The live lecture will be replaced by streaming video. The administration of exams and exchange of coursework over the internet will become the norm. The push and pull of academic exchange will take place mainly in interactive online spaces, occupied by a new generation of tablet-toting, hyper-connected youth who already spend much of their lives online. Universities will extend their reach to students around the world, unbounded by geography or even by time zones. All of this will be on offer, too, at a fraction of the cost of a traditional college education.

How do I know this will happen? Because recent history shows us that the internet is a great destroyer of any traditional business that relies on the sale of information. The internet destroyed the livelihoods of traditional stock brokers and bonds salesmen by throwing open to everyone access to the proprietary information they used to sell. The same technology enabled bankers and financiers to develop new products and methods, but, as it turned out, the experience necessary to manage it all did not keep up. Prior to the Wall Street meltdown, it seemed absurd to think that storied financial institutions like Bear Stearns and Lehman Brothers could disappear seemingly overnight. Until it happened, almost no one believed such a thing was possible. Well, get ready to see the same thing happen to a university near you, and not for entirely dissimilar reasons. 

The higher-ed business is in for a lot of pain as a new era of creative destruction produces a merciless shakeout, separating those institutions that adapt and prosper from those that stall and die. Meanwhile, students themselves are in for a golden age, characterized by near-universal access to the highest quality teaching and scholarship at a minimal cost. The changes ahead will ultimately bring about the most beneficial, most efficient and most equitable access to education that the world has ever seen. There is much to be gained. We may lose the gothic arches, the bespectacled lecturers, dusty books lining the walls of labyrinthine libraries—wonderful images from higher education's past. But nostalgia won't stop the unsentimental beast of progress from wreaking havoc on old ways of doing things. If a faster, cheaper way of sharing information emerges, history shows us that it will quickly supplant what came before. People will not continue to pay tens of thousands of dollars for what technology allows them to get for free.

Technology will also bring future students an array of new choices about how to build and customize their educations. Power is shifting away from selective university admissions officers into the hands of educational consumers, who will soon have their choice of attending virtually any university in the world online. This will dramatically increase competition among universities. Prestigious institutions, especially those few extremely well-endowed ones with money to buffer and finance change, will be in a position to dominate this virtual, global educational marketplace. The bottom feeders—the for-profit colleges and low-level public and non-profit colleges—will disappear or turn into the equivalent of vocational training institutes. Universities of all ranks below the very top will engage each other in an all-out war of survival. In this war, big-budget universities carrying large transactional costs stand to lose the most. Smaller, more nimble institutions with sound leadership will do best. 


This past spring, Harvard and MIT got the attention of everyone in the higher ed business when they announced a new online education venture called edX. The new venture will make online versions of the universities' courses available to a virtually unlimited number of enrollees around the world. Think of the ramifications: Now anyone in the world with an internet connection can access the kind of high-level teaching and scholarship previously available only to a select group of the best and most privileged students. It's all part of a new breed of online courses known as "massive open online courses" (MOOCs), which are poised to forever change the way students learn and universities teach.

One of the biggest barriers to the mainstreaming of online education is the common assumption that students don't learn as well with computer-based instruction as they do with in-person instruction. There's nothing like the personal touch of being in a classroom with an actual professor, says the conventional wisdom, and that's true to some extent. Clearly, online education can't be superior in all respects to the in-person experience. Nor is there any point pretending that information is the same as knowledge, and that access to information is the same as the teaching function instrumental to turning the former into the latter. But researchers at Carnegie Mellon's Open Learning Initiative, who've been experimenting with computer-based learning for years, have found that when machine-guided learning is combined with traditional classroom instruction, students can learn material in half the time. Researchers at Ithaka S+R studied two groups of students—one group that received all instruction in person, and another group that received a mixture of traditional and computer-based instruction. The two groups did equally well on tests, but those who received the computer instruction were able to learn the same amount of material in 25 percent less time.

The real value of MOOCs is their scalability. Andrew Ng, a Stanford computer science professor and co-founder of an open-source web platform called Coursera (a for-profit version of edX), got into the MOOC business after he discovered that thousands of people were following his free Stanford courses online. He wanted to capitalize on the intense demand for high-quality, open-source online courses. A normal class Ng teaches at Stanford might enroll, at most, several hundred students. But in the fall of 2011 his online course in machine learning enrolled 100,000. "To reach that many students before", Ng explained to Thomas Friedman of the New York Times, "I would have had to teach my normal Stanford class for 250 years."

Based on the popularity of the MOOC offerings online so far, we know that open-source courses at elite universities have the potential to serve enormous "classes." An early MIT online course called "Circuits and Electronics" has attracted 120,000 registrants. Top schools like Yale, MIT and Stanford have been making streaming videos and podcasts of their courses available online for years, but MOOCs go beyond this to offer a full-blown interactive experience. Students can intermingle with faculty and with each other over a kind of higher-ed social network. Streaming lectures may be accompanied by short auto-graded quizzes. Students can post questions about course material to discuss with other students. These discussions unfold across time zones, 24 hours a day. In extremely large courses, students can vote questions up or down, so that the best questions rise to the top. It's like an educational amalgam of YouTube, Wikipedia and Facebook.

Among the chattering classes in higher ed, there is an increasing sense that we have reached a tipping point where new interactive web technology, coupled with widespread access to broadband internet service and increased student comfort interacting online, will send online education mainstream. It's easy to forget that only ten years ago Facebook didn't exist. Teens now approaching college age are members of the first generation to have grown up conducting a major part of their social lives online. They are prepared to engage with professors and students online in a way their predecessors weren't, and as time passes more and more professors are comfortable with the technology, too.

In the future, the primary platform for higher education may be a third-party website, not the university itself. What is emerging is a global marketplace where courses from numerous universities are available on a single website. Students can pick and choose the best offerings from each school; the university simply uploads the content. Coursera, for example, has formed agreements with Penn, Princeton, UC Berkeley, and the University of Michigan to manage these schools' forays into online education. On the non-profit side, MIT has been the nation's leader in pioneering open-source online education through its MITx platform, which launched last December and serves as the basis for the new edX platform. 


Hold on there a minute, you might object. Just as information is not the same as knowledge, and auto-access is not necessarily auto-didactics, so taking a bunch of random courses does not a coherent university education make. Mere exposure, too, doesn't guarantee that knowledge has been learned. In other words, what about the justifiable function of majors and credentials?

MIT is the first elite university to offer a credential for students who complete its free, open-source online courses. (The certificate of completion requires a small fee.) For the first time, students can do more than simply watch free lectures; they can gain a marketable credential—something that could help secure a raise or a better job. While edX won't offer traditional academic credits, Harvard and MIT have announced that "certificates of mastery" will be available for those who complete the online courses and can demonstrate knowledge of course material. The arrival of credentials, backed by respected universities, eliminates one of the last remaining obstacles to the widespread adoption of low-cost online education. Since edX is open source, Harvard and MIT expect other universities to adopt the same platform and contribute their own courses. And the two universities have put $60 million of their own money behind the project, making edX the most promising MOOC venture out there right now.

Anant Agarwal, an MIT computer science professor and edX's first president, told the Los Angeles Times, "MIT's and Harvard's mission is to provide affordable education to anybody who wants it." That's a very different mission than elite schools like Harvard and MIT have had for most of their existence. These schools have long focused on educating the elite—the smartest and, often, the wealthiest students in the world. But Agarwal's statement is an indication that, at some level, these institutions realize that the scalability and economic efficiency of online education allow for a new kind of mission for elite universities. Online education is forcing elite schools to re-examine their priorities. In the future, they will educate the masses as well as the select few. The leaders of Harvard and MIT have founded edX, undoubtedly, because they realize that these changes are afoot, even if they may not yet grasp just how profound those changes will be. 

And what about the social experience that is so important to college? Students can learn as much from their peers in informal settings as they do from their professors in formal ones. After college, networking with fellow alumni can lead to valuable career opportunities. Perhaps that is why, after the launch of edX, the presidents of both Harvard and MIT emphasized that their focus would remain on the traditional residential experience. "Online education is not an enemy of residential education", said MIT president Susan Hockfield.

Yet Hockfield's statement doesn't hold true for most less wealthy universities. Harvard and MIT's multi-billion dollar endowments enable them to support a residential college system alongside the virtually free online platforms of the future, but for other universities online education poses a real threat to the residential model. Why, after all, would someone pay tens of thousands of dollars to attend Nowhere State University when he or she can attend an online version of MIT or Harvard practically for free?

This is why those middle-tier universities that have spent tens or even hundreds of millions over the past few decades to offer students the Disneyland for Geeks experience are going to find themselves in real trouble. Along with luxury dorms and dining halls, vast athletic facilities, state-of-the-art game rooms, theaters and student centers have come layers of staff and non-teaching administrators, all of which drives up the cost of the college degree without enhancing student learning. The biggest mistake a non-ultra-elite university could make today is to spend lavishly to expand its physical space. Buying large swaths of land and erecting vast new buildings is an investment in the past, not the future. Smart universities should be investing in online technology and positioning themselves as leaders in the new frontier of open-source education. Creating the world's premier, credentialed open online education platform would be a major achievement for any university, and it would probably cost much less than building a new luxury dorm.

Even some elite universities may find themselves in trouble in this regard, despite their capacity, as noted, to retain the residential norm. In 2007 Princeton completed construction on a new $136 million luxury dormitory for its students—all part of an effort to expand its undergraduate enrollment. Last year Yale finalized plans to build new residential dormitories at a combined cost of $600 million. The expansion will increase the size of Yale's undergraduate population by about 1,000. The project is so expensive that Yale could actually buy a three-bedroom home in New Haven for every new student it is bringing in and still save $100 million. In New York City, Columbia stirred up controversy by seizing entire blocks of Harlem by force of eminent domain for a project with a $6.3 billion price tag. Not to be outdone, Columbia's downtown neighbor, NYU, announced plans to buy up six million square feet of debt-leveraged space in one of the most expensive real estate markets in the world, at an estimated cost of $6 billion. The University of Pennsylvania has for years been expanding all over West Philadelphia like an amoeba gone real-estate insane. What these universities are doing is pure folly, akin to building a compact disc factory in the late 1990s. They are investing in a model that is on its way to obsolescence. If these universities understood the changes that lie ahead, they would be selling off real estate, not buying it—unless they prefer being landlords to being educators.

Now, because the demand for college degrees is so high (whether for good reasons or not is not the question for the moment), and because students and the parents who love them are willing to take on massive debt in order to obtain those degrees, and because the government has been eager to make student loans easier to come by, these universities and others have, so far, been able to keep on building and raising prices. But what happens when a limited supply of a sought-after commodity suddenly becomes unlimited? Prices fall. Yet here, on the cusp of a new era of online education, that is a financial reality that few American universities are prepared to face.

The era of online education presents universities with a conflict of interests—the goal of educating the public on one hand, and the goal of making money on the other. As Burck Smith, CEO of the distance-learning company StraighterLine, has written, universities have "a public-sector mandate" but "a private-sector business model." In other words, raising revenues often trumps the interests of students. Most universities charge as much for their online courses as they do for their traditional classroom courses. They treat the savings of online education as a way to boost profit margins; they don't pass those savings along to students.

One potential source of cost savings for lower-rung colleges would be to draw from open-source courses offered by elite universities. Community colleges, for instance, could effectively outsource many of their courses via MOOCs, becoming, in effect, partial downstream aggregators of others' creations, more or less like newspapers have used wire services to make up for a decline in the number of reporters. They could then serve more students with fewer faculty, saving money for themselves and students. At a time when many public universities are facing stiff budget cuts and families are struggling to pay for their kids' educations, open-source online education looks like a promising way to reduce costs and increase the quality of instruction. Unfortunately, few college administrators are keen on slashing budgets, downsizing departments or taking other difficult steps to reduce costs. The past thirty years of constant tuition hikes at U.S. universities has shown us that much. 

The biggest obstacle to the rapid adoption of low-cost, open-source education in America is that many of the stakeholders make a very handsome living off the system as is. In 2009, 36 college presidents made more than $1 million. That's in the middle of a recession, when most campuses were facing severe budget cuts. This makes them rather conservative when it comes to the politics of higher education, in sharp contrast to their usual left-wing political bias in other areas. Reforming themselves out of business by rushing to provide low- and middle-income students credentials for free via open-source courses must be the last thing on those presidents' minds.

Nevertheless, competitive online offerings from other schools will eventually force these "non-profit" institutions to embrace the online model, even if the public interest alone won't. And state governments will put pressure on public institutions to adopt the new open-source model, once politicians become aware of the comparable quality, broad access and low cost it offers.


Considering the greater interactivity and global connectivity that future technology will afford, the gap between the online experience and the in-person experience will continue to close. For a long time now, the largest division within Harvard University has been the little-known Harvard Extension School, a degree-granting division within the Faculty of Arts and Sciences with minimal admissions standards and very low tuition that currently enrolls 13,000 students. The Extension School was founded for the egalitarian purpose of making the Harvard education available to the masses. Nevertheless, Harvard took measures to protect the exclusivity of its brand. The undergraduate degrees offered by the Extension School (Bachelor of Liberal Arts) are distinguished by name from the degrees the university awards through Harvard College (Bachelor of Arts). This model—one university, two types of degrees—offers a good template for Harvard's future, in which the old residential college model will operate parallel to the new online open-source model. The Extension School already offers more than 200 online courses for full academic credit.

Prestigious private institutions and flagship public universities will thrive in the open-source market, where students will be drawn to the schools with bigger names. This means, paradoxically, that prestigious universities, which will have the easiest time holding on to the old residential model, also have the most to gain under the new model. Elite universities that are among the first to offer robust academic programs online, with real credentials behind them, will be the winners in the coming higher-ed revolution.

There is, of course, the question of prestige, which implies selectivity. It's the primary way elite universities have distinguished themselves in the past. The harder it is to get in, the more prestigious a university appears. But limiting admissions to a select few makes little sense in the world of online education, where enrollment is no longer bounded by the number of seats in a classroom or the number of available dorm rooms. In the online world, the only concern is having enough faculty and staff on hand to review essays, or grade the tests that aren't automated, or to answer questions and monitor student progress online.

Certain valuable experiences will be lost in this new online era, as already noted. My own experience at Yale furnishes some specifics. Through its "Open Yale" initiative, Yale has been recording its lecture courses for several years now, making them available to the public free of charge. Anyone with an internet connection can go online and watch some of the same lectures I attended as a Yale undergrad. But that person won't get the social life, the long chats in the dining hall, the feeling of collegiality, the trips around Long Island Sound with the sailing team, the concerts, the iron-sharpens-iron debates around the seminar table, the rare book library, or the famous guest lecturers (although some of those events are streamed online, too). On the other hand, you can watch me and my fellow students take the stage to demonstrate a Hoplite phalanx in Donald Kagan's class on ancient Greek history. You can take a virtual seat next to me in one of Giuseppe Mazzotta's unforgettable lectures on The Divine Comedy.

So while it can never duplicate the experience of a student with the good fortune to get into Yale, this is an historically significant development. Anyone who can access the internet—at a public library, for instance—no matter how poor or disadvantaged or isolated or uneducated he or she may be, can access the teachings of some of the greatest scholars of our time through open course portals. Technology is a great equalizer. Not everyone is willing or able to take advantage of these kinds of resources, but for those who are, the opportunity is there. As a society, we are experiencing a broadening of access to education equal in significance to the invention of the printing press, the public library or the public school.


Online education is like using online dating websites—fifteen years ago it was considered a poor substitute for the real thing, even creepy; now it's ubiquitous. Online education used to have a stigma, as if it were inherently less rigorous or less effective. Eventually for-profit colleges and public universities, which had less to lose in terms of snob appeal, led the charge in bringing online education into the mainstream. It's very common today for public universities to offer a menu of online courses to supplement traditional courses. Students can be enrolled in both types of courses simultaneously, and can sometimes even be enrolled in traditional classes at one university while taking an online course at another.

The open-source marketplace promises to offer students additional choices in the way they build their credentials. Colleges have long placed numerous restrictions on the number of credits a student can transfer in from an outside institution. In many cases, these restrictions appear useful for little more than protecting the university's bottom line. The open-source model will offer much more flexibility, while still maintaining the structure of a major en route to obtaining a credential. Students who aren't interested in pursuing a traditional four-year degree, or in having any major at all, will be able to earn meaningful credentials one class at a time.

To borrow an analogy from the music industry, universities have previously sold education in an "album" package—the four-year bachelor's degree in a certain major, usually coupled with a core curriculum. The trend for the future will be more compact, targeted educational certificates and credits, which students will be able to pick and choose from to create their own academic portfolios. Take a math class from MIT, an engineering class from Purdue, perhaps with a course in environmental law from Yale, and create interdisciplinary education targeted to one's own interests and career goals. Employers will be able to identify students who have done well in specific courses that match their needs. When people submit résumés to potential employers, they could include a list of these individual courses, and their achievement in them, rather than simply reference a degree and overall GPA. The legitimacy of MOOCs in the eyes of employers will grow, then, as respected universities take the lead in offering open courses with meaningful credentials.

MOOCs will also be a great remedy to the increasing need for continuing education. It's worth noting that while the four-year residential experience is what many of us picture when we think of "college", the residential college experience has already become one that only a minority of the nation's students enjoy. Adult returning students now make up a large share of those attending university, and non-traditional students account for 40 percent of all college students. Together with commuting students, or others taking classes online, they show that the traditional residential college experience is something many students either can't afford or don't want. The for-profit colleges, which often cater to working adult students with a combination of night and weekend classes and online coursework, have tapped into the massive demand for practical and customized education. It's a sign of what is to come.


What about the destruction these changes will cause? Think again of the music industry analogy. Today, when you drive down music row in Nashville, a street formerly dominated by the offices of record labels and music publishing companies, you see a lot of empty buildings and rental signs. The contraction in the music industry has been relentless since the MP3 and the iPod emerged. This isn't just because piracy is easier now; it's also because consumers have been given, for the first time, the opportunity to break the album down into individual songs. They can purchase the one or two songs they want and leave the rest. Higher education is about to become like that.

For nearly a thousand years the university system has looked just about the same: professors, classrooms, students in chairs. The lecture and the library have been at the center of it all. At its best, traditional classroom education offers the chance for intelligent and enthusiastic students to engage a professor and one another in debate and dialogue. But typical American college education rarely lives up to this ideal. Deep engagement with texts and passionate learning aren't the prevailing characteristics of most college classrooms today anyway. More common are grade inflation, poor student discipline, and apathetic teachers rubber-stamping students just to keep them paying tuition for one more term. 

If you ask students what they value most about the residential college experience, they'll often speak of the unique social experience it provides: the chance to live among one's peers and practice being independent in a sheltered environment, where many of life's daily necessities like cooking and cleaning are taken care of. It's not unlike what summer camp does at an earlier age. For some, college offers the chance to form meaningful friendships and explore unique extracurricular activities. Then, of course, there are the Animal House parties and hookups, which do take their toll: In their research for their book Academically Adrift, Richard Arum and Josipa Roksa found that 45 percent of the students they surveyed demonstrated no significant gains in learning after two years of college. Consider the possibility that, for the average student, traditional in-classroom university education has proven so ineffective that an online setting could scarcely be worse. But to recognize that would require unvarnished honesty about the present state of play. That's highly unlikely, especially coming from present university incumbents.

The open-source educational marketplace will give everyone access to the best universities in the world. This will inevitably spell disaster for colleges and universities that are perceived as second rate. Likewise, the most popular professors will enjoy massive influence as they teach vast global courses with registrants numbering in the hundreds of thousands (even though "most popular" may well equate to most entertaining rather than to most rigorous). Meanwhile, professors who are less popular, even if they are better but more demanding instructors, will be squeezed out. Fair or not, a reduction in the number of faculty needed to teach the world's students will result. For this reason, pursuing a Ph.D. in the liberal arts is one of the riskiest career moves one could make today. Because much of the teaching work can be scaled, automated or even duplicated by recording and replaying the same lecture over and over again on video, demand for instructors will decline. 

Who, then, will do all the research that we rely on universities to do if campuses shrink and the number of full-time faculty diminishes? And how will important research be funded? The news here is not necessarily bad, either: Large numbers of very intelligent and well-trained people may be freed up from teaching to do more of their own research and writing. A lot of top-notch research scientists and mathematicians are terrible teachers anyway. Grant-givers and universities with large endowments will bear a special responsibility to make sure important research continues, but the new environment in higher ed should actually help them to do that. Clearly some kinds of education, such as training heart surgeons, will always require a significant amount of in-person instruction.

Big changes are coming, and old attitudes and business models are set to collapse as new ones rise. Few who will be affected by the changes ahead are aware of what's coming. Severe financial contraction in the higher-ed industry is on the way, and for many this will spell hard times both financially and personally. But if our goal is educating as many students as possible, as well as possible, as affordably as possible, then the end of the university as we know it is nothing to fear. Indeed, it's something to celebrate. 


Wednesday, December 12, 2012

The robot economy and the new rentier class | FT Alphaville


It seems more top-tier economists are coming around to the idea that robots and technology could be having a greater influence on the economy (and this crisis in particular) than previously appreciated, with Paul Krugman being the latest.

But first a quick backgrounder on the debate so far (as tracked by us).

Probably the first high-profile advocate of the idea — in recent times — that "technology and computers were changing the economy in weird ways" was Alan Greenspan in the 1990s, when he attributed a mysterious lack of inflation, high productivity and low unemployment rate to the arrival of a technologically rich "New Economy".

As we've written before, once the tech bubble burst — and Greenspan was supposedly proved so very wrong — the whole idea of technology being a fundamental force in the real economy was abandoned. This is well illustrated by the sudden fall in references to technology in FOMC meetings (as tracked by us).

Apart from a few fringe voices, the technology factor — and its likely effect on the natural unemployment rate as society moves towards a more leisure-focused framework, since all the hard jobs are done by robots and computers — fell victim to a deathly silence in the world of serious economic thinking.

Indeed, when we first started considering the idea that technology could be behind the move to zero yields — with the crisis more a function of technology shifts than anything else (especially if you follow the Keynesian view that one day a leisure economy becomes inevitable) — there was barely anyone out there to cite on the matter, apart from the Skidelskys and advocates of the Singularity movement.

There has been more commentary since then. George Magnus at UBS, for example, wrote a noteworthy piece in September.

But there has also been commentary to the contrary. Most notably there's the view set out by Robert Gordon (and Peter Thiel) that the crisis was a function of a lack of innovation and technology.

This concept caught the imagination of a lot of people, bringing technology's influence back to the forefront, while also reviving the whole idea of "limits to growth" and us being near that limit point.

Harvard's Ken Rogoff recently debated this point of view with both Thiel and Gordon, but seemed to arrive at a different conclusion. As his op-ed set out last week:

There are certainly those who believe that the wellsprings of science are running dry, and that, when one looks closely, the latest gadgets and ideas driving global commerce are essentially derivative. But the vast majority of my scientist colleagues at top universities seem awfully excited about their projects in nanotechnology, neuroscience, and energy, among other cutting-edge fields. They think they are changing the world at a pace as rapid as we have ever seen. Frankly, when I think of stagnating innovation as an economist, I worry about how overweening monopolies stifle ideas, and how recent changes extending the validity of patents have exacerbated this problem.

We feel this is a hugely important point. For what Rogoff is saying is that if we are experiencing technology stagnation, it's not because humanity has suddenly become less innovative. Rather, it's because incumbent interests now have the biggest incentive ever to impose artificial scarcity, which is slowing the pace of innovation.

Our own personal view is that this is because we've now arrived at a point where technology begins to threaten return on capital, mostly by causing the sort of abundance that depresses prices to the point where many goods have no choice but to become free. This is related to the number of "free working" hours now being pumped into the economy, the result of crowdsourcing and rising productivity levels, thanks in part to the sort of gadgets that allow everyone to work anywhere and at any time, in a work environment that keeps speeding up as everyone tries to stay ahead of the competition by voluntarily doing yet more hours.

Patent wars, meanwhile, and the rise of companies whose entire raison d'être is the protection of patents, are the ultimate counter-force. As a recent Fed paper spelled out, there is real evidence to suggest that idea monopolisation has become a hugely counter-productive force in the economy.

We particularly enjoyed this opinion piece by Steven Levy at Wired Magazine on what he described as the emerging "patent problem".

As he explained:

The flaws of the patent system are most vividly exposed by the rise of trolls. The term, inspired by the stunted opportunists of myth, came from an Intel vice president who had been sued for calling a lawyer a "patent extortionist" and needed another expression. It refers to a company that doesn't make products but exists solely on the revenue of its patents. In the parlance of today's patent ecosystem, trolls are known as nonpracticing entities, or NPEs.

The rise of the patent troll, meanwhile, is well illustrated by the graphic accompanying the Wired piece.

Which brings us neatly to the latest offering on the technology factor, this time from Paul Krugman, who seems to have spent a large portion of the week thinking about the issue, with no fewer than two robot-themed postings.

But it's his latest one that presents the monopolisation effect best, as he considers what's been driving labour's share of non-farm business sector output downwards so rapidly:

But there's another possible resolution: monopoly power. Barry Lynn and Philip Longman have argued that we're seeing a rapid rise in market concentration and market power. The thing about market power is that it could simultaneously raise the average rents to capital and reduce the return on investment as perceived by corporations, which would now take into account the negative effects of capacity growth on their markups. So a rising-monopoly-power story would be one way to resolve the seeming paradox of rapidly rising profits and low real interest rates.

In our opinion that one paragraph explains today's reality perfectly.

So, robot and technology power is reducing the natural employment rate. But rather than subsidising those who have lost jobs to technology, so as to spread the manna wealth that has effectively dropped onto the surface of the earth at no one's expense, companies are using monopoly power to extort rents on the capital that is creating all that free wealth.

That's why inequality is rising.

As technology proceeds in a patent-obsessed world, the fruits of innovation flow to the owners of the capital and the inventions, forming a whole new rentier class. The financial assets and debts that back the innovation technology, meanwhile, become disproportionately valuable as their purchasing power gets completely out of whack with the output they radically accelerate.

If you think about it, inequality is always going to be the natural consequence of a technologically driven deflationary environment. In inflation, those with financial claims (a.k.a. money) are impoverished as their purchasing power is eroded, while those in debt are enriched. In deflation, those with financial claims (the result of increasing rentier flows, if Krugman's point is valid) are enriched, while those in debt become increasingly impoverished.
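The inflation-versus-deflation asymmetry described above is just compound-interest arithmetic. A minimal sketch (toy numbers, not from the article) of what happens to the purchasing power of a fixed nominal claim of 100 under a decade of 3 per cent inflation versus 3 per cent deflation:

```python
def real_value(nominal: float, annual_inflation: float, years: int) -> float:
    """Purchasing power of a fixed nominal claim after `years` of
    constant annual inflation (a negative rate means deflation)."""
    return nominal / (1 + annual_inflation) ** years

claim = 100.0
print(round(real_value(claim, 0.03, 10), 1))   # inflation erodes the claim: 74.4
print(round(real_value(claim, -0.03, 10), 1))  # deflation enriches it: 135.6
```

The holder of the claim does nothing in either case; only the price level moves, which is why deflation favours creditors and inflation favours debtors.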

In that sense, QE and any move to "debase" financial claims is an attempt to dilute the wealth effect on legacy claims, which now command a disproportionate share of available output, at least compared to what they did when they were created.

Low interest rates, in many ways, are thus the only self-correction mechanism bringing the system back into balance, trying to offset the growing power of the innovation-based capital rentier class.

In that context it's understandable that the older the claim, the more preferable it is to hoard, since the greater its call on today's output. And in an environment where such claims are self-correcting anyway (via the capital destruction brought on by negative rates, as people rush to invest in anything that gives them disproportionate access to output, and thus crowd each other out), it is an understandable consequence that some of the rentier class see it as logical to hoard non-perishable assets "which cannot be debased" instead.

Related links:
Robots! No Robots! – FT Alphaville
Ahhhh! No robots! – FT Alphaville
The Patent Problem – Wired
Whose idea is it anyway – Towards a Leisure Society
Beyond Scarcity – FT Alphaville (series)
Peter Diamandis: Abundance is our future – Ted Talks


Monday, December 3, 2012

In Entitlement America, The Head Of A Household Of Four Making Minimum Wage Has More Disposable Income Than A Family Making $60,000 A Year | ZeroHedge

In Entitlement America, The Head Of A Household Of Four Making Minimum Wage Has More Disposable Income Than A Family Making $60,000 A Year

Tonight's stunning financial pièce de résistance comes from Wyatt Emmerich of The Cleveland Current. In what is sure to inspire some serious ire among all those who once believed Ronald Reagan when he said it was the USSR that was the "Evil Empire", Emmerich analyzes disposable income and economic benefits among several key income classes and comes to the stunning (and verifiable) conclusion that "a one-parent family of three making $14,500 a year (minimum wage) has more disposable income than a family making $60,000 a year." And that excludes benefits from Supplemental Security Income disability checks. America is now a country which punishes those middle-class people who not only try to work hard, but avoid scamming the system. Not surprisingly, it is not only the richest and most audacious thieves that prosper - it is also the penny scammers at the very bottom of the economic ladder that rip off the middle class each and every day, courtesy of the world's most generous entitlement system. Perhaps if Reagan were alive today, he would wish to modify the object of his once legendary remark.

From Emmerich:

You can do as well working one week a month at minimum wage as you can working a $60,000-a-year, full-time, high-stress job.

My chart tells the story. It is pretty much self-explanatory.

Stunning? Just do it yourself.

Almost all welfare programs have Web sites where you can call up "benefits calculators." Just plug in your income and family size and, presto, your benefits are automatically calculated.

The chart is quite revealing. A one-parent family of three making $14,500 a year (minimum wage) has more disposable income than a family making $60,000 a year.

And if that wasn't enough, here is one that will blow your mind:

If the family provider works only one week a month at minimum wage, he or she makes 92 percent as much as a provider grossing $60,000 a year.
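The wage side of these figures is easy to check (the benefits side depends on Emmerich's chart, which is not reproduced here). A quick sketch using the 2012 federal minimum of $7.25/hour, which is presumably what underlies the article's gross-income assumptions:

```python
federal_min_wage = 7.25  # $/hour, federal minimum at the time of writing

# Full-time: 40 hours/week for 50 weeks -> the article's $14,500 figure
full_time_gross = federal_min_wage * 40 * 50
print(full_time_gross)  # 14500.0

# "One week a month": 40 hours/week for 12 weeks a year
one_week_a_month_gross = federal_min_wage * 40 * 12
print(one_week_a_month_gross)  # 3480.0
```

The 92 per cent claim therefore rests almost entirely on benefits rather than wages: the one-week-a-month worker grosses under $3,500, but retains Medicaid, child-care savings, and the other programs in the chart.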

Ever wonder why Obama was so focused on health reform? It is so that those who have no interest or ability in working make as much as representatives of America's once-exalted, and now merely endangered, middle class.

First of all, working one week a month saves big-time on child care. But the real big-ticket item is Medicaid, which has minimal deductibles and copays. By working only one week a month at a minimum-wage job, a provider is able to get total medical coverage for next to nothing.

Compare this to the family provider making $60,000 a year. Typical family coverage in Mississippi would cost around $12,000; deductibles and copays add an additional $4,500 or so to the bill. That's a huge hit.

There is a reason why a full-time worker may not be too excited to learn there is little to show for doing the "right thing."

The full-time $60,000-a-year job is going to be much more demanding than working one week a month at minimum wage. Presumably, the low-income parent will also have more energy to attend to the various stresses of managing a household.

It gets even scarier if one assumes a little dishonesty is thrown into the equation.

If the one-week-a-month worker maintains an unreported cash-only job on the side, the deal gets even better than a regular $60,000-a-year job. In this scenario, you maintain a reportable, payroll-deducted, low-income job for federal tax purposes. This allows you to easily establish your qualification for all these welfare programs. Then your black-market job gives you additional cash without interfering with your benefits. Some economists estimate there is $1 trillion in unreported income each year in the United States.

This really got me thinking. Just how much money could I get if I set out to deliberately scam the system? I soon realized that getting a low-paying minimum-wage job would set the stage for far more welfare benefits than you could earn in a real job, if you were willing to cheat. Even if you didn't cheat, you could do almost as well working one week a month at minimum wage as busting a gut at a $60,000-a-year job.

Now where it gets plainly out of control is if one throws in Supplemental Security Income.

SSI pays $8,088 per year for each "disabled" family member. A person can be deemed "disabled" if they are totally lacking in the cultural and educational skills needed to be employable in the workforce.

If you add $24,262 a year for three disability checks, the lowest paid welfare family would now have far more take-home income than the $60,000-a-year family.
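The SSI multiplication as quoted is slightly off: three checks at $8,088 each come to $24,264, not $24,262 (presumably a transcription slip in the original piece). The check:

```python
ssi_per_person = 8_088  # quoted annual SSI payment per "disabled" family member
three_checks = 3 * ssi_per_person
print(three_checks)  # 24264
```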

Best of all, being on welfare does not judge you if you are stupid enough not to take drugs all day, every day, to make some sense out of this Mephistophelian tragicomedy known as living in the USA:

Most private workplaces require drug testing, but there is no drug testing to get welfare checks.

Alas, on America's way to communist welfare, it has long since surpassed such bastions of capitalism as China:

The welfare system in communist China is far stingier. Those people have to work to eat.

We have been writing for over a year, how the very top of America's social order steals from the middle class each and every day. Now we finally know that the very bottom of the entitlement food chain also makes out like a bandit compared to that idiot American who actually works and pays their taxes. One can only also hope that in addition to seeing their disposable income be eaten away by a kleptocratic entitlement state, that the disappearing middle class is also selling off its weaponry. Because if it isn't, and if it finally decides it has had enough, the outcome will not be surprising at all: it will be the same old that has occurred in virtually every revolution in the history of the world to date.

h/t Nolsgrad


Is This Why Americans Have Lost The Drive To "Earn" More | ZeroHedge

Is This Why Americans Have Lost The Drive To "Earn" More

In the recent past we noted the somewhat startling reality that "the single mom is better off earning gross income of $29,000 with $57,327 in net income & benefits than to earn gross income of $69,000 with net income and benefits of $57,045." While mathematics is our tool - as opposed to the mathemagics of some of the more politically biased media who did not like our message - the painful reality in America is that for increasingly many Americans it is now more lucrative - in the form of actual disposable income - to sit, do nothing, and collect various welfare entitlements than to work. This is such an important topic that we felt it warranted a second look. The graphic below quite clearly, and very painfully, confirms that there is an earnings vacuum of around $40k in which US workers are perfectly ambivalent about putting in more effort, since it does not result in any incremental disposable income. With the ongoing 'fiscal cliff' battles over taxes and entitlements, this is a problematic finding, since - as a result - it is the US government that will have to keep indirectly funding this lost productivity and worker output (via wealth redistribution).
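The "earnings vacuum" is visible from the two quoted data points alone: the hypothetical single mom grosses $40,000 more in the second scenario, yet ends up with slightly less in net income and benefits.

```python
# Gross income -> net income + benefits, both figures as quoted above
scenarios = {29_000: 57_327, 69_000: 57_045}

extra_gross = 69_000 - 29_000
extra_net = scenarios[69_000] - scenarios[29_000]
print(extra_gross)  # 40000
print(extra_net)    # -282: $40k more gross buys $282 LESS disposable income
```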


As we noted before (details below):

We realize that this is a painful topic in a country in which the issue of welfare benefits, and cutting (or not) the spending side of the fiscal cliff, have become the two most sensitive social topics. Alas, none of that changes the matrix of incentives for most Americans who find themselves in a comparable situation: either stay on the left side of the US minimum wage and rely on benefits, or move to the right side at a far greater personal investment of work and energy, and... have the same disposable income at the end of the day.

Naturally, the topic of wealth redistribution is a paramount one now that America is entering the terminal phase of its out-of-control spending; its response of hiking taxes in a globalized, easily fungible world will merely force more of the uber-wealthy to find offshore tax jurisdictions, avoid US taxation altogether, and thus result in even lower budget revenues for the US. It explains why the cluelessly incompetent but supposedly impartial Congressional Budget Office just released a key paper titled "Share of Returns Filed by Low- and Moderate-Income Workers, by Marginal Tax Rate, Under 2012 Law" which carries a chart of disposable income by net income comparable to the one above.

But perhaps the scariest chart in the entire presentation is the following summarizing the unsustainable welfare burden on current taxpayers:

  • For every 1.65 employed persons in the private sector, 1 person receives welfare assistance
  • For every 1.25 employed persons in the private sector, 1 person receives welfare assistance or works for the government.

The punchline: 110 million privately employed workers; 88 million welfare recipients and government workers, and rising rapidly.
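The two bullet-point ratios and the headline counts are at least mutually consistent, which is a quick way to sanity-check the claim:

```python
private_workers = 110_000_000   # privately employed, per the punchline
welfare_and_govt = 88_000_000   # welfare recipients plus government workers

# 110m / 88m matches the quoted 1.25 ratio exactly
print(round(private_workers / welfare_and_govt, 2))  # 1.25

# the 1.65 ratio then implies roughly 66.7m welfare recipients alone,
# leaving ~21m government workers as the difference
implied_welfare_only = round(private_workers / 1.65)
print(implied_welfare_only)  # 66666667
```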

And since nothing has changed in the past two years, and in fact the situation has gotten progressively (pardon the pun) worse, here is our conclusion on this topic from two years ago:

We have been writing for over a year, how the very top of America's social order steals from the middle class each and every day. Now we finally know that the very bottom of the entitlement food chain also makes out like a bandit compared to that idiot American who actually works and pays their taxes. One can only also hope that in addition to seeing their disposable income be eaten away by a kleptocratic entitlement state, that the disappearing middle class is also selling off its weaponry. Because if it isn't, and if it finally decides it has had enough, the outcome will not be surprising at all: it will be the same old that has occurred in virtually every revolution in the history of the world to date.

But for now, just stick your head in the sand and pretend all is good. Self-deception is now the only thing left for the entire insolvent, entitlement-addicted world.

* * *

Full must-read presentation: "Welfare's Failure and the Solution"


Some other thoughts on this topic: DOES IT PAY, AT THE MARGIN, TO WORK AND SAVE?

