Archive for the ‘Uncategorized’ Category

Scandinavia Now

Wednesday, January 14th, 2015

A somewhat tongue-in-cheek — somewhat, but not completely — high-level overview of the homeland I used to want to visit, with most emphasis on Denmark and some held in reserve for Sweden, in the
NY Post (by way of Bird Dog at Maggie’s Farm).

Let’s look a little closer, suggests Michael Booth, a Brit who has lived in Denmark for many years, in his new book, “The Almost Nearly Perfect People: Behind the Myth of the Scandinavian Utopia” (Picador).

Those sky-high happiness surveys, it turns out, are mostly bunk. Asking people “Are you happy?” means different things in different cultures. In Japan, for instance, answering “Yes” seems like boasting, Booth points out. Whereas in Denmark, it’s considered “shameful to be unhappy,” newspaper editor Anne Knudsen says in the book.

Moreover, there is a group of people that believes the Danes are lying when they say they’re the happiest people on the planet. This group is known as “Danes.”
:
An American woman told Booth how, when she excitedly mentioned at a dinner party that her kid was first in his class at school, she was met with icy silence.

One of the country’s most widely known quirks is a satirist’s crafting of what’s still known as the Jante Law — the Ten Commandments of Buzzkill. “You shall not believe that you are someone,” goes one. “You shall not believe that you are as good as we are,” is another. Others include “You shall not believe that you are going to amount to anything,” “You shall not believe that you are more important than we are” and “You shall not laugh at us.”
:
In addition to paying enormous taxes — the total bill is 58 percent to 72 percent of income — Danes have to pay more for just about everything. Books are a luxury item. Their equivalent of the George Washington Bridge costs $45 to cross. Health care is free — which means you pay in time instead of money. Services are distributed only after endless stays in waiting rooms. Pharmacies are a state-run monopoly, which means getting an aspirin is like a trip to the DMV.
:
Scandinavia, as a wag in The Economist once put it, is a great place to be born — but only if you are average. The dead-on satire of Scandinavian mores “Together” is a 2000 movie by Sweden’s Lukas Moodysson set in a multi-family commune in 1975, when the groovy Social Democratic ideal was utterly unquestioned in Sweden.

In the film’s signature scene, a sensitive, apron-wearing man tells his niece and nephew as he is making breakfast, “You could say that we are like porridge. First we’re like small oat flakes — small, dry, fragile, alone. But then we’re cooked with the other oat flakes and become soft. We join so that one flake can’t be told apart from another. We’re almost dissolved. Together we become a big porridge that’s warm, tasty, and nutritious and yes, quite beautiful, too. So we are no longer small and isolated but we have become warm, soft and joined together. Part of something bigger than ourselves. Sometimes life feels like an enormous porridge, don’t you think?”

Then he spoons a great glutinous glob of tasteless starch onto the poor kids’ plates. That’s Scandinavia for you, folks: Bland, wholesome, individual-erasing mush. But, hey, at least we’re all united in being slowly digested by the system.

The News Junkie follows along, and adds a link:

Naturally enough, the architecture of the welfare state was designed and developed with European realities in mind, the most important of which were European beliefs about poverty. Thanks to their history of Old World feudalism, with its centuries of rigid class barriers and attendant lack of opportunity for mobility based on merit, Europeans held a powerful, continentally pervasive belief that ordinary people who found themselves in poverty or need were effectively stuck in it — and, no less important, that they were stuck through no fault of their own, but rather by an accident of birth. The state provision of old-age pensions, unemployment benefits, and health services — along with official family support and other household-income guarantees — served a multiplicity of purposes for European political economies, not the least of which was to assuage voters’ discontent with the perceived shortcomings of their countries’ social structures through a highly visible and explicitly political mechanism for broadly based and compensatory income redistribution.

But America’s historical experience has been rather different from Europe’s, and from the earliest days of the great American experiment, people in the United States exhibited strikingly different views from their trans-Atlantic cousins on the questions of poverty and social welfare. These differences were noted both by Americans themselves and by foreign visitors, not least among them Alexis de Tocqueville, whose conception of American exceptionalism was heavily influenced by the distinctive American worldview on such matters. Because America had no feudal past and no lingering aristocracy, poverty was not viewed as the result of an unalterable accident of birth but instead as a temporary challenge that could be overcome with determination and character — with enterprise, hard work, and grit. Rightly or wrongly, Americans viewed themselves as masters of their own fate, intensely proud because they were self-reliant.

To the American mind, poverty could never be regarded as a permanent condition for anyone in any stratum of society because of the country’s boundless possibilities for individual self-advancement. Self-reliance and personal initiative were, in this way of thinking, the critical factors in staying out of need. Generosity, too, was very much a part of that American ethos; the American impulse to lend a hand (sometimes a very generous hand) to neighbors in need of help was ingrained in the immigrant and settler traditions. But thanks to a strong underlying streak of Puritanism, Americans reflexively parsed the needy into two categories: what came to be called the deserving and the undeserving poor. To assist the former, the American prescription was community-based charity from its famously vibrant “voluntary associations.” The latter — men and women judged responsible for their own dire circumstances due to laziness, or drinking problems, or other behavior associated with flawed character — were seen as mainly needing assistance in “changing their ways.” In either case, charitable aid was typically envisioned as a temporary intervention to help good people get through a bad spell and back on their feet. Long-term dependence upon handouts was “pauperism,” an odious condition no self-respecting American would readily accept.

Right, the local widow and the town drunk. Both without means, one through no fault of her own, after a lifetime of doing what she was supposed to do; the other one impoverished by choice.

Suffice it to say, the United States arrived late to the 20th century’s entitlement party, and the hesitance to embrace the welfare state lingered on well after the Depression. As recently as the early 1960s, the “footprint” left on America’s GDP by the welfare state was not dramatically larger than it had been under Franklin Roosevelt — or Herbert Hoover, for that matter. In 1961, at the start of the Kennedy Administration, total government entitlement transfers to individual recipients accounted for a little less than 5% of GDP, as opposed to 2.5% of GDP in 1931 just before the New Deal. In 1963 — the year of Kennedy’s assassination — these entitlement transfers accounted for about 6% of total personal income in America, as against a bit less than 4% in 1936.

During the 1960s, however, America’s traditional aversion to the welfare state and all its works largely collapsed. President Johnson’s “War on Poverty” and his “Great Society” pledge of the same year ushered in a new era for America, in which Washington finally commenced in earnest the construction of a massive welfare state. In the decades that followed, America not only markedly expanded provision for current or past workers who qualified for benefits under existing “social insurance” arrangements, it also inaugurated a panoply of nationwide programs for “income maintenance” (food stamps, housing subsidies, Supplemental Social Security Insurance, and the like) where eligibility turned not on work history but on officially designated “poverty” status. The government also added health-care guarantees for retirees and the officially poor, with Medicare, Medicaid, and their accompaniments. In other words, Americans could claim, and obtain, an increasing trove of economic benefits from the government simply by dint of being a citizen; they were now incontestably entitled under law to some measure of transferred public bounty, thanks to our new “entitlement state.”

The expansion of the American welfare state remains very much a work in progress; the latest addition to that edifice is, of course, the Affordable Care Act. Despite its recent decades of rapid growth, the American welfare state may still look modest in scope and scale compared to some of its European counterparts. Nonetheless, over the past two generations, the remarkable growth of the entitlement state has radically transformed both the American government and the American way of life itself. It is not too much to call those changes revolutionary.
:
By 2012, the most recent year for such figures at this writing, Census Bureau estimates indicated that more than 150 million Americans, or a little more than 49% of the population, lived in households that received at least one entitlement benefit. Since under-reporting of government transfers is characteristic for survey respondents, and since administrative records suggest the Census Bureau’s own adjustments and corrections do not completely compensate for the under-reporting problem, this likely means that America has already passed the symbolic threshold where a majority of the population is asking for, and accepting, welfare-state transfers.

It’s sad when a passion dies. My desire to visit the “fatherland” has evaporated for the most part, and I note that any need to do so seems to have vanished along with it. We’re all cooked now; what’s the necessity involved in visiting from one bowl of soft slimy porridge to another?

What’s the solution? There has to be some sort of “uncooking.” That seems unrealistic when one views it through the porridge analogy. Which sadly fits, because once a flake of oats has been cooked and softened it’s no simple matter to get it firm and “flaky” again.

But, the problem was created by way of an errant, extremist, and therefore fragile, mindset. We imported from Europe the mindset that, when a woman at a dinner table shows pride that her son won first place in his school, she should be scorned. And, that a “fundamental transformation,” to coin a phrase, into a welfare state is some sort of laudable ambition for a head of state to hold. So topsy-turvy is this Weltanschauung that the quickest way to get to it is to see any object or concept connected with it as the opposite of what it truly is. Excellence is to be mocked, independence scorned the way we are supposed to scorn crime. Reliance on public assistance is somehow the dream of a lifetime…and on and on down the line. Such a way of looking at things cannot endure without a lot of support, from within and without. Reality will not offer a helping hand. The solution to the problem is in there. Somewhere.

The point to all this, in my mind, is: The “victimology” complex, the “I’m poor because I was born that way” thing, is an offshoot of aristocratic stratification that existed over there, but never over here. We don’t have any business clinging to it, because we don’t have the historical underpinnings to support it. It’s been said that in America, you can be anything you want to be. That isn’t just a bumper-sticker slogan. It will never be reduced to an empty rhetorical nugget, some sort of laughable nullity, unless we allow it to be.

“If There Was Ever a Picture That Says ‘Our Policy is Leading From Behind,’ That’s It”

Monday, January 12th, 2015

Hot Air:

Organizers in France estimated that upwards of 3.7 million people attended a Sunday unity rally in Paris in response to the attack on the satirical weekly Charlie Hebdo. Among those marchers were a variety of world leaders, but President Barack Obama was not among them. Nor did the president send any ranking administration officials to represent the United States. Not even Attorney General Eric Holder, who was in Paris at the request of French authorities, attended the march. The United States was utterly absent from this global event.

Obama did not even bother to attend a solidarity march for Paris that was held in Washington D.C. yesterday despite the participation of American officials like the State Department’s Victoria Nuland. “Obama wasn’t far from the march in D.C. on Sunday that wended silently along six blocks from the Newseum to the National Law Enforcement Officers Memorial,” Politico reported. “Instead, he spent the chilly afternoon a few blocks away at the White House, with no public schedule, no outings.”
:
“I say this as an American — not as a journalist, not as a representative of CNN — but as an American: I was ashamed,” CNN anchor Jake Tapper wrote. He noted that it was an oversight of the first order that no prominent 2016 prospect, Republican or Democratic, chose to join the Parisian marchers either.

“You let the world down,” read the front page of the New York Daily News on Monday.

Even the often staid and demure participants on CNN’s politics panel on the morning show New Day were animated over the absence of America from this pivotal symbolic event.

“If there was ever a picture that says ‘Our policy is leading from behind,’ that’s it,” said CNN anchor John King as he showed video of a variety of world leaders locking arms while marching in solidarity with average French citizens.

From the comments:

Reminder…Obama sent three, THREE, representatives to Michael Brown’s funeral.

Our Cold Civil War

Sunday, January 11th, 2015

My stock phrase I have been molding & shaping over the last couple years or so, although there has been some truth to it since around the early nineties, give-or-take…

Our “civilization” at the moment…is embroiled in a cold civil war…between people who refuse to define things, and people who MUST see to it that things are strongly defined before they can do what they do.

That’s a pretty simplimified summary, but lately I’m thinking even that tiny, heavy nugget may be overly complexificated.

The cold civil war is between people who must resist strong definitions before they can do the things they do, and the people who rely on the strong definitions to do what they do.

The technology that enables us to communicate with large numbers of our fellows rapidly has introduced, as it always has and always shall, new dynamics into this conflict. To do the things you cannot do unless things are left undefined, you have to get them sold, and that means we have ideas being exchanged that are not salable unless things are left undefined. The same goes for things that rely on the stronger definitions: they rely on the sale of ideas that in turn rely on these strong definitions being suggested, accepted, and then enforced.

As a general rule, you need to keep things undefined in order to do things that only look like they help people but don’t actually help anybody — except parasites. To do things that really do help people, before the mission is accomplished you’ll need to define something. “Is it a good idea to drive a three-ton truck across a bridge that supports five thousand pounds?” is not a question you’d want to answer without a good understanding of how many pounds are in a ton.
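The bridge question reduces to a single unit conversion, and a quick sketch shows why the definition has to be pinned down before the answer means anything. (This assumes “ton” means the US short ton of 2,000 pounds; a metric tonne or British long ton would give different numbers, which is exactly the point.)

```python
# Assumed definition: one US short ton = 2,000 pounds.
# A metric tonne (~2,205 lb) or long ton (2,240 lb) would change the result,
# which is why the question can't be answered until the term is defined.
LBS_PER_SHORT_TON = 2000

truck_weight_lbs = 3 * LBS_PER_SHORT_TON  # the three-ton truck
bridge_limit_lbs = 5000                   # the bridge's rated capacity

print(truck_weight_lbs)                      # 6000
print(truck_weight_lbs <= bridge_limit_lbs)  # False: do not drive across
```

With the definition fixed, the answer falls out immediately; without it, the same question is unanswerable.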

Definition, by its very nature, relies on shared and verifiable understanding.

Who Says Michelle Jenneke’s Not a Real Athlete?

Sunday, January 11th, 2015

Well, actually they did answer that question…it’s one guy.

I don’t know if it’s any more than that one dumb guy in Ms. Jenneke’s case, but the mindset is certainly out there that if a woman is highly accomplished in the looks department she can’t be highly accomplished anywhere else. I find this to be not only idiotic, but as “sexist” as anything can possibly be. Certainly more than anything else we like to go around labeling with that word these days. I mean, what could be worse? Oh, she’s good-looking…well then, keep her out of the Olympics, or any other athletic competition, or anything that isn’t strictly fixated on looks. She wants to be a model or a beauty contest champion, make sure she isn’t anything else.

Because you have to be frumpy and pear-shaped to be a winning hurdler? That is literally not going to fly. You have to be fit. And — I think this is where the problem really is — fit people are going to be better-looking.

There’s a tennis player named Anna Kournikova who once started going around making a higher profile for herself, wearing attention-getting clothes both on & off the courts, posing for photoshoots, and so on and so forth. Then I recall someone somewhere pointed out she got beat here, there, and a bunch of other places…this is a little outside my base of knowledge, but it seemed there was a good deal of evidence provided that she wasn’t actually all that good of a tennis player. Although I’m sure she’s better than I am. But, no one that I can recall ever questioned whether she was a “serious athlete.” Nor should they have, if she was still picking up sponsorships, still playing well enough to at least compete.

That’s Anna K. Now Michelle Jenneke, and my base of knowledge is limited here as well — she’s winning quite a lot of the time at what she does, isn’t she? The one critic doesn’t seem to be asserting otherwise.

What more do you need to do to be a serious athlete, other than be a serious athlete? I’m missing something here. Or, I hope I am anyway. You have to be a butterface or a frump-a-dump to be a serious athlete now?

Now if he wants to say the music in Michelle Jenneke’s video is not real music, well then…that has the potential to be a different conversation entirely.

Seventies Fashion

Sunday, January 11th, 2015

Every time I look at it I think the same thing…

…that against all odds, we survived this decade. And Jimmy Carter too, at the same time.

It fills me with hope. If we can survive that, we can survive anything.

Memo For File CXCII

Saturday, January 10th, 2015

Bill Maher’s use of the word “liberal” in this clip — which is an unorthodox use, although it should not be — further stimulated some thoughts I’ve been having about the proper role of government over the last few months. Although the original ignition point came when I saw the very first blows being traded between establishment Republicans and the Tea Party types, in what is surely a ramp-up to the 2016 elections.

Continuing this thinking a bit further, I took to the Hello Kitty of Blogging and pointed out:

“Get (and keep) religion out of government.” Sounds good, but I have a question: How come it continues to be necessary for someone to say so? What is this force that blends religion together with government?

My observation is that religion, practiced the way it should be practiced, really hasn’t got anything to do with government, practiced the way IT should be practiced. The two are literally about two different worlds. This imbroglio about same-sex marriage is a perfect example of what I’m noticing: What happened, that we now need to be concerned with how government defines marriage? Something. It didn’t start out that way. And the things that happened, were not good things.

And my conclusion is: Government, practiced the way it should NOT be practiced, is exactly the same as religion, practiced the way IT should not be practiced. Bad religion and bad government share the same goal, differing only in tactics, and this is what blends them together: To socially elevate a targeted class, clique or individual, above everyone else.

Is it fair to shoehorn these two visions into the terms “left wing” and “right wing,” at least in the United States? Having read all of the opinions available to me about it, including the Wikipedia entry, I’ve come to the conclusion: Yeah, sure, whatever. There is no contradictory definition that’s actually stuck.

The terms left-wing and right-wing are widely used in the United States but, as on the global level, there is no firm consensus about their meaning. The only aspect that is generally agreed upon is that they are the defining opposites of the United States political spectrum. Left and right in the U.S. are generally associated with liberal and conservative respectively, although the meanings of the two sets of terms do not entirely coincide. Depending on the political affiliation of the individual using them, these terms can be spoken with varying implications.
:
In general, the term left-wing is understood to imply a commitment to egalitarianism, support for social policies that favor the working class, and multiculturalism…
:
In general, right-wing implies a commitment to conservative Christian values, support for a free-market system, and traditional family values…

This is mostly consistent with the understanding that “left wing” involves an elevation of selected persons and classes, whereas “right wing” does not — if anything, it involves an elevation of certain actions and a derogation of certain other actions.

The hitch in the giddyup is this business about the left-wing and “a commitment to egalitarianism.” It isn’t hard to resolve this, though: It is a promise on which they haven’t been delivering. In fact, you’ll find when left-wingers accuse right-wingers of this non-egalitarian vision, the inequality, the discrimination, the “ism” — if you take the time to really look into it, you’ll find the right-winger is being accused of “discrimination” because he isn’t discriminating the way the left-winger wants him to discriminate.

So when all’s said & done, the distinction holds. Left-wing policies, in achievement as well as in intent, foment inequality, castes, and special privileges. To the extent “right wing” means anything at all in the US of A in Anno Domini Twenty Fifteen, it is a mild to severe reactionary refusal to recognize these castes. No thank you, I don’t think I want Kathleen Sebelius making my health care decisions for me. No, I don’t want to buy carbon offset vouchers and send my money into some black hole to be managed by perfect strangers when I light my house. No, actually, I have listened to the “experts” on “global warming” and I’ve concluded they’re full of crap; no, I think I’d prefer not to forget about all the failed predictions they’ve made.

There are two visions for government here and they’re both quite old. In fact, one of the things that has impressed me the most about the history of civilizations, especially recent, industrial-age history, is that governments tend to do this shift over time from the one to the other. Ours is no different. They start off providing the minimal essentials of civilization, the laws against murder, theft and harm, the redress of grievances, etc. Then they do this shift: There seems to be a lot of power lying around, unused; how can I/we use this to elevate my/our standing in the community?

Then there follows a lust for power. It starts off as a quest for greater influence; anybody participating in a decision making process within a group, particularly a group that involves multiple competing interests, is going to want greater influence. It’s only natural. But influence is not power. I might even go so far as to argue they have an oppositional relationship with each other: Influence has to do with the actions of people who still have choices they can make. Power has to do with the actions of people who have been “liberated” from their choices. If I have power over you, that must mean I can make you do things even if you don’t want to do them. If my power doesn’t extend into the realm of making you do things you otherwise wouldn’t do, “power” isn’t really the right word.

The point is, we have this tipping-point, within each participant, where they stop trying to acquire influence and start trying to acquire power. President Obama is past that, it seems to me; His influence is clearly on the wane, and it doesn’t seem to bother Him even a tiny bit, but He sure seems to like acting out this little routine He’s got going where He “decides” on this, that, or some other thing…and that’s it. Whatever anybody else has to say about it, is reduced to a nullity. It’s been an impressive experience watching Him go through this transformation, but it isn’t just Him doing it. And when enough of the influencers give up on acquiring influence, and shift to the acquisition of power, it has an effect on the government as a whole. This endeavor to acquire more power, for those in a position of acquiring it, becomes a newer, displacing purpose.

So when left-wingers explain they are for progress and going “forward,” and their opponents the right-wingers are about resisting this “progress,” interpreting it this way you can see they’re quite correct. These civilizations are rather like harvested fruit — starting out delicious and beautiful, ending up colorless, decayed, foul, unfit for anything but compost. It is a depressing thought to entertain that perhaps this transition is unavoidable, the only question outstanding being how soon. The left wants it to happen faster, the right wants it to happen more slowly, or not at all.

Now that we have these intra-factional shouting matches about Establishment vs. Tea Party, I’m seeing this proven again and again, with increasing frequency and intensity. The “right wing” within the Republican party is seeing the decay happening, looking for ways to forestall it, hopefully get back to the point where we could decide things for ourselves — and get back to operating our businesses, building our services and products, helping others, doing the things that made the country great in the first place. The “moderates,” on the other hand, are losing the characteristics that had distinguished them from the “left wing,” to the point it’s hard to tell those two apart. They’re both hung up on this idea that so-and-so is uniquely-qualified to rise above the rest of the country, and lead it into…well…there’s the part that furrows the brow with concern, and maybe a bit of distress. They won’t say what. They want to monologue away endlessly about the who, but not the what, the why, or how it’s all gonna work. Certainly, not about any benchmarks or milestones by which the Grand Master Plan can be subsequently evaluated, with its planners, architects and overseers held accountable for the results.

He’s the Liberal in This Debate

Friday, January 9th, 2015

Which of course raises the question: When did the definition change?

From The Daily Beast.

Thomas Sowell’s Intellectuals at Harvard

Thursday, January 8th, 2015

Prof. Sowell is asked to define what an intellectual is:

An intellectual is someone whose end product is ideas. Not everybody who produces an idea is an intellectual because there are many intellectually demanding ideas that end up as products or services such as brain surgery or computer operating systems, etc. But those kinds of things differ in the sense that there is an external test of the validity of the ideas, other than the approval of one’s peers. For deconstructionists, the only test is whether other deconstructionists like what he is saying. But for a financial wizard, he may be held in awe by his contemporaries and yet if he goes broke his ideas are regarded as failures. Consider that between the two World Wars, intellectuals promoted pacifism to the point that they impeded the build-up of any military deterrents against Hitler or Japan, and yet men paid with their lives in the beginning of the war especially because Britain and America had far inferior military equipment. Men died needlessly but no one ever held the intellectuals accountable for what they said.

Perhaps it is still unclear? If you need to see it in action, look no further than Harvard (via American Thinker):

For years, Harvard’s experts on health economics and policy have advised presidents and Congress on how to provide health benefits to the nation at a reasonable cost. But those remedies will now be applied to the Harvard faculty, and the professors are in an uproar.

Members of the Faculty of Arts and Sciences, the heart of the 378-year-old university, voted overwhelmingly in November to oppose changes that would require them and thousands of other Harvard employees to pay more for health care. The university says the increases are in part a result of the Obama administration’s Affordable Care Act, which many Harvard professors championed.

The faculty vote came too late to stop the cost increases from taking effect this month, and the anger on campus remains focused on questions that are agitating many workplaces: How should the burden of health costs be shared by employers and employees? If employees have to bear more of the cost, will they skimp on medically necessary care, curtail the use of less valuable services, or both?

“Harvard is a microcosm of what’s happening in health care in the country,” said David M. Cutler, a health economist at the university who was an adviser to President Obama’s 2008 campaign. But only up to a point: Professors at Harvard have until now generally avoided the higher expenses that other employers have been passing on to employees. That makes the outrage among the faculty remarkable, Mr. Cutler said, because “Harvard was and remains a very generous employer.”

In Harvard’s health care enrollment guide for 2015, the university said it “must respond to the national trend of rising health care costs, including some driven by health care reform,” in the form of the Affordable Care Act. The guide said that Harvard faced “added costs” because of provisions in the health care law that extend coverage for children up to age 26, offer free preventive services like mammograms and colonoscopies and, starting in 2018, add a tax on high-cost insurance, known as the Cadillac tax.

Richard F. Thomas, a Harvard professor of classics and one of the world’s leading authorities on Virgil, called the changes “deplorable, deeply regressive, a sign of the corporatization of the university.”

Mary D. Lewis, a professor who specializes in the history of modern France and has led opposition to the benefit changes, said they were tantamount to a pay cut. “Moreover,” she said, “this pay cut will be timed to come at precisely the moment when you are sick, stressed or facing the challenges of being a new parent.”
:
“It seems that Harvard is trying to save money by shifting costs to sick people,” said Mary C. Waters, a professor of sociology. “I don’t understand why a university with Harvard’s incredible resources would do this. What is the crisis?”

Gee, I dunno. What could it be?

The power of the horse blinders is just dazzling, mind-blowing. Especially with that professor-of-classics guy whining about “corporatization.” Corporations have been around for a long time. What does it take to live through the last six years and watch this “landmark reform” of the health care system, by government, with the government legislation and the government enforcement and the government scandals and government this and government that…and then, when a change you don’t like follows soon afterward, leap to the “corporatization” angle to explain it? Corporate action on this has been passive and reactive. That’s how it works with the law. Corporations sit around, and watch, see what they’re required to do & what they’re allowed to do, then they make a plan to navigate through it all. So if there’s a sudden change in ultimate effect, coming after a sudden change in the law, it’s probably not a coincidence and it’s probably not because a corporation rolled out of bed with a hot new idea.

There are those who may complain, with some legitimacy, that a bit too much energy is expended on noticing how self-deluded today’s Obama supporters and ObamaCare supporters and hippies and proggies and lefties and liberals really are. I can certainly see the concern. But it’s episodes like this that provide the rebuttal, that show the necessity of noticing. Just look at the simplicity of the ideas that aren’t taking hold, somehow: “If you pay for more stuff, or force someone to pay for more stuff, it’s going to cost more money.” Intellectual or not, you have to be standing in a very low position for an idea such as that to go sailing over your head. But you can tell from the quotes that these intellectuals managed to get ‘er done. Outrage in year N that such-and-such a thing is “not covered” and the suffering do not have “access” (kaching, kaching) to health care. Then government rolls out the plan, and we have outrage in year N+1 that costs are going up. Uh, yeah, costs are going up. Of course they are. They’re supposed to; the system is paying for more stuff. Duh.

The takeaway from this? Ideas shape the mind, just as the mind shapes the ideas. We all know this is true; we just don’t talk about it as often as we should. There is a certain discipline involved in brain surgery, computer operating systems, and other “intellectual” pursuits that aren’t purely intellectual in vocation, because they involve some external test validating the merit of the idea. There is an entirely different discipline involved in the formation of ideas that are never to be put into practice, never to leave the realm of ideas, in the manufacture of consensus that is “right” because, and only because, it is consensus. Where “the only test is whether other deconstructionists like what he is saying.”

It rots the brain.

Five Feminist Myths That Will Not Die

Tuesday, January 6th, 2015

TIME. Typical lefty balderdash, the kind of nonsense one should expect when appeal to emotion labors under the guise of appeal to reason:

“The figure was made up by someone working at the UN because it seemed to her to represent the scale of gender-based inequality at the time.”

It isn’t just feminists. You have to be very careful accepting “facts” from someone who perspires away under a bit too much of an afterglow that comes from winning arguments. Not too much terrain is traversed before winning-the-argument is all that matters anymore.

Feminists do love to win arguments, though. As the five chestnuts make abundantly clear, they are not to be trusted.

Don’t Punish Sex Offenders if They’re Black

Tuesday, January 6th, 2015

Judith Levine writes in Counterpunch:

If it’s true that all seven of the football players arrested for hazing in the Sayreville, New Jersey, War Memorial High School locker room are students of color, that is one more reason not to prosecute them as sexual felons.

I don’t mean not to prosecute them in adult court. I mean not to prosecute them at all.

If they’re guilty, they should be disciplined by the school, kicked off the Bombers team, and held accountable to their victims by making amends in words and deeds.

But the punishment the state will mete out far outweighs the transgression. For kids who are 15 to 17 years old, it will be life crushing.
:
Now we find that a disproportionate number of the people on the registries are also African-American. This is surprising only because the popular image of the sexual “predator” is a “pedophile,” and the pedophile is white.

In fact, whites represent two-thirds of registered offenders—the unique criminal category in which whites show up in proportion to their demographics in the general population. But on the public registries, “blacks appear to be over-represented,” according to an ongoing analysis by University of Washington criminologist Alissa Ackerman and colleagues of over 445,000 sex offenders on public registries in 2010.

Nationally, African-Americans comprised 22 percent of the Ackerman sample, compared with only 13 percent of the U.S. population. Among the states with the greatest mismatch was New Jersey.
:
The age of the greatest number of people involved in the criminal justice system for sex offenses is 14. Thank age-of-consent laws for that. Because the laws deem minors categorically incapable of consenting to sex, any sexual contact with a minor is considered an assault. Indeed, if the victim is a minor, sexual assault becomes “aggravated” sexual assault. Aggravated does not mean more sadistic or lengthy. It can just mean the “victim” of a touch or chat room conversation was 13.

Fourteen is also the age at which the federal government requires committers of certain sex crimes to be listed on the Internet registries.

And in a nation already overflowing with prisoners both juvenile and adult, the vast majority of them black and brown, do we need to lock up more black and brown kids?

So weird how proggies do this. “Such-and-such a law is unfair, so do not enforce it — in this one particular case, in which I hope to whip up emotional agitation with my prose.” If a law is unjust, what hopeful process of reform begins with class-based selective exemption from it? That’s the one question they’re never able to answer.

Blue State, Red State: Which is Richer?

Tuesday, January 6th, 2015

Tim Worstall, writing in Forbes, critiques a New York Times piece:

Blue states, like California, New York and Illinois, whose economies turn on finance, trade and knowledge, are generally richer than red states. But red states, like Texas, Georgia and Utah, have done a better job over all of offering a higher standard of living relative to housing costs. That basic economic fact not only helps explain why the nation’s electoral map got so much redder in the November midterm elections, but also why America’s prosperity is in jeopardy.

Red state economies based on energy extraction, agriculture and suburban sprawl may have lower wages, higher poverty rates and lower levels of education on average than those of blue states — but their residents also benefit from much lower costs of living. For a middle-class person, the American dream of a big house with a backyard and a couple of cars is much more achievable in low-tax Arizona than in deep-blue Massachusetts. As Jed Kolko, chief economist of Trulia, recently noted, housing costs almost twice as much in deep-blue markets ($227 per square foot) than in red markets ($119).

Worstall points out the obvious:

Yes, sure, income inequality might be important in a way, wealth inequality should have a place in our thoughts. But what really matters to people about how life is lived is consumption. Levels of consumption and also consumption inequality. That last is important in a political sense currently because consumption inequality just hasn’t widened out as much as income and wealth inequality have. And levels of consumption: well, that’s really what income or wealth is, the ability to purchase consumption. And if you’re in a place where prices are lower, leading to greater consumption (whether of food, or square feet of housing, or leisure, or whatever), well, then you’re richer, aren’t you?

And thus is our conundrum solved. The red states aren’t in fact poorer than the blue states. They’re richer: that’s why they vote more conservative and more right wing.

This is something that often gets ignored in the comparison between red-state and blue-state economies: $119 is just over half of $227; that’s a pretty big spread. What good does it do you to make $150k a year as opposed to $60k, if you can’t buy as much with it? And don’t these “poor” people who need the help from these blue-stater policies have to live somewhere?

The blue-state/red-state split, these days, ultimately comes down to a conflict between immediate gratification and delayed gratification. We therefore should not show any surprise on learning of higher salaries in the part of the country that lusts after immediate gratification. It goes with the territory. Just as we shouldn’t be surprised to find a more self-sustainable economic system, in which consumers are empowered to do more consuming, in the part of the country where delayed gratification is more highly valued. That, also, goes with the territory.

Seven Tips From Cosmo That Will Put Him in the Hospital

Monday, January 5th, 2015

Five years of dust on this one, but it’s a fun headline.

And, it offers an excuse to embed an image of Olivia Munn’s “Atari” photo shoot, and who can ever complain about that.

One out of Every Five Will Get Raped

Monday, January 5th, 2015

Pretty vicious. But when devastation is heaped upon an argument merely by taking its points seriously, the damage is deserved.

Remember, if it’s a statistic, and democrats repeat it over and over again, it’s probably bullshit. It’s a simplistic formula, but it works.

Reynolds University. The university where nobody gets raped.

Men Who Don’t Work

Monday, January 5th, 2015

From about a month ago, the New York Times has an animated graphic that will shock the shit out of you.

In the late 1960s, almost all men between the ages of 25 and 54 went to work. Only about 5 out of every 100 did not have a job in any given week. By 2000, this figure had more than doubled, to 11 out of every 100 men. This year, it’s 16. (People in the military, prison and institutions are excluded from these figures.)

Of course, the economy was stronger in 2000 than it is today, with a lower official unemployment rate — the share of people not working and actively looking for work — than today. But for prime-age men, the rise in official unemployment explains only about one-third of the increase in not working.

The remaining two-thirds is made up of those who are not working and not looking for work. [bold emphasis mine]

Now, why is that? Before working too long & hard looking for the answer, we would be well served to first acknowledge: There are a lot of loud, noisy people who like it that way. Or think they like it that way. Men working: Bad. Men not working: Good.

I dissent. Item #3 of my 42 definitions of a strong society:

3. Men do things. Able-bodied men, of all ages, are knights. They defend women, children, old and handicapped people, from trifling inconvenience as well as danger and bodily harm. They never, ever remain sitting when a lady approaches.

And…they work.

“Why Are Fascists Portrayed as Conservatives?”

Monday, January 5th, 2015

Bill Flax, writing at Forbes, linked by Trevor Loudon:

In Argentina, everyone acknowledges that fascism, state capitalism, corporatism – whatever – reflects very leftwing ideology. Eva Peron remains a liberal icon. President Obama’s Fabian policies promise similar ends. His proposed infrastructure bank is just the latest gyration of corporatism. Why then are fascists consistently portrayed as conservatives?

Well obviously, the answer is: Because it is friendly to the liberal agenda to portray fascists as conservatives, and what liberals want, they get. But that raises the question: How come it is that liberals keep getting what they want, when it comes to writing down history?

The short answer is, because they do most of the talking.

Long answer is, we have reached an era in which talking has become more-or-less mutually exclusive from doing. If you can manage to do enough talking, and recruiting others to do your talking for you, within a bureaucratic mess that will tolerate no dissent, you’re probably not a producer of consumable goods. If you’re a producer of consumable goods, you are probably too busy to do much talking. Which is a shame, because as a producer of consumable goods, you have to make sure your shit works or else you aren’t going to have a paycheck tomorrow. Which suggests that when you do get around to talking, your talking might be on the boring side, but there’s merit in it and it’s a worthy decision others make to go ahead & listen to it.

When you talk for a living, on the other hand, your paycheck comes in when you…talk. And you know what you’re saying must be right because you…we-ell…say so. Your friends all say so. And they know they’re right because they agree with you. They know you’re right because you agree with them.

It even works with economics, in which we can see with our own eyes what works and what doesn’t. Nevertheless we still have “economists” who say Obama’s policies are good, and would be even better if only He could get more power. In other words, they say the precise opposite of what the evidence says, and in this so-called “science” they get away with it. Well, on the ladder of testability, it turns out history is on the next rung down. We “know” whatever someone took the time to write down. As far as verifying it? All we can do is recall, footnote where we can, and guess. There’s a lot of uncertainty involved; some people choose not to acknowledge it just because they can’t comprehend uncertainty, which ends up being a very silly way to go about studying history.

But we have a wedge driven between academe and reality now. And that’s how it’s done now, through the magic of manufacturing consensus by booting out dissent, and then calling it science. That is why fascists are portrayed as right wing, even though the evidence clearly shows they are left wing.

Trauma

Sunday, January 4th, 2015

Chris Hernandez on the changing definition of the word in the title, and how it’s affected by all this noise about so-called “microaggressions” and “trigger warnings.” He starts off with several paragraphs of anecdotal example to define what his own understanding has been. Since this includes experience as a cop and a U.S. Marine, these are not stories for timid readers. Teen suicide, people burning to death in helicopter crashes, toddlers getting decapitated in car accidents, et al.

Then he begins to inspect what has been changing lately (H/T: Instapundit).

I suppose I’ve always defined “trauma” the traditional way: a terrible experience, usually involving significant loss or mortal danger, which left a lasting scar. However, I’ve recently discovered my definition of trauma is wrong. Trauma now seems to be pretty much anything that bothers anyone, in any way, ever. And the worst “trauma” seems to come not from horrible brushes with death like I described above; instead, they’re the result of racism and discrimination.

Over the last year I’ve heard references to “Microagressions” and “Trigger Warnings”. Trigger Warnings tell trauma victims that certain material may “contain disturbing themes that may trigger traumatic memories for sufferers”; it’s a way for them to continue avoiding what bothers them, rather than facing it (and the memories that get triggered often seem to be about discrimination, rather than mortal danger). Microaggressions are minor, seemingly innocuous statements that are actually stereotype-reinforcing trauma, even if the person making the statement meant nothing negative.

Finally, he goes in for the kill:

I’ve reviewed these reports of “trauma”, and have reached a conclusion about them. I’m going to make a brief statement summarizing my conclusion. While I mean this in the nicest way possible, I don’t want victims of Microaggressions or supporters of Trigger Warnings to doubt my sincerity.

Fuck your trauma.

Yes, fuck your trauma. My sympathy for your suffering, whether that suffering was real or imaginary, ended when you demanded I change my life to avoid bringing up your bad memories. You don’t seem to have figured this out, but there is no “I must never be reminded of a negative experience” expectation in any culture anywhere on earth.
:
If your psyche is so fragile you fall apart when someone inadvertently reminds you of “trauma”, especially if that trauma consisted of you overreacting to a self-interpreted racial slur, you need therapy. You belong on a psychiatrist’s couch, not in college dictating what the rest of society can’t do, say or think. Get your own head right before you start trying to run other people’s lives. If you expect everyone around you to cater to your neurosis, forever, you’re what I’d call a “failure at life”. And you’re doomed to perpetual disappointment.
:
If your past bothers you that much, get help. I honestly hope you come to terms with it. I hope you manage to move forward. I won’t say anything meant to dredge up bad memories, and don’t think anyone should intentionally try to harm your feelings.

But nobody, nobody, should censor themselves to protect you from your pathological, and pathologically stupid, sensitivities.

People have been getting traumatized, according to both definitions, for thousands of years now. How come the definition is morphing lately? It’s only obvious that the liberals are bringing it about, partly because we’ve been watching them do it if we’ve been paying attention; if we haven’t been paying attention, we can simply notice that all of the ingredients are there. A cultural change that brings with it a lot more grievance-mongering and complaining, and very little else. The melding of the well-intentioned, who make poor decisions, with those who seek to destroy society as it currently exists and are capable of hardening and executing brilliantly-conceived strategy. Useful idiots sending their own usefulness into an arc of decline. “Education,” formal as it may be, leaving those who are “educated” with less capability to get anything done in life, rather than more.

You watch it awhile and you begin to see where Sen. Joseph McCarthy got off with his famous observation: “If he were merely stupid, the laws of probability dictate that part of his decisions would serve this country’s interest.” You ask yourself: If I wanted to ensure the next generation did absolutely nothing productive, what is the difference between how I would seek to affect them, and what I’m seeing happening?

And then you talk to some of those in favor of the transformation, and you realize these aren’t people who want to destroy anything at all. They truly do care about the feelings of kids who are being (modern-version) “traumatized.” They just don’t seem to understand how people find maturity before doing productive things in adulthood, and because of this lack of understanding they make awful, terrible decisions. Then, you realize you’re watching the ultimate nightmare juxtaposition: The poorly-intentioned leading the well-intentioned, but poor decision makers, around by their dumb noses. And, you realize our society is attacked from within on yet one more front. You realize what you’re seeing is liberalism, which destroys everything it touches. Everything.

“I Would Like to Elect in 2016 a President Who Loves America With All His Heart”

Saturday, January 3rd, 2015

Christopher Chantrill writes in American Thinker:

I don’t mean that the president should be a Pollyanna and pretend that everything is hunky-dory in America. I just want a president that wants to fix America because he loves America.

Unlike our liberal friends and their poster boy, President Barack Obama.

The problem with our liberal friends is that they think that they are too evolved to descend to the celebration of a nation-state and its flummeries of patriotism and flags and Pledges of Allegiance. And so thinks President Obama. Liberals are globalists; they are cosmopolitans. And so is President Obama. They believe in supranational governance with the EU and the UN. And so does President Obama.

They do not love America; they sneer at America.

An idea whose time has come?

Memo For File CXCI

Wednesday, December 31st, 2014

So, this happened…

Number Eleven, yay! Our proggie friends will be quick to remind you, though…along with our hippie friends, and who can blame them…there’s a dark side to this. “Don’t encourage him, nothing good can come from it!” Heh. They’re right. Or, mostly right anyway.

On the other hand, there is one good thing about it. If you follow the link, and you have a Facebook account that allows you to read, you’ll notice there is a lively discussion ensuing under the statement that won this coveted honor. Said lively discussion is drawing topic drift the way a new wool sweater attracts cockleburs and cat hair. So the award offers the opportunity to pick up where the earlier observation gave way to all these charged agendas…like, thou shalt not speak ill of ANY immigration, legal or otherwise…makes us look like racists…what broke Detroit…

Here’s the full statement, in context:

We have things that are supposed to be illegal that “really” are not, like sneaking across our national border, and we have things that are supposed to be allowed but are “really” against the “law,” like for example forming a Young Conservatives group on a college campus, smoking cigarettes, or wearing a “Proud to be American” tee shirt to a high school on Cinco de Mayo.

It is dangerous, living under two sets of laws like this. It’s not an Obama problem, it’s a baby-boomer problem. The hippies have reached the age where they’re expected to be in charge of things, but they still want to rebel against authority when they are the authority. Their generation has manufactured a contradiction which, I’m afraid, is not finished with doing all its damage yet.

Elsewhere, I waxed lyrical about the great schism that is taking place: People are arguing about definitions. I summarized it more elegantly off-line, in an e-mail:

Our “civilization” at the moment…is embroiled in a cold civil war, in part because it has grown quite the appetite for young people who are not curious, youth who have little or no use for definitions, who can easily be told what to think. This cold civil war is between people who refuse to define things, and people who MUST see to it that things are strongly defined before they can do what they do.

Architects and Medicators. People who solve the problems they encounter by way of thought, versus people who address every thought-challenge by way of feeling, often losing sight of the difference between feeling & thought. The “cold civil war” is still cold, but it’s been heating up for awhile, and is approaching an ignition point as we close out Anno Domini Twenty Fourteen.

I continued this observation in Thing I Know #435:

I notice there is an ability some people have and some people do not have. We might think of it as the ability to comprehend definitions that provide no objectively discernible meaning, applying interpretations that require the human element. Is this room tastefully decorated, is that joke funny, is it fun to watch that person give a speech. In our time, this ability is generally mutually exclusive from the ability to perceive truth. It isn’t hard to demonstrate: Was so-and-so only kidding when he said such-and-such? We see people heckled, ridiculed, scolded, for failing to “get the irony” or for having taken something too literally. The danger involved in diagnosing learning disabilities in, and prescribing medication for, these people is that it sidelines most of the people who might have the ability to get something useful built. An irony-genius, or denizen of a relative-reality universe, isn’t in a good position to build anything involving any level of complexity, because you have to perceive hard, concrete, cause-and-effect relationships to do things like that.

People like me who entirely lack that other ability, that “comprehend fuzzy definitions” ability, actually can get irony pretty well. Matter of fact, we can see irony better than those who accuse us of not being able to get irony.

For example: The irony of this cold civil war, in which those who seek to avoid definitions, find they must labor toward entirely defrocking the other side of any status or influence whatsoever, so that nobody of note or significance is taking the time or trouble to define anything — is this. Should they win this cold civil war, they will lose everything. I mean everything. The things they want, the things they need, all these things rely on something being properly and meaningfully…defined.

But the cold civil war is approaching some sort of flash-point. Or anyway, it’s in some state of ascension, unprecedented ascension, about to get as bad as it can get. Because those who have worked their entire lives to rebel against authority, now find themselves in the position of being the authority. It’s on them to find some way to reconcile this. It’s a job that can only be done poorly, or not at all.

And so, as I said, we have two sets of laws. We have things that are illegal but “really,” wink-wink nod-nod, aren’t. And other things should be legal — in fact guaranteed rights — but actually are, wink-wink-nod-nod, Verboten.

Right or wrong, good or bad, that’s where twenty-fifteen finds us. Can’t wait to see what happens next. Happy New Year!

Can’t Find Enough (Excellent) Programmers Here

Tuesday, December 30th, 2014

Paul Graham says:

American technology companies want the government to make immigration easier because they say they can’t find enough programmers in the US. Anti-immigration people say that instead of letting foreigners take these jobs, we should train more Americans to be programmers. Who’s right?

The technology companies are right. What the anti-immigration people don’t understand is that there is a huge variation in ability between competent programmers and exceptional ones, and while you can train people to be competent, you can’t train them to be exceptional. Exceptional programmers have an aptitude for and interest in programming that is not merely the product of training. [1]

There’s a footnote there. What’s the footnote?

[1] How much better is a great programmer than an ordinary one? So much better that you can’t even measure the difference directly. A great programmer doesn’t merely do the same work faster. A great programmer will invent things an ordinary programmer would never even think of. This doesn’t mean a great programmer is infinitely more valuable, because any invention has a finite market value. But it’s easy to imagine cases where a great programmer might invent things worth 100x or even 1000x an average programmer’s salary.

That’s all very true. And yet, there is something about this that doesn’t quite fit. There is a shortage of programmers in the United States who “have an aptitude for and an interest in programming” and it has to be filled by way of immigration? So, the home-grown programmers are programming, but ordinarily and not exceptionally. They’re not thinking outside the box. We need to import some talent to think of these new ideas.

If this really is true and it is causing such a grave crisis — and, in my experience, I’ve not seen much support for this pattern, but that’s anecdotal so let’s let it go for now — the thing for us to immediately ponder is not how we can tinker with our immigration quotas, but what might have led us culturally to this sad state of affairs. What is the experience of a home-grown programmer with “an aptitude for and interest in programming that is not merely the product of training”? What becomes of his ideas? How welcome are they? How much resistance does he encounter when he comes up with them? Or she. What are the consequences, stateside, for thinking outside of the box?

I’ve heard others gripe about the “Not Invented Here” syndrome. I’ve had my taste of it. Hearing about it from others preceded my own experience with it, so I know it isn’t just me. Great programming implies great engineering, and if we’re going to value great engineering, we’re going to be attacking problems at their roots, as in root causes. Not hacking away at the leafy part. This is America. Whether or not we deserve our reputation for creativity and innovation, we do have it, and we had to have gotten hold of it somehow.

If American schoolkids show a little bit too much creativity where they’re not supposed to, they get medicated until they stop showing it. Are these technology firms, so desperate to get hold of this exceptional programming talent, but failing at it and being forced to ship the talent in from overseas, weighing in on this? Before you accuse me of topic drift, keep in mind I’m merely taking Graham’s argument seriously and this question just arises naturally out of that. It would be dumb of them not to do something to exert influence here, if the crisis is so acute.

And how acute is it?

The anti-immigration people have to invent some explanation to account for all the effort technology companies have expended trying to make immigration easier. So they claim it’s because they want to drive down salaries. But if you talk to startups, you find practically every one over a certain size has gone through legal contortions to get programmers into the US, where they then paid them the same as they’d have paid an American. Why would they go to extra trouble to get programmers for the same price? The only explanation is that they’re telling the truth: there are just not enough great programmers to go around. [2]

I asked the CEO of a startup with about 70 programmers how many more he’d hire if he could get all the great programmers he wanted. He said “We’d hire 30 tomorrow morning.” And this is one of the hot startups that always win recruiting battles. It’s the same all over Silicon Valley. Startups are that constrained for talent.

Another footnote. What’s this one say?

[2] There are a handful of consulting firms that rent out big pools of foreign programmers they bring in on H1-B visas. By all means crack down on these. It should be easy to write legislation that distinguishes them, because they are so different from technology companies. But it is dishonest of the anti-immigration people to claim that companies like Google and Facebook are driven by the same motives. An influx of inexpensive but mediocre programmers is the last thing they’d want; it would destroy them.

Well, that last part is simply not true. Microsoft got fed up awhile back with running into expensive reversals due to the human factor, even in situations in which everybody had done their jobs competently, even excellently. And so they did something Graham hasn’t done here. They defined something (via Coding Horror):

Just last week we were having a meeting where the subject of personas came up. This may have been blogged about in the past… but… we have three primary personas across the developer division: Mort, Elvis and Einstein.

Mort, the opportunistic developer, likes to create quick-working solutions for immediate problems and focuses on productivity and learn as needed. Elvis, the pragmatic programmer, likes to create long-lasting solutions addressing the problem domain, and learn while working on the solution. Einstein, the paranoid programmer, likes to create the most efficient solution to a given problem, and typically learn in advance before working on the solution.

So, the CEO of the startup would hire 30 Einsteins tomorrow morning, is that what we are to infer from this? Graham doesn’t say because he doesn’t make the distinction. From my own experiences, I would have to doubt this very much. You’re apt to be just as frustrated trying to get an Einstein or an Elvis to do Mort work, as the other way around. The “Mort,” when all’s said and done, tends to be the most precious. At least, in the sense being discussed here, in supply versus demand. You always need more Morts.

Think of the conceptual knowledge as a large cake, not only gargantuan in size, but expanding continuously. I myself have sometimes compared programming personas to spatulas and icepicks. Some programmers move laterally, as if spreading frosting, not penetrating much. Some stab through and drive downward all the way to the pan. The problem with these icepick people is they don’t generate this horizontal movement too quickly. And this creates a supply-and-demand issue not too friendly to them, because when you’re talking about staffing a campus full of buildings with tens of thousands of “programmers,” the horizontal frosting-spreading motion is what you really want. Yes you need a few of the vertical-stabber-learners who can isolate just a few particularly arcane fields and then learn every little facet of those fields. But you won’t fill up buildings with hundreds or thousands of those. You’re going to fill them with Morts.

And, these tech firms are doing exactly that. Consider what would happen if this were not the case. Think of the math. Two hundred people to a floor of a building, between two and four floors to a building, somewhere between five and fifty buildings to a campus. Well up into the thousands…and then you go to the next campus in the next city, and count it again. Then you go to the next company that has buildings on campuses, and count it again. This is in service of “a great programmer might invent things worth 100x or even 1000x an average programmer’s salary”? The math doesn’t work. These aren’t lottery tickets that you buy for a dollar and then toss aside when the game is over. These are smart, talented people, whose time is valuable. But, there is a misstatement being made here about what exactly it is that they do.
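The headcount arithmetic above can be sketched in a few lines. The figures here (200 people per floor, two to four floors per building, five to fifty buildings per campus) are the post's rough guesses, not real company data:

```python
# Back-of-the-envelope headcount per campus, using the paragraph's
# assumed ranges. These bounds are illustrative, not actual figures.

def campus_headcount(people_per_floor, floors, buildings):
    """Total seats on a campus: people/floor x floors/building x buildings."""
    return people_per_floor * floors * buildings

smallest = campus_headcount(200, 2, 5)    # small campus
largest = campus_headcount(200, 4, 50)    # large campus
print(smallest, largest)                  # 2000 40000
```

Even the smallest campus in this sketch holds thousands of programmers, which is the point: staffing at that scale cannot be done with rare "100x" hires alone.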

And then there’s this counterpoint. It’s completely devastating:

No, we need to leave all those people sitting at home and rewire all our immigration laws to ensure that some mythical ‘exceptional programmers’ can get here. We’re not told why that’s such an imperative, it’s just asserted. We’re not told why tech companies are supposed to be so super-picky (and why we’re supposed to indulge this super-pickiness in job-matching), it’s just taken for granted.

This isn’t a minor quibble. I mean by the same sloppy logic –

The US has less than 5% of the world’s population. Which means if the qualities that make someone a great programmer are evenly distributed, 95% of great programmers are born outside the US.

we might just as well assert that 95% of ‘great’, oh, let’s say, construction workers, or librarians, or dental techs, are born outside the US. But I mean, and I hate to be the one to ask this, but so what? Does this mean we need to displace all the domestic construction workers/librarians/dental techs so that we can suck up only the ‘great’/’exceptional’ ones from elsewhere? If so, why? That’s ridiculous. But if not, why not, and what’s so special about programming in particular that national policy should be bent around the one pole-star of making sure all companies are packed with supposedly ‘great programmers’, to the exclusion of all other considerations?

I have a “lawyer rule” about this: When someone wants to propose something be done to some particular industry, like should the “workers” be forced to unionize, and then surrender part of their paychecks in “dues” that go to supporting democrat politicians whether they want that to be done or not — ask yourself, could it, should it, would it ever work the same for lawyers? Is anyone going to insist that lawyers unionize, and pay dues to elect politicians they don’t like? That agenda item fails the test. And, so does this one. Lawyering, if you need it, is critically important. Nobody ever asks for an “adequate” lawyer, especially someone who’s in need of a divorce lawyer. Should we start meddling with the immigration quotas so we can import some really excellent divorce lawyers? After all, it stands to reason 95% of them weren’t born here!

So in conclusion, my inferences about this are,

1. Graham is right. Americans may have a commanding lead over the rest of the world in using this-or-that technological toy, but they don’t lead the rest of the world in understanding how the toys work, and that is bound to mean that the United States doesn’t possess any sort of monopoly on the world’s programming talent.

2. However, it is dishonest to suggest our domestic resources have been exhausted, or that these tech firms are struggling in futility to locate creativity and inventiveness, those things that have historically been part of the American landscape. If that were really what they were trying to find as they hire new bodies by the tens of thousands, and we knew they were starting here and then ending up begging for immigration quotas to be lifted, that ought to sound big, loud alarm bells for anyone. Heck, it should sound those alarm bells with the situation the way it is. But that’s another matter.

3. I’m sure there are programmers coming in on visas earning salaries comparable to the few programmers who were born here or live here, and manage to find work. But there are a lot of costs outside of salary. Before you hire the talent, you have to find the talent. How expensive is it to find the talent here in the United States? How sure is the process, and how much faith can you put into it?

4. What Congress is being asked to do, by these tech firms, is to assist them as they give up hope in the country. You can say the situation is not that simple, but you’d be wrong. And this is a significant point, since Congress is the country. This country has a legacy of inventing new things, thinking up new ideas, then making them happen. If that’s nothing more now than an echo in the ash bin of history and the snapshot of the present is some approximate reverse of that, the concerned patriot should be asking why. If his attention is not focused there, he forfeits any claim of concern about the country’s future.

5. We have advocates for more imported and excellent programmers, but not advocates for more imported and excellent divorce lawyers…or bankers or construction workers or dental techs. The issue is not “programming.” If I do yield to the temptation of drawing too much on my own experiences, the one thing about this industry that really sets it apart, and might make it a special target for this effort that ignores the other vocations, is what everyone seems to be overlooking: Independence. This makes sense, at least insofar as it’s something that deserves inspection, because independence always scares a lot of people, and it always scares them a lot more than they’re willing to admit. Their own, along with someone else’s. Me, I’m a high school grad. Without programming, I’m supposed to be…I dunno. Construction worker, librarian or dental tech? Guy who stacks the soup cans in a pyramid on aisle four? Something not as independent. Through my mixtures of success and failure, I’ve often had to bear that in mind, without this livelihood I’d be doing something not-as-independent. And, I’ve met my share of people who would like that, a whole lot. I continue to meet them, and be made aware of them. And that’s how I see this Paul Graham piece, to be honest about it. Wonder how he feels about high speed rail? It has not escaped my notice that these people laboring long and hard for more immigration in the tech fields, seem to be the same as the ones who love trains so much.

“Intelligent Design, Anyone?”

Sunday, December 28th, 2014

From behind the WSJ paywall, via Gwynnie at Maggie’s Farm.

In 1966 Time magazine ran a cover story asking: Is God Dead? Many have accepted the cultural narrative that he’s obsolete—that as science progresses, there is less need for a “God” to explain the universe. Yet it turns out that the rumors of God’s death were premature. More amazing is that the relatively recent case for his existence comes from a surprising place—science itself.

Respect the paywall…tease only…

Fred Hoyle, the astronomer who coined the term “big bang,” said that his atheism was “greatly shaken” at these developments. He later wrote that “a common-sense interpretation of the facts suggests that a super-intellect has monkeyed with the physics, as well as with chemistry and biology…The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond question.”

It offers food for thought, not only about the origins of the universe, but about the nature of atheism. How can “atheism,” which we have classically understood as a lack of belief rather than the presence of one, be “shaken”?

And then there is the other matter: What does it take to make it so?

Related: (update 12-29-14) How’s this for an attention-grabber? severian points out that true atheism is not actually even possible.

Not so sure I can go so far as to disbelieve the disbelievers, although there is a certain delicious irony in entertaining the idea. I have met those who’ve made up their minds there is no god, and insist on the ‘g’ being lowercase if anybody writes about this belief, as I just did. But by that time, it is a belief. We’re back to that troubling differentiation between the presence of a belief and the absence of one, and strident atheism most certainly is more presence than absence. It is a catechism of beliefs, sequences of events that provide alternative explanations, whether they can be supported by evidence or not. And how does the militant atheist respond to challenges against these alternative explanations? They simply blot them out, just like any militant religious-person.

By that time, the doubt has become a doctrine. It is a religion, just like any other, simply lacking the omnipotent, omniscient, omnipresent humanoid being. But, religious in all other aspects.

I Made a New Word LXXII

Sunday, December 28th, 2014

Wanna-conomy (n.)

This one is a bit complicated…

Years ago I learned, along with many other people interested in the subject I suspect, that the forces at work in an economy are supply and demand. My understanding was that they sit at opposite ends of a seesaw. If one is in ascension then the other will be in a state of descent; both of these have some effect on the price of a product or service, and eventually things will stabilize: supply, demand and price. If we’re looking at a commodity on an exchange, then this search for supply/demand/price stabilization will be renewed daily. Also, an abundance of the one will intensify the power exerted by the other: If many people are demanding a certain item, and there is a limited number of suppliers of it, then the price will go up. If there are many suppliers and only limited demand, conversely, the price will go down.

This results in signaling. An “economy,” when you study it awhile and think about it awhile, turns out to be nothing more than a network of those signals. Because of this signaling, the tendency is toward benefit for all, because the supply will respond to the signals; people will labor toward creating whatever it is that other people need. They’ll move their vocations around, away from the products and services declining in price due to this signaling, toward the products and services that are becoming more precious, so that abundances are relaxed and scarcities are cured.
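The seesaw can be sketched as a toy price-adjustment loop: price rises when demand exceeds supply and falls when supply exceeds demand, until the two signals balance. The linear curves and adjustment rate here are invented for illustration; no real market is this simple:

```python
# Toy model of the supply/demand seesaw. Demand falls as price rises,
# supply rises as price rises, and price moves toward the point where
# the two meet. All coefficients are made up for the sketch.

def clearing_price(steps=1000, rate=0.01):
    price = 1.0
    for _ in range(steps):
        demand = max(0.0, 100 - 10 * price)  # buyers thin out as price climbs
        supply = 5 * price                   # sellers pile in as price climbs
        price += rate * (demand - supply)    # excess demand pushes price up
    return price

print(round(clearing_price(), 2))  # settles near 100/15, about 6.67
```

The loop converges because each signal corrects the other: that feedback is the whole mechanism. The "wanna-conomy" described next is what you get when one end of this seesaw stops responding to the other.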

My new word describes, essentially, a newer economy in which this circuit is being shorted because the whims of the suppliers are unnaturally affecting the nature of demand. Retail consumer technology has devolved to become a good example of this. When a new phone comes out and people line up around the block to get hold of it, there is no tug-of-war between supply and demand because the supply is the demand. Apple made something; I want whatever it is. Have no clue what it does. I just want it.

It can work the other way too. Let’s say my lawn is too big for me and I need to have it mowed. This is one of those jobs Americans won’t do. In that classic economy loaded up silly with signals, there would be no such thing. We lawn-owners would have trouble getting someone to cut the grass for a little while, then we’d offer a bit more money, and more, until finally some intrepid hard-working kid says “Okay, I’m in.” But nowadays, they just don’t wanna. And they’re not the only problem in this, of course. So supply affects demand. Eventually, the homeowner hires an illegal alien to do it, or gives up on the whole cycle, plows up the grass and replaces it with bark.

The point is, when there is a connection between the two ends of the seesaw, it can no longer operate like a seesaw. Today, we still have signaling. But the signaling is, too much of the time, from the suppliers to those who demand, such that the demand ends up being nothing more than a reflection of whatever is in supply. The suppliers, in turn, then end up doing whatever they wanted to do. It’s then up to the consumer to find a way to make it fit.

The first casualty of this is the signaling. Without suppliers of valueless products being told to go get stuffed, there is no way to measure what does & does not have value to the consumer. And so, after the signaling, the next thing to go is the value. There may be a frenzy of economic activity going on, but without measured value there isn’t much value in the transactions at all, and it all becomes just a bunch of work. Value-less work. Like an electric fan someone forgot to unplug.

At the end of it, you just have a bunch of spoiled, first-world hipsters standing around, each one with a million dollars or more in his bank account, but capable of buying nothing with it, holding signs on the sidewalk that say “WILL SURF THE INTERNET FOR FUD.” That’s the ultimate consequence. I think we’re well on our way at the end of 2014:

Fiber optics. New technologies. Obsolescence. We’re dead alright. We’re just not broke. And you know the surest way to go broke? Keep getting an increasing share of a shrinking market. Down the tubes. Slow but sure.

You know, at one time there must’ve been dozens of companies makin’ buggy whips. And I’ll bet the last company around was the one that made the best goddamn buggy whip you ever saw. Now how would you have liked to have been a stockholder in that company?

In order to understand what is going on with the US of A at the end of 2014, you have to understand the changing global dynamic. It isn’t just new technology. It’s the difference in what happens to you, if you keep making the best goddamn buggy whip anyone ever saw.

The lunacy of the times in which we live is that the spirit of Danny DeVito’s speech about obsolescence has itself become obsolete. A company that makes buggy whips won’t be “dead, just not broke,” staggering on for a year or two before collapsing into debris. These days, suppliers decide demand. Apple has proven it over and over again. So with a little bit of advertising, in such a scenario there would emerge a crushing demand for buggy whips, to match the undemanded supply.

Sounds wonderful, doesn’t it? It isn’t. It’s terrible. It’s a disease. The market has lost its ability to send signals, which when you think about it, you come to understand that that’s all a market really is, a network of signals.

Our “market” produces an awful lot of stuff nobody ever needed. The people who produce these things are terribly busy meeting deadlines and they may have a great sense of purpose about them. But too many among them are missing actual customers. Or, have customers who are reluctant customers, customers regulated into being customers. Like “clients” of a collection agency. Or, customers buying the product without any need for it in mind, often buying the thing just to find out what it does.

REAL demand would say: We need more American kids learning about technology, starting with how many bits are in a byte, then working their way up to coding, software construction, then design, then project management. Then, studying the mistakes of previous technology pioneers who started companies that way, and then messed up & killed their own companies. In the meantime, if any houses or yards or orchards have to be purchased, do the yardwork on those. But demand is dictated by supply now. American kids don’t want to pick fruit, and they don’t want to pick bits out of bytes either. So we have illegal immigrants pick the fruit and cut the grass, and among our natives who actually want to do something with technology, it seems the thing to do has become to create more & more & more certification tests. Why bother with being the guy who builds stuff, when you can be the guy who dictates how other people build stuff? Just like, why be the guy who cuts grass, when you can be the guy who hires the illegal alien to cut the grass?

I’m convinced, at this point, if we were all a bit more motivated to ask on a daily basis “Waitaminnit, am I building a buggy whip?” — there’d be a lot more Americans cutting their own grass and picking their own fruit, nudging their own kids to do those things, dishing out the “You’re under my roof” speech my peers heard so much during our own childhoods. And things would be different in technology. Lots more new products. Not games. Not yet-more computer languages. Regulatory requirements, in Year N, would be a lot more similar to the ones in Year N+1, or Year N-1. Our innovation would not be in innovating new rules. It would be in building things other people can actually use. We seem to have gotten away from that a bit.

How did it happen? What it is, and has been, is an invasion. There is work that has something to do with supply-and-demand, and then there is work that doesn’t have to do with this. Non-producing work. Unproductive work. The electric fan someone forgot to unplug. Stuff you have to buy even though you’d rather not; regulated stuff. Supply that dictates the demand.

The unproductive work has been invading the productive work. And then, as invaders always do, it has started to tell the invaded what’s-what, and what-for. It began with the bureaucrats. Non-producers who want to dictate to producers how, when, and where they do their producing. The consumers are then left to consume whatever is produced — and, not to question it.

Dave Barry’s Year in Review 2014

Sunday, December 28th, 2014

Yeah, and I can see what he did here

April
:
On the domestic front, U.S. Secretary of Health and Human Services Kathleen Sebelius, who oversaw the rollout of Obamacare, resigns from the cabinet to take a position overseeing email storage for the Internal Revenue Service.
:
June
:
In Washington scandal news, the Internal Revenue Service, responding to a subpoena, tells congressional investigators that it cannot produce 28 months of Lois Lerner’s emails because the hard drive they were stored on failed, and the hard drive was thrown away, and the backup tapes were erased, and no printed copies were saved — contrary to the IRS’s own record-keeping policy, which was eaten by the IRS’s dog. “It was just one crazy thing after another,” states the IRS, “and it got us to thinking: All these years we’ve been subjecting taxpayers to everything short of rectal probes if they can’t produce EVERY SINGLE DOCUMENT WE WANT, and here we lose YEARS worth of official records! So from now on, if taxpayers tell us they lost something, or just plain forgot to make a tax payment, we’ll be like, ‘Hey, whatever! Stuff happens!’ Because who are we to judge?”

But all kidding aside, you can bet that before this thing is over there will be a strongly worded report.

These things always make me think something like “Oh yeah, I’d forgotten about that.” The Lois Lerner thing is seriously depressing; you have to discard so much common sense in order to take the dog-ate-the-homework excuse seriously that after a while it just gives you a migraine.

And then you think — wait, this is how the administration behaved before losing the last election to which they will ever be beholden. Let’s face it, good stuff happened in 2014, bad stuff happened in 2014, all in all it was a very wild, loud year, and most of us are hoping 2015 will be a bit quieter. In the category of elected & appointed officials acting like overlords who own us lock stock & barrel, and putting on these little shows to pretend they’re somehow accountable to us, hoping to fool only those who follow along most casually and think most slowly…there isn’t much reason to hope for a slowdown.

The IRS told Congress to go stuff it, in response to a subpoena no less, while the midterm elections loomed. Now those elections are over. What happens next?

American Achiever of 2014

Sunday, December 28th, 2014

Sarah Palin. Who else?

[Sarah] Palin achieved what such luminaries as President Obama did not: a place in the Smithsonian’s prestigious “Most Significant” list. After being written off by many in the media, and especially the left, as “irrelevant” and predicted by MSNBC’s Krystal Ball as “not going to have an effect on the [2014] midterms,” Palin’s record of success of her endorsed candidates was nothing short of phenomenal.

Governor Palin endorsed 22 candidates for various offices during the midterm finals, including senators, governors, lieutenant governors, congressmen, and attorneys general. Of those so endorsed, an incredible 20 were elected – contrasted with, for example, Hillary Clinton’s record of 8 wins out of 24 endorsed candidates.

Beyond the success of her endorsed candidates lies a much deeper reason for Palin being seen as “Achiever of the Year”: those Palin endorsed in their respective primaries who then went on to win the general election battles. As in the past with, among others, senators Ted Cruz, Kelly Ayotte, and Deb Fischer, and Governor Nikki Haley, who owe their elections in their primary campaigns to Palin’s endorsement at a critical juncture, so too could new senators Ben Sasse and Joni Ernst, and new Alaska governor Bill Walker (and, remarkably, his Democrat lieutenant governor Byron Mallott) be considered to owe all or a substantial part of their nominations to Palin’s endorsement.

The Party of Weakness

Sunday, December 28th, 2014

Mona Charen:

Democrats have done very well politically by convincing voters that they are, in a very broad sense, on the side of the little guy. “Republicans,” they say, “take care of the rich, but we Democrats are the party that brought you Social Security, Medicare, Head Start, the Civil Rights Act, free school lunches, Aid to Families with Dependent Children, and Obamacare. We are the party that will tax rich Republicans to fund government programs that help poor and middle-class Democrats.” Like mothers, Democrats are nurturing and supportive.

There are a few little problems with this narrative. Most of those programs, for good or ill, had bipartisan support. Further, Head Start has been a colossal failure; the school lunch program is overly broad and encourages waste; civil rights laws have been interpreted to permit quotas and “reverse discrimination”; AFDC wound up encouraging unwed childbearing and arguably contributing to poverty; and Obamacare is causing the middle class to pay higher premiums for more limited medical care while still covering only a fraction of the uninsured.

Still, the “mommy party” retains its “caring” image.

The Democrats have another reputation. They are perceived as the party of weakness — against both criminals at home and enemies abroad. Events of the past several weeks and months have underlined that second reputation in (pardon the expression) red ink.

The Democratic senators’ decision to release a report excoriating the CIA for “torture” after the 9/11 attacks was designed to impugn the Bush administration and the nation’s security agencies. In weak-minded fashion, Sen. Dianne Feinstein and her colleagues insisted that the harsh interrogation of terrorists was immoral and also completely ineffective.

There is nothing intellectually dishonest about rejecting torture or anything close to it. But when the senators insist that it didn’t work — despite the contrary assessment of five CIA directors of both parties — they betray a fundamental unseriousness. Of course it worked. That’s why it presents a moral dilemma. Otherwise, they’re asserting that the CIA is manned by sadists who did these things for kicks. Presumably, if that had been true, Eric Holder’s Justice Department, which conducted a lengthy investigation, would have brought charges. But it didn’t. Finally, the senators’ claims that they were kept in ignorance have been abundantly contradicted by the public record and by statements from the CIA officers who did the briefings.

The attempt to discredit the Bush administration and engage in moral preening failed. Polls showed that despite the Senate report, Americans support the limited use of harsh interrogation by a 2-1 margin.

I’m not in complete agreement with all this. The analogy of motherhood with the democrat party, to me, seems flimsy. When mothers are nurturing, it’s supposed to be out of a vision that the child should eventually become stronger. Admittedly, some small-em moms present some problems for this; some of them even become emotionally attached to their children’s lack of ability to provide for themselves. But they are the freaks, the exceptions that prove the rule.

The democrat party, on the other hand, is deeply invested in weakness. Of everybody. If we can think for ourselves and provide for ourselves, we’re less likely to vote democrat. They know it, we know it, they know we know, we know they know.

We only pretend otherwise out of a lazy, habitual form of misguided etiquette, just to prop up the empty “not a dime’s worth of difference between the two parties” narrative.

The “party of the little guy” thing doesn’t hold up either:

Democrats bagged the bulk of big dollar donations in the 2014 midterm elections according to an analysis by the Associated Press.

Out of the $128 million spent by the top 10 individual donors to outside groups, Democrats hauled in $91 million or 71% of donations.

“Among groups that funneled more than $100,000 to allies, the top of the list tilted overwhelmingly toward Democrats—a group favoring the GOP doesn’t appear on the list until No. 14,” reports the AP.

Democrats also enjoyed a 3-to-1 cash advantage when it came to the 183 groups stroking checks of $100,000 or more. The liberal National Education Association (NEA) topped the list of big money donors at $22 million. The top ten list contained zero Republican-leaning groups.

“They’re total hypocrites when it comes to this subject,” said Republican National Committee Chairman Reince Priebus. “They’ve made a living off campaign talking points when, in reality, they’ve been raking in more money from millionaire donors than Republicans for quite a while.”

You Can Thank the Supreme Court For Credential Inflation

Sunday, December 28th, 2014

Jesse Saffron, writing in National Review Online:

Before the early 1970s, many employers did not require that applicants have college degrees – even for well-paying jobs necessitating advanced skills and intelligence. A high school diploma and a passing score on an employee aptitude test were, in many instances, enough for a worker to advance in a rewarding and lucrative career. Unfortunately, as George Leef points out in today’s Pope Center feature, the Supreme Court’s decision in Griggs v. Duke Power (1971) effectively precluded employers from basing hiring decisions on aptitude test results. The reverberations of that decision are still being felt today.

In Griggs, the Court deferred to the Equal Employment Opportunity Commission’s (EEOC) interpretation of section 703(h) of the Civil Rights Act (CRA), which permitted employers to use a “professionally designed ability test” so long as the test was not “designed, intended or used to discriminate…” The EEOC, which enforced the CRA, had promulgated a broad interpretation of that provision, making it illegal for a test to have a “disparate impact” on minorities. For example, if an employee aptitude test disproportionately weeded out black applicants, it would be considered illegal.

As Leef makes clear, the end result of the Griggs decision was that employers became paranoid about using aptitude tests, for fear of potential litigation costs. Instead, they began to use the college diploma as the new employee screening device. “We probably have a college ‘bubble’ just from the effects of easy federal college aid and the push by politicians for educational attainment, but by making employee testing legally dangerous, the Griggs decision helped inflate it,” he writes.

From the Leef article that was mentioned & linked above,

The justices ignored the legislative history and gave deference to the federal agency charged with enforcing the law, the Equal Employment Opportunity Commission (EEOC).

The EEOC had promulgated guidelines on employment testing. Those guidelines advanced the idea that had been rejected in the debate over the Civil Rights Act, that tests would be illegal if they had a “disparate impact” on minority groups. Furthermore, the EEOC declared that if a test had a disparate impact (that is, minority workers were disproportionately affected), the employer would bear the burden of proving that it had a “business necessity” for using the test.

Chief Justice Burger’s opinion deferred to the EEOC’s reinterpretation of the law. Duke Power was in violation because its educational and testing requirements had a disparate impact on minority workers. The law, he wrote, required “the removal of artificial, arbitrary, and unnecessary barriers to employment where the barriers operate invidiously to discriminate on the basis of racial or other impermissible classification.”

Requiring either a high school diploma or ability to pass the two tests seemed to be artificial, arbitrary, and unnecessary, so out they went.

The full Griggs opinion is here.

The decision, with the ramifications described above, was a continuation of a regrettable trend that had become pronounced during the Earl Warren era of the Supreme Court: the burden of expectation shifting toward anticipating what will happen next when someone says “I’ll see you in court.” Progressives, and they are not alone in this, think that makes these great decisions. They make the Supreme Court more powerful, don’t they? So they’re “landmark” decisions. The trouble with that is, there was a reason people who made these on-the-job judgment calls were forced to reckon with what the Supreme Court would say, should litigation follow: there wasn’t any other way to predict what would happen. SCOTUS became more and more nonsensical, and so “common sense” became unequal to the task of prediction.

And so, “testing” became all about not-testing, just as enforcing the law became all about not enforcing the law. Might as well let the bad guy go, the courts are going to toss the case out anyhow.

An interesting, if unanticipated, side effect of this is that all these years later even a high school diploma isn’t good enough. Not even for picking up dog crap in a park or sweeping leaves off a sidewalk. The history shows that this isn’t due to high school graduates having done an inadequate job of sweeping sidewalks, but rather a power struggle between common sense and the power lust & overreach of federal agencies.

Global Warming’s Upside-Down Narrative

Saturday, December 27th, 2014

Bjørn Lomborg, by way of Kate at Small Dead Animals:

When politicians around the world tell the story of global warming, they cast it as humanity’s greatest challenge. But they also promise that it is a challenge that they can meet at low cost, while improving the world in countless other ways. We now know that is nonsense.

Political heavyweights from US Secretary of State John Kerry to UN Secretary General Ban Ki-moon call climate change “the greatest challenge of our generation.” If we fail to address it, Kerry says, the costs will be “catastrophic.” Indeed, this has been the standard assertion of politicians since the so-called Stern Review commissioned by the British government in 2006.

That report famously valued the damage caused by global warming at 5-20% of GDP — a major disruption “on a scale similar to those associated with the great wars and the economic depression of the first half of the twentieth century.”

Tackling climate change, we are told, would carry a much lower cost. The president of the European Commission promised that while the European Union’s climate policies are “not cost-free,” they would amount to just 0.5% of GDP. Indeed, politicians of all stripes have reiterated the Stern Review’s finding that global warming can be curtailed by policies costing just 1% of world GDP.

Climate policies, moreover, are said to help in many other ways. US President Barack Obama promised that policies to combat global warming would create five million new green jobs. The EU claimed that green energy would help “improve the EU’s security of energy supply.”

With the completion of the latest report by the United Nations Intergovernmental Panel on Climate Change (IPCC), we can now see that this narrative is mostly wrong. The first installment of the IPCC report showed that there is indeed a climate problem – emissions of greenhouse gases, especially CO₂, lead to higher temperatures, which will eventually become a net problem for the world. This result was highly publicized.

But the report also showed that global warming has dramatically slowed or entirely stopped in the last decade and a half. Almost all climate models are running far too hot, meaning that the real challenge of global warming has been exaggerated. Germany and other governments called for the reference to the slowdown to be deleted.

The second IPCC installment showed that the temperature rise that we are expected to see sometime around 2055-2080 will create a net cost of 0.2-2% of GDP – the equivalent of less than one year of recession. So, while the IPCC clearly establishes that global warming is a problem, the cost is obviously much less than that of the twentieth century’s two world wars and the Great Depression.

Again, not surprisingly, politicians tried to have this finding deleted.

Goddard has been noticing this as well: So fragile is the narrative, that you have to keep deleting inconvenient and inharmonious facts in order to keep it believable. And it seems this is getting harder and harder to do.

Obama the Uncollegial Consensus-Shredder

Saturday, December 27th, 2014

The Anchoress, Elizabeth Scalia, notices President Obama unilaterally altering our relations with Cuba, along with other things; she doesn’t offer a list of examples, but it wouldn’t be hard to compile one. His executive action on the illegal alien invasion and his reversal of position on gay marriage are further instances of what she describes: “…what he has always really wanted…is to be either a King or a Dictator: someone who decrees something and then briefly tells the world why it should be grateful, before heading off to the links.”

I’ve noticed this for a while. The process of Barack Obama going off to mull things over in His head, however that is done, seems to receive an awful lot of weight. Infinite weight, in fact, in this new Obama-era vision of how government should work, since nothing else matters. Who are you to say? Who is Congress to say? Heck, who is the Barack Obama of 2007 to say? King Barack went & mulled it over for a bit, why consider anything else?

What’s interesting to me about all of this is not really Obama. I identified his presidential-singularity a long time ago, so nothing he does surprises me.

No, what’s interesting to me are the people who are captivated and energized by his authoritarianism, and utterly silent on questions of constitutionality or collegialism.

They’re interesting because — by and large — the people who are cheering Obama’s moves to stop talking and simply push his wishes through, are the same people who gush over the collegiality that Pope Francis is bringing to the leadership of the church.

The pope is reaching out, drawing bishops into discussion; he is bringing them to advisement committees; respectfully hearing them out as first among equals — he’s doing all he can to eliminate the old perception that the papacy is a dictatorial, authoritative office — out of touch with either the leadership or the people he serves.

In general, people think this is a good thing, as do I.

Obama, on the other hand, will not reach out; he will not draw legislators into discussion or bring them in for advisement; he is not respectfully hearing anyone out as “first among equals.” Rather, he is doing all he can to redefine the presidency as a dictatorial and authoritative office, not only out of touch with both the leadership and the people he (ostensibly) “serves,” but prone toward telling them to eat their peas and take what’s good for them, unless they’re Goldman Sachs.

In general, most people think this is a bad thing. The president is supposed to lead, which means practicing the art of persuasion, of bringing people around; he is not supposed to simply rule.

There are some people out there who are strangely competent at holding two opposing thoughts in their heads: Pope Francis is a collegial consensus-builder, and that is an unqualified good. Obama is an uncollegial consensus-shredder, and that…is also, somehow, an unqualified good.

Something profoundly dishonest in that, don’t you think?

The checks and balances of America’s constitutional republic, and the questions-and-answers they inspire, apart from occasionally annoying America’s First Holy Emperor, apparently are a bit much for a few others to try to handle. They demand a bit of inspection and deep thought, in an era in which people can’t muster up much appetite for such things.

Your Status Updates Are Coming Across a Little Arrogant

Thursday, December 25th, 2014

Finally found this clip again, over here, where I compared the two characters to myself (left) and our good blogger friend in New Mexico (right), who I learned a few days ago shucked his mortal coil.

Seems he noticed the similarity too. Glad he saw the humor in it, and that at curtain call, he was surrounded by family. He was one of our first readers, and over the last decade there were signs the end might arrive while he was in solitude. I recall there was one occasion on which he worried that his time had come, with no one around. It was obviously not his preference.

He chastised me many times that I needed an editor. He was right about that; I never managed to put the budget together. But he was qualified to criticize, since he had a talent for making fascinating writing out of situations that others, myself included, would not be able to make interesting. And on top of that, he had class and wit. I will miss him. As you can see from the comments, it isn’t just me.

“Scrooge Was A Liberal, Studies Show”

Thursday, December 25th, 2014

John Merline, Investors Business Daily:

Just about every year at this time, “A Christmas Carol” shows up somewhere on TV, as do headlines about how one Republican or another is the modern equivalent of the tale’s greedy miser, Ebenezer Scrooge.

“The GOP’s sad Scrooge agenda.” “GOP Protecting Ebenezer Scrooge.” “Maher Likens Republicans to Ebenezer Scrooge.” “Republicans play the role of the stingy Scrooge.”

You have to wonder if these folks have actually read “A Christmas Carol” or spent any time pondering what Scrooge actually says and does. Because if you do, you come to realize that Scrooge more closely resembles a modern liberal than a conservative.

A major clue comes early in the story, when two men collecting for charity arrive at Scrooge’s office. After asking Scrooge for a donation to help the poor and needy, Scrooge responds: “Are there no prisons? And the Union workhouses? Are they still in operation? The Treadmill and the Poor Law are in full vigor?”

He goes on to say, “I help to support the establishments I have mentioned — they cost enough; and those who are badly off must go there.”

Modern translation: I pay taxes to support the welfare state, why should I give money to you?

Turns out, that’s a decidedly liberal viewpoint.

And there’s more. There, and here, and here and here. Blogger friend Phil presented the same argument a few years back. We’ve said so ourselves.

It stands to reason, really. Like Robert Mitchell said:

The real difference between conservatives and liberals, today:

Liberal: Someone should take care of this! Or, We need a program to take care of this!

Conservative: ++sigh++ It looks like it’s up to me to take care of this…

Prisons and workhouses, prisons and workhouses…programs. Versus, trot your wrinkled ass down to the butcher shop, and buy the Cratchit family a big turkey yourself.

Just went and watched it last night, the good version, to make sure it’s still good. It is. And it’s always rewarding to see a left-winger turn into a God-fearing, fix-it-yourself, rightward-leaning Tea Party guy…even if it is in Victorian England, and the conversion job requires an extra shove from the supernatural realm.

Of course, some people still don’t get it. There it is again: Someone else should do something, someone else should stop worrying about how expensive something is. A liberal is a fellow so nice, he’ll give you the shirt off someone else’s back. But whatever. Merry Christmas to all.

Rosie and Whoopi

Saturday, December 20th, 2014

Two rich media whore starlets yelling at each other about who makes a better victim.

This is what victimology — trying to create an identity for yourself based on your weaknesses instead of your strengths — does to you. It warps thinking. Even in those who never had a chance to think through anything rationally, it takes whatever lopsided, ramshackle, crooked-line thinking they can manage to bring, and warps it further.

It’s not just these two airheads, it isn’t even just rich media whore starlets. The root cause of the consternation here is the question of influence. There’s an unwritten and unspoken rule here, that there is a class to be defined and anyone outside of that class should have zero influence. Those within the class, of course, should have infinite influence. On that, the two rich media whore starlets agree. Their disagreement, clearly, is on where & how the periphery should be drawn. Who should be in, who should not be.

But the premise of the question, on which they agree, is flawed. It isn’t sustainable. It promises the rest of us nothing, no solutions to any problems, only more fighting. And maybe that’s the whole point.