Johann Bessler And His Perpetual Motion Machine

Johann Bessler was born in Zittau, Germany, in 1680. He died in 1745.

His claim to fame is that, in 1712, he built a remarkable machine — called the Bessler Wheel — which he said was a machine of self-perpetuating motion; by 1717, “he’d convinced thousands of people, from the ordinary to the most prominent, that he had indeed discovered the secret of a self-sustaining mechanism” (source).

The Bessler Wheel was tested repeatedly and rigorously, and it passed every test put to it.

For example:

It was made to do heavy work for long periods, and in an official test it ran continuously for 54 days. The internal design of the machine was always closely guarded by its inventor. Plagued by paranoia and a nasty temper, and with no patent laws to protect him, Bessler destroyed the machines in a fit of anger and took his secret to the grave. The true motive power behind Bessler’s demonstrations, and the energy source which moved the wheel’s internal weights, still remain unexplained. Obviously a machine like this violates the law of conservation of energy, which states that energy can never be created or destroyed; but it should then be asked: how did Bessler fool so many people for so many years? (Ibid)

In 1717, Bessler himself said the following:

The internal structure of the machine is of a nature according to the laws of mechanical perpetual motion, so arranged that certain disposed weights, once in rotation, gain force from their own swinging, and must continue this movement as long as their structure does not lose its position and arrangement.

And one final thing to consider about this subject:

If by “Perpetual Motion Machine” we mean a device that taps into a natural motion and does work indefinitely without human or animal assistance, the problem is not only solvable but has already been solved in a variety of ways:

* Cox Clock — a clock that runs forever on barometric pressure.

* Atmos Clock — a clock that runs forever on small changes in temperature.

* Tidal Power Generator — harnessing the innate gravitational power of celestial bodies.

* Geothermal Power Generator — tapping the energy released when gravity condenses matter.

* Nuclear Breeder Reactor — a machine that produces more fuel than it consumes.

Was Bessler’s machine, like these, somehow attached to the very wheelwork of nature? The mathematician Jean Bernoulli wrote:

“…any motion which exists in nature can be used to support a perpetual motion. In these instances such machines cannot be regarded as purely artificial perpetual motion, but rather as a combined perpetual motion because their motion is assisted by nature. I am convinced that Bessler’s Wheel is of this type.”


Legalizing Drugs

Everyone believes in freedom — until they find out what freedom actually means. Then almost no one believes in it.

Freedom means you are left alone; you are neither helped nor hindered. And that’s all it means.

Rightwing politicos and leftwing politicos don’t usually agree on specifics, but they do often agree on principle: namely, that government’s proper sphere of authority does extend beyond protection against the initiation of force.

Humans, say today’s politicians, both right and left, aren’t capable of flourishing without the aid of bureaucrats; so these bureaucrats must live our lives for us.

Nowhere is this (unquestioned) conviction made clearer than in the issue of drugs.

Drugs, like prostitution, provide us with a good example of how the right and the left are not fundamentally opposed but merely disagree on superficialities, insofar as both sides agree that not all drugs should be legal.

This notion has been so thoroughly inculcated into the minds of Americans that to question its legitimacy at all is considered lunatic-fringe thinking.

True, there are representatives on both sides of the political spectrum who support legalizing marijuana and perhaps a few other drugs. But start talking about legalizing all drugs on principle, or mention doing away with drinking-age laws on principle, and all liquor laws on principle, or speak of legalizing gambling and prostitution in all states and cities — and then you really begin to sort out the men from the boys.

That principle is this: it is not within the proper sphere of government to be involved in these aspects of human life.

If we each possess the right to our own life and only our own life — and we do — then using drugs is obviously the right of each individual. The fact that it has become unquestionable to the majority that we do not possess the right to use drugs as we choose is a sad testament to the power of custom.

It is a sad testament to how people get so used to thinking about something in one way that changing minds becomes absolutely out of the question.

Yet if you believe in freedom, you not only should but must believe in the legalization of all drugs. If you do not, then you do not believe in freedom, and you must choose: freedom or statism.

This point can be made on principle alone, and it is a foolproof argument, the first and strongest line of defense. But it will not satisfy those who believe the proper scope of government does extend into telling us how we may and may not live.

It is frequently argued, for example, by the religious contingent, that if you legalize drugs, the usage of drugs will increase.

“Common sense and common experience tell us this,” says lawyer and radio talk-show host Dan Caplis, incessantly.

Next, we’re offered as evidence the claim that the number of drinkers increased after prohibition — a statement which is, at best, misleading, and here’s why:

Prior to prohibition, when drinking was still legal, the number of drinkers in this country was on a significant downward trend. For a decade leading up to prohibition, fewer and fewer people were drinking.

This fact is clear and not in dispute. But when, in 1920, the moralizers and busybodies got their way and legislated that the rest of the country must live as they deemed appropriate, and prohibition was then made into law, drinking still continued its downward trend. This went on for about three years.

It is very important to reiterate that the downward trend in drinking began long before drinking had been made illegal.

In the middle of prohibition — when drinking was still illegal — the number of drinkers began gradually to rise.

It continued to do so throughout the rest of prohibition, so that when, in December of 1933, prohibition was finally repealed, that upward trend continued for about a decade. But it was only the continuation of a trend that had already begun while drinking was illegal. This is a critical fact, but one you’ll never hear mention of when you hear people talking about “the number of drinkers increasing after prohibition.”

The next time someone says that “repealing prohibition increased the number of drinkers in this country,” be clear what that means: it means the number of drinkers was already increasing throughout the latter two-thirds of prohibition, and that the upward trend plateaued and then declined a decade after drinking was legalized anew.

Ask yourself also these questions: if, as the religious propound, making substances illegal prevents their usage, how is it that the number of drinkers began rising when alcohol was still illegal?

How is it that in Holland, where many drugs are legal and even subsidized(!), usage has decreased?

What does this tell us about “common sense and common experience”?

How is it that in Switzerland, marijuana usage has decreased even though it’s been made legal? And in Spain?

There are those, of course, who argue that if drugs are legal, crime will increase. This is the biggest canard of them all.

Rest assured, if crime is your concern, criminalization is what you should want done away with.

There exists right now a multi-trillion-dollar underworld built up around illegal drugs, which legalizing would instantaneously crush, and which, as it stands, no amount of law, legislation, or litigation can come close to stopping. Why? The law of supply and demand is unstoppable: if there is a demand for something, supply will meet it, no matter what. All the conservative legislation imaginable cannot negate this fact. One might just as well try legislating against the tide.

When cigarettes and alcohol became so staggeringly taxed, do you know what happened? A gigantic black market swept into the country. That meant more crime. People were smuggling in alcohol and cigarettes because these things could be sold much more cheaply on the black market. They still are to this day.

Decriminalizing brings less crime.

For those who believe that if drugs are legalized, your kids are then more likely to use drugs, I urge you to remember that children have brains. Human beings have brains. We can learn, and we can be educated. We can be taught why not to use drugs. If you doubt the effectiveness of this, observe that cigarettes were legal for any age group until fairly recently, and the number of young smokers was sharply decreasing, and had been since the dangers of smoking were made known. Now that it’s illegal, teen smoking is on the rise again, and criminalizing doesn’t help.

Ask any honest school kid if he or she would have trouble getting drugs. Every honest school kid will tell you no. This despite the fact that drugs are illegal.

The inescapable law of supply and demand is why: if there’s a demand, supply will meet it. And no government bureaucracy and no middle-class morality can successfully fight it.

Making something illegal won’t decrease the supply of anything. It will only increase the underworld that provides the supply. This is an economic axiom.

Here’s another:

The only way to decrease supply is to curb demand.

The only way to curb demand is to inform, to educate, to decriminalize.

Each person must choose if he or she wants to use drugs or not, and whether those drugs are legal or illegal has little to do with the choice. There are many things that are legal and that every person has instant access to, but not everyone chooses to partake of. Why so?

The so-called war on drugs is a monumental waste of resources and money; it will continue to be so until the end of time. When something is made illegal, it develops a mystique. It entices. When something is legal, it becomes commonplace and mundane. It becomes no big deal. It is demystified.

Take, for instance, a person who’s grown up in an ultra-sheltered society and compare him or her to a person who’s grown up in the inner city. Now drop them both off in downtown New York, where there are legal XXX shops on every street corner. Who do you think will be more curious? And for whom do you think this will be more of a novelty?

And finally, for all the tax-happy liberals out there, think about this: if you legalize drugs, you can tax the living hell out of them. You can then use that tax money to educate with all your half-assed liberal programs, which benefit the “common good.” What more motivation do you need?

It is often said:

“Legalizing pot might be okay, but legalizing cocaine and Methedrine, no way. I’ve known wealthy, white-collar, healthy, normal, successful businesspeople who’ve gotten so caught up in amphetamines that they’ve never been able to get off. They died. Suicide. OD. They’ve ruined their lives and the lives of their families. No way you should make these drugs legal.”

This is a repackaged version of the legalizing-creates-more-usage argument. It’s the same argument that drugs shouldn’t be legal because look at all the children born severely retarded and deformed because the mothers used crack throughout the pregnancy.

The first thing we must obviously note here is that all this happened (and still happens) even though drugs are illegal. Observe that making them illegal did not prevent these things from happening. Now ask yourself why.

Remember also that cigarettes and alcohol have ruined more lives and more families by far than all amphetamines combined. Should we therefore make alcohol and cigarettes illegal? And if not, why not? If it’s within the proper jurisdiction of government to run our lives, why shouldn’t we illegalize them?

And why, if that is government’s legitimate jurisdiction, draw the line at amphetamines, alcohol, and cigarettes? Why not let government run everything we consume — be it bacon, beer, or brats?

When gin made it into mainstream London, should it have been illegalized because it created such staggering addiction rates and ruined so many thousands of families?

We often hear: since alcohol can be and often is used in moderation, it should therefore be legal, whereas drugs cannot be used in moderation, and so should be illegal.

Leaving aside the questionable verity of such statements, since when did moderation become the standard for legalization versus illegalization? That means, then, among other things, that for all those who can’t use alcohol or tobacco in moderation — for all, in other words, who are addicted (roughly half of all drinkers and more than ninety-five percent of all tobacco users) — these substances should be illegal? But for the rest, fine?

Freedom means you are left alone. It means you are neither helped nor hindered.

In this country, as in any just country, government’s proper role is not to be proscriptive or preventative.

In the words of Frederic Bastiat (1801 – 1850):

The nature of law is to maintain justice. There is in all of us a strong disposition to believe that anything lawful is also legitimate. This belief is so widespread that many persons have erroneously held that things are ‘just’ because the law makes them so (Frederic Bastiat, The Law).

Does Exercise Really Promote Weight Loss?

There’s an old joke lumberjacks still love to tell:

“Why did the train stop?”

Answer: “To let the lumberjack off.”

This quip was coined around the same time that a famous study was conducted. It was a study that measured the caloric intake of lumberjacks, whose appetites are about as notorious as the size of their logs.

It turns out that the caloric intake of a lumberjack is, on average, about 5,000 calories per day.

By comparison, this same study measured the caloric intake of tailors. Tailors, it turns out, consume on average half that: 2,500 calories per day.

It was found in addition that those who change their occupation from light to heavy work, or vice versa, develop corresponding changes in appetite.

All of which is by way of saying that physical activity makes you hungry. Not exactly news, and yet if it’s followed to its conclusion, the ramifications run deep.

The relationship between weight loss and exercise is complex, and no matter what anyone tells you, it is not well understood.

Furthermore, despite prevailing wisdom, despite what you’ve been hammered with all your life, there’s not a shred of real evidence that suggests exercise promotes significant weight loss. As a matter of fact, at one time not so very long ago — up until 1962, to be precise — the medical prescription for obesity was bed rest.

An obesity and diabetes specialist named Russell Wilder, of the Mayo Clinic, lectured famously in 1932 on obesity. Among other things, Mr. Wilder told us that his “fat patients lost more weight with bed rest,” while “unusually strenuous physical exercise slows the rate of weight loss” (Russell Wilder, 1932).

As Wilder and his colleagues reckoned it, “Light exercise burns an insignificant number of calories — amounts that are undone by comparatively effortless changes in diet.”

A University of Michigan researcher named Louis Newburgh calculated, in 1942, that the average man “expends only three calories climbing a flight of stairs. He will have to climb 20 flights of stairs to rid himself of the energy contained in one slice of bread.”

Why then, ask some, don’t we simply skip the stairs and skip the bread? It’s a good question.

These physicians argued that the more taxing the physical activity, the more the appetite increases. And study after study, beginning with those conducted on our previously mentioned lumberjacks and tailors, confirms this.

“Vigorous muscle exercise usually results in immediate demand for a large meal,” said Hugo Rony (not to be confused with Rice-a-Roni, the San Francisco treat), in a 1940 textbook titled Obesity and Leanness. “Consistently high or low energy expenditures result in consistently high or low levels of appetite. Thus men doing heavy physical work spontaneously eat more than men engaged in sedentary occupations.”

Mr. Rony here goes on to speak of our flapjack-eating lumberjacks, and ends, curiously enough, by asking the same question these men repeatedly asked him:

“Why did the train stop?”

Of course, the real question is not why the train did or didn’t stop, but why we’ve come to believe — and believe so overwhelmingly — the exact opposite of what was once the prevailing medical view.

Credit for that belongs to one Jean Mayer, initially of Harvard University, who then went on to become America’s most influential nutritionist.

As an authority on human-weight regulation, Mayer was among the very first of a new breed, a type that has since come to dominate the field. His predecessors — Wilder, Rony, Newburgh and others — had all been physicians who worked closely with obese and overweight patients. Mayer was not. His training was in physiological chemistry; he had obtained a doctorate at Yale with a dissertation on the interrelationship of vitamins A and C in rats. In the ensuing decades, he would publish hundreds of papers on different aspects of nutrition, including why we get fat, but he never had to reduce obese patients as part of his clinical obligation, and so his hypotheses were less fettered by anecdotal or real-life experience.

As early as 1953, after just a few years of research on laboratory mice, Mayer began extolling the virtues of exercise for weight control. By 1959, the New York Times was crediting him with having “debunked the popular theories” that exercise played little role in weight control. Mayer knew the obese often eat no more than the lean and occasionally even less. This seemed to exclude gluttony as a cause of their weight gain, which meant that these fat people had to be less physically active. Otherwise, how could they take in more calories than they expend and so become fat?

Through the sixties, Mayer documented the relationship between inactivity and the overweight. He noted that fat high-school girls ate “several hundred calories less” than lean classmates. “The laws of thermodynamics were, however, not flouted by this finding,” he wrote, because the obese girls expended less energy than the lean. They were much less active; they spent four times as many hours watching television. Mayer also studied infants. “The striking phenomenon is that the fatter babies were quiet, placid babies that had moderate intake,” Mayer reported, “whereas the babies who had the highest intake tended to be very thin babies, cried a lot, moved a lot, and became very tense.” Thus, Mayer concluded, “some individuals are born very quiet, inactive, and placid and with moderate intake get fat, and some individuals from the very beginning are very active and do not get particularly fat even with high intakes” (Gary Taubes, “We Can’t Work it Out”).

Jean Mayer pioneered the exercise and weight-loss practices that many people today consider axiomatic.

Jean Mayer cited “sedentary living” as the “most important factor” in obesity, and, for that matter, all other adverse health conditions appertaining thereunto.

“Modern people,” said Mayer, “are inert compared with their ancestors [who were] constantly engaged in hard physical labor…. The development of obesity is to a large extent the result of the lack of foresight of a civilisation [sic] which spends billions annually on cars, but is unwilling to include a swimming pool and tennis courts in the plans of every school” (Jean Mayer, 1968).

At that time, many doctors and nutritionists disagreed with Mayer’s pronouncements; and even now, a number of very reputable scientists still do.

“It is a common observation that many obese persons are lazy, i.e. they show decreased impulse to muscle activity. This may be, in part, an effect that excess weight would have on the activity impulse of any normal person” (Rony, 1941).

But isn’t it equally possible that obesity and physical inactivity are symptoms of the same cause?

And isn’t it obvious that the more physically active we are, the hungrier we get?

Mayer’s voracious attack on hunger completely masked the logical inconsistencies his arguments contained.

He did at one point acknowledge that “exercise could make us hungrier,” but in the same breath added that “it wasn’t necessarily the case.”

This was the crux of Mayer’s nutritional philosophy.

He alleged a gap in the relationship between appetite and physical activity.

“If,” said Mayer, “exercise is decreased below a certain point, food intake no longer decreases. In other words, walking 30 minutes a day may be equivalent to four slices of bread, but if you don’t walk the half-hour, you still want to eat the four slices.”

This is untrue. And it’s the fatal flaw in his theory. As the lumberjack-tailor study makes very clear, physical activity has a direct and significant bearing on appetite.

And yet from his faulty premise, Mayer, unaware that he was upending the existing worldview on weight loss, waddled forward.

He based this conclusion on two (and only two) of his own studies from the mid-Fifties. The first purported to demonstrate that laboratory rats exercised for a few hours every day will eat less than rats that don’t exercise at all. But this was never replicated. In more recent experiments, the more rats run, the more rats eat; weight remains unchanged. And when rats are retired from these exercise programmes, [sic] they eat more than ever and gain weight with age more rapidly than rats that were allowed to remain sedentary. With hamsters and gerbils, exercise increases body weight and body-fat percentage. So exercising makes these particular rodents fatter, not leaner.

Mayer’s second study was an assessment of the diet, physical activity and weights of workers and merchants at a mill in West Bengal, India. This article is still commonly cited as perhaps the only existing evidence that physical activity and appetite do not necessarily go hand in hand. But it, too, has never been replicated, despite (or perhaps because of) a half-century of improvements in methods of assessing diet and energy expenditure in humans. It helped that Mayer promoted his pro-exercise message with a fervor akin to a moral crusade (Gary Taubes, “We Can’t Work it Out”).

In 1977, coinciding with Mayer’s crusade, the New York Times spoke of the “exercise explosion” that had come about because the conventional wisdom of the sixties that exercise was “bad for you” had been transformed into the “new conventional wisdom — that strenuous exercise is good for you.”

The Washington Post, as well, estimated that “100 million Americans were partaking in the new fitness revolution” — coincident with the start of the current obesity epidemic.

Still, no matter how many billions believe it, the evidence that exercise promotes weight loss has simply never been produced.

My favorite study of the effect of physical activity on weight loss was published in 1989 by a team of Danish researchers. Over the course of 18 months the Danes trained non-athletes to run a marathon. At the end of this training period, the 18 men in the study had lost an average of 5lb of body fat. As for the nine women subjects, the Danes reported, ‘no change in body composition was observed’. That same year, F Xavier Pi-Sunyer reviewed the studies on exercise and weight, and his conclusion was identical to that of the Finnish review’s 11 years later: ‘Decreases, increases, and no changes in body weight and body composition have been observed,’ Pi-Sunyer reported (Ibid).

Here’s the main thing to realize: the relationship between exercise and diet is complicated, but the chemistry behind weight loss is not:

To lose weight, you must simply use more calories per day than you take in. That’s it.

All the hype and all the fad diets and all the panaceas in the world won’t change that. Exercise does burn calories (even if it’s not quite as many as people think), but it also dramatically increases appetite. Thus, as often as not, exercise tips the scales in the wrong direction.
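To make that arithmetic concrete, here is a minimal sketch in Python. It assumes the common rule of thumb that a deficit of roughly 3,500 calories corresponds to about one pound of body fat (an approximation, not a precise physiological constant), and the calorie figures are made-up round numbers:

```python
# Energy-balance arithmetic: a rough sketch, not a physiological model.
# Assumes the rule of thumb that ~3,500 kcal of surplus or deficit
# corresponds to roughly one pound of body fat.

KCAL_PER_POUND_FAT = 3500

def weekly_weight_change(intake_per_day, burned_per_day):
    """Estimated pounds gained (+) or lost (-) per week."""
    daily_surplus = intake_per_day - burned_per_day
    return 7 * daily_surplus / KCAL_PER_POUND_FAT

# A sedentary tailor in balance: 2,500 kcal in, 2,500 kcal out.
print(weekly_weight_change(2500, 2500))  # 0.0 -- weight holds steady

# Add a daily workout burning 300 kcal; if appetite rises by 400 kcal
# (the lumberjack effect), the scales tip the wrong way:
print(weekly_weight_change(2900, 2800))  # +0.2 lb gained per week
```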

That’s the fact, Jack.

Metaphysics: Theory of Everything

Reality is existence, and existence is everything. Every theory of everything must start there.

There’s existence, and there’s essence. These two things are separate but not separable.

In the language of Thomas Aquinas, essence (essentia) is identity: to be, in other words, is to be something.

The conclusion is inescapable because (as Aristotle noted) the only alternative to that which exists is that which does not exist. But that which does not exist doesn’t exist.

“There is no nothing,” said Victor Hugo.

Nothing, by definition, is not something.

The only alternative to reality, therefore, is unreality, which, as the very word implies, is not real — i.e. which isn’t.

These principles form the fundamental laws of metaphysics — metaphysics being the study of ultimate reality (meta for “beyond” and physics for “physical reality”).

New-Age pseudo-philosophy has unfortunately bastardized and perverted the term metaphysics, but please don’t be duped. Nothing is more important than metaphysics. It is the highest part of philosophy, the part from which all others derive, the science of “being as being.”

The universe (paraphrasing Thomas Aquinas) is the sum of everything that exists. That’s what the universe is. That’s not what it may be, and that’s not what some people might think. That’s what the universe actually is.

The universe is everything. There can thus not be “the possibility of many universes,” as many modern physicists would have us believe.

Nor is there anything “beyond the universe”:

If something exists, it is by definition part of the universe.

If it does not exist, it does not exist.

Metaphysically, the fact of existence is the peg upon which the entirety of human knowledge hangs.

Without it, knowledge degenerates into a buccal-fecal carnival of solipsism, skepticism, postmodernism, and relativism.

The proper defense of independent reality is as follows:

Any attempt to deny existence refutes itself at the outset, because even the barest, most laconic denial of existence implies some kind of existence.

Quoting the man Dante called “the master of those who know”:

“Why a thing is itself” is a meaningless inquiry, for the fact or existence of the thing must already be evident … but the fact that a thing is itself is the single reason and the single cause to be given to all such questions as “why is man man” or “the musician musical” (Aristotle, Metaphysics 7.16.1041a15-18).

And again:

He who examines the most general features of existence, must investigate also the principles of reasoning. For he who gets the best grasp of his respective subject will be most able to discuss its basic principles. So that he who gets the best grasp of existing things qua existing must be able to discuss the basic principles of all existence; and he is the philosopher. And the most certain principle of all is that about which it is impossible to be mistaken… It is clear, then, that such a principle is the most certain of all and we can state it thus: “It is impossible for the same thing at the same time to belong and not belong to the same thing in the same respect” (Aristotle, Metaphysics, 1005b12-20).

In support of which, his pupil, Thomas Aquinas, added this:

Nature is what we call everything that can in any way be captured by the intellect, for a thing is not intelligible except through its definition and essence…. All around us are existing things. They are certainly different, but they all exist.

Metaphysically, then, the facts are these:

Existence is everything.

There is no nothing.

Existence is reality.

Reality is what’s real.

Nature is reality.

The universe is everything.

Nature is the universe.

There is no “super-nature.”

All else proceeds from that.

Nuclear Waste Doesn’t Exist

There is no such thing as nuclear waste — and that’s just one of the many beautiful things about nuclear energy.

A nuclear reactor is refueled by its waste.

Quoting Dr. Pierre Guelfe, chief engineer of France’s main nuclear facility, in an interview he gave with William Tucker, author of an excellent book called Terrestrial Energy:

Pierre Guelfe: When the depleted fuel rods are removed, they are shipped to La Hague for reprocessing. They let it cool down for a few years and then remove the uranium and plutonium. They ship the plutonium here. We take it and mix it with another stream of material, which is the scrap that is left over from uranium enrichment. The U235 content of this is very low … U235 is the fissionable isotope. But the plutonium is much more fissionable than the depleted uranium. So when we mix them together, you get a fuel that is very close to enriched uranium. It’s called ‘Mixed Oxide Fuel’ (MOX). We have 20 reactors here in France running on MOX and there are ten more in Germany and two in Switzerland. So we mix pure plutonium and scrap uranium together. We use everything. We don’t leave any waste.

William Tucker: I’ve read this several times but I want to make absolutely sure: The plutonium that comes out of a commercial reactor, that you separate from the fuel rod, that cannot be used to make a bomb, right?

Pierre Guelfe: That’s right. You have four plutonium isotopes: Pu239, Pu240, Pu241 and Pu242. Of the four, only Pu239 can sustain a chain reaction. The others are contaminants. The Pu241 is too highly radioactive. It fissions too fast, so you can’t control it to make a bomb. But you can use all of them to sustain fission in a MOX reactor (source).
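To see how blending gets a fuel “very close to enriched uranium,” here is a back-of-the-envelope sketch of the mixing arithmetic. The percentages are generic, illustrative assumptions, not figures from the interview:

```python
# Rough mass-balance for MOX fuel. All numbers are illustrative assumptions.

pu_fraction = 0.08   # assume the blend is ~8% reactor-grade plutonium
pu_fissile  = 0.65   # assume ~65% of that plutonium is fissile
scrap_u235  = 0.002  # enrichment scrap: assume ~0.2% U235

# Effective fissile share is just the weighted average of the two streams:
fissile_share = pu_fraction * pu_fissile + (1 - pu_fraction) * scrap_u235

print(f"Effective fissile share: {fissile_share:.1%}")
# -> about 5.4%, in the same range as low-enriched uranium (roughly 3-5% U235)
```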

And yet on the basis of some colossal misinformation, the United States now has fifty thousand tons of nuclear “waste,” because our government won’t allow nuclear plants to reuse it.

The stated policy of the Department of Energy (DOE) is “not to reprocess” a perfectly reusable by-product — and all for absolutely no good reason.

That, as I discuss in Chapter 12 of my book, is why Yucca Mountain is unnecessarily, and at great cost, being built in southwestern Nevada: to store a nuclear “waste” that could instead be simply and efficiently reused.

Nuclear “waste,” incidentally, is also used for medical isotopes. In fact, over 40 percent of medicine now is nuclear medicine. Currently, we must import all our nuclear isotopes because we’re not allowed to use any of our own.

This is not only profligate.

It’s a kind of lunacy.

Just for the record: Barack Obama was opposed to nuclear energy before he was for it, presumably because, unlike wind and solar, nuclear energy actually works, and is more efficient by far than any other alternative.

It seems well worth mentioning also that as a direct result of environmentalism’s pathological antipathy toward nuclear energy, these same environmentalists have thereby brought the world 400 million more tons of coal used per year, ever since 1979, when the nuclear reactor at Three Mile Island melted down. Naturally, because environmentalists can’t be bothered by facts, it went completely unnoticed that the containment vessel at Three Mile Island had done its job and prevented any significant release of radioactivity.


Is Charles Bukowski A Great Artist?


A reader writes:

Dear Ray Harvey: Is Charles Bukowski a great artist?

— Billy Badass

Dear Billy Badass: No, he’s not. Bukowski is too sloppy to be a great artist. He lacks vision. He lacks depth and he lacks focus. Reading him, one is reminded of Truman Capote’s criticism of Jack Kerouac’s On the Road: “That’s not writing; it’s typing.”

Here’s what I regard as a quintessential Bukowski poem:

A RADIO WITH GUTS

it was on the 2nd floor on Coronado Street
I used to get drunk
and throw the radio through the window
while it was playing, and, of course,
it would break the glass in the window
and the radio would sit there on the roof
still playing
and I’d tell my woman,
“Ah, what a marvelous radio!”
the next morning I’d take the window
off the hinges
and carry it down the street
to the glass man
who would put in another pane.
I kept throwing that radio through the window
each time I got drunk
and it would sit there on the roof
still playing-
a magic radio
a radio with guts,
and each morning I’d take the window
back to the glass man.
I don’t remember how it ended exactly
though I do remember
we finally moved out.
there was a woman downstairs who worked in
the garden in her bathing suit,
she really dug with that trowel
and she put her behind up in the air
and I used to sit in the window
and watch the sun shine all over that thing
while the music played.

What do you think?

I think Bukowski does possess something — which is to say, he’s not devoid of skill or talent, and I admire his devotion to the sacred art of literature. For one, as you can glimpse above, he does possess a genuine sense for beauty, though it doesn’t consistently come through. And I think he’s genuinely intelligent. His all-time favorite movie was Eraserhead by David Lynch, which I think is interesting. But in the end, Bukowski’s talent remains largely fallow. I like drunken tortured writers as much as anyone, and I am as susceptible as anyone to the romantic quality we see and sense in that sort of writer. Still, I regard Bukowski as over-focused on alcohol and therefore limited.

In short: If you’ve read a few Bukowski poems or stories, you’ve pretty much read them all, and a little Bukowski goes a very long way.


Environmentalism: Cult Of Death


The following is excerpted from Chapter 10 of my book.

Environmentalism, with its attendant army of politicos all armed to the teeth with environmental laws, is, let us make no mistake, the highroad to hell.

Before going all the way green, I urge you to take a longer look into exactly what horse you’re backing here: it may well turn out to be a horse of an entirely different color than you think.

Environmentalism is a philosophy that upholds a profound hatred of humankind:

“Human beings, as a species, have no more value than slugs” (John Davis, editor of Earth First! Journal).

“Mankind is a cancer; we’re the biggest blight on the face of the earth” (president of PETA and environmental activist Ingrid Newkirk).

“If you haven’t given voluntary human extinction much thought before, the idea of a world with no people in it may seem strange. But, if you give it a chance, I think you might agree that the extinction of Homo Sapiens would mean survival for millions, if not billions, of Earth-dwelling species…. Phasing out the human race will solve every problem on earth, social and environmental” (Ibid).

Quoting Richard Conniff, in the pages of Audubon magazine (September, 1990): “Among environmentalists sharing two or three beers, the notion is quite common that if only some calamity could wipe out the entire human race, other species might once again have a chance.”

Environmental theorist Christopher Manes (writing under the nom-de-guerre Miss Ann Thropy): “If radical environmentalists were to invent a disease to bring human population back to ecological sanity, it would probably be something like AIDS.”

Environmental guru “Reverend” Thomas Berry proclaims that “humans are an affliction of the world, its demonic presence. We are the violators of Earth’s most sacred aspects.”

A speaker at one of Earth First!’s little cult gatherings: “Optimal human population: zero.”

“Ours is an ecological perspective that views Earth as a community and recognizes such apparent enemies as ‘disease’ (e.g., malaria) and ‘pests’ (e.g., mosquitoes) not as manifestations of evil to be overcome but rather as vital and necessary components of a complex and vibrant biosphere … [We have] an antipathy to ‘progress’ and ‘technology.’ We can accept the pejoratives of ‘Luddite’ and ‘Neanderthal’ with pride…. There is no hope for reform of industrial empire…. We humans have become a disease: the Humanpox” (Dave Foreman, past head of Earth First!)

“Human happiness [is] not as important as a wild and healthy planet. I know social scientists who remind me that people are part of nature, but it isn’t true. Somewhere along the line we … became a cancer. We have become a plague upon ourselves and upon the Earth…. Until such time as Homo Sapiens should decide to rejoin nature, some of us can only hope for the right virus to come along.” (Biologist David Graber, “Mother Nature as a Hothouse Flower,” Los Angeles Times Book Review).

“The ending of the human epoch on Earth would most likely be greeted with a hearty ‘Good riddance!'”(Paul Taylor, “Respect for Nature: A Theory of Environmental Ethics”).

“If we don’t overthrow capitalism, we don’t have a chance of saving the world ecologically. I think it is possible to have an ecologically sound society under socialism. I don’t think it is possible under capitalism” (Judi Bari, of Earth First!).

“Isn’t the only hope for the planet that the industrialized civilizations collapse? Isn’t it our responsibility to bring that about?” (Maurice Strong, Earth Summit, 1992).

David Brower, former head of the Sierra Club and founder of Friends of the Earth, calls for developers to be “shot with tranquilizer guns.”

Why?

“Human suffering is much less important than the suffering of the planet,” he explains.

Also from the socialist Sierra Club: “The goal now is a socialist, redistributionist society, which is nature’s proper steward and society’s only hope.”

Quoting Barry Commoner, the Citizens Party’s first presidential candidate:

“Nothing less than a change in the political and social system, including revision of the Constitution, is necessary to save the country from destroying the natural environment…. Capitalism is the earth’s number one enemy.”

From Barry Commoner again:

“Environmental pollution is a sign of major incompatibility between our system of production and the environmental system that supports it. [The socialist way is better because] the theory of socialist economics does not appear to require that growth should continue indefinitely.”

So much for your unalienable right to life, liberty, and the pursuit of happiness. Indeed:

“Individual rights will have to take a back seat to the collective” (Harvey Ruvin, International Council for Local Environmental Initiatives, Dade County, Florida).

David Brower, the Sierra Club’s first executive director, pushing for his own brand of eugenics:

“Childbearing [should be] a punishable crime against society, unless the parents hold a government license. All potential parents [should be] required to use contraceptive chemicals, the government issuing antidotes to citizens chosen for childbearing.”

That, if you don’t know, is limited government, environmentalist-style.

“There’s nothing wrong with being a terrorist, as long as you win. Then you write history” (Sierra Club board member Paul Watson).

Again from Paul Watson, writing in that propaganda rag Earth First! Journal: “Right now we’re in the early stages of World War III…. It’s the war to save the planet. The environmental movement doesn’t have many deserters and has a high level of recruitment. Eventually there will be open war.”

And:

“By every means necessary we will bring this and every other empire down! Mutiny and sabotage in defense of Mother Earth!”

Lisa Force, another Sierra Club board member and quondam coordinator of the Center for Biological Diversity, advocates “prying ranchers and their livestock from federal lands. In 2000 and 2003, [Sierra] sued the U.S. Department of the Interior to force ranching families out of the Mojave National Preserve. These ranchers actually owned grazing rights to the preserve; some families had been raising cattle there for over a century. No matter. Using the Endangered Species Act and citing the supposed loss of ‘endangered tortoise habitat,’ the Club was able to force the ranchers out” (quoted from Navigator magazine).

It is a sad fact for environmentalists that in free societies, humans are allowed to trade freely.

Among other things, the right to private property means: that which you produce is yours by right.

Private property is the crux of freedom: you cannot, in any meaningful sense, be said to be free if you are not allowed to use the things that you own, including those things necessary to sustain your life. Everything you need to know about a political ideology is contained in its attitude toward property.

It comes as no surprise therefore to learn that “private property,” in the words of one environmental group, “is just a sacred cow” (Greater Yellowstone Report, Greater Yellowstone Coalition.)

That is also known as socialism.

In 1990, a man named Benjamin Cone Jr. inherited 7,200 acres of land in Pender County, North Carolina. He proceeded to plant chufa and rye for wild turkeys; he conducted controlled burns on his property to improve the habitat for deer and quail. And he succeeded: in no time, that habitat flourished. Inadvertently, however, he attracted a number of red-cockaded woodpeckers, a species listed as endangered. He was warned by a certain governmental agency that, on threat of imprisonment or stiff fines, he was not allowed to disturb any of these trees, which were all on his property. This put 1,560 acres of his own land off-limits to him, the owner. In response, Benjamin Cone Jr. began clear-cutting the rest of his land, saying: “I cannot afford to let those woodpeckers take over the rest of my property. I’m going to start massive clear-cutting.” (Richard L. Stroup, Eco-nomics, pp. 56-57.)

Socialist Eric Schlosser, author of the embarrassing Fast Food Nation, makes no secret of his statist agenda. As Doctor Thomas DiLorenzo points out, Schlosser lauds the “scientific socialists” (a generic term coined by comrade V.I. Lenin) and everything they stand for: government intervention and bureaucracy, public works, job-destroying minimum wage laws, OSHA regulations, food regulations, regulatory agencies to control ranching, farming, and supermarkets, bans on advertising and much more. Only then, he says, will that great day come when restaurants exclusively sell “free-range, organic, grass-fed hamburgers” (Fast Food Nation: The Dark Side of the All-American Meal).

All of which is simply by way of saying that individual consumers should not be allowed to choose what they want to eat, and that the supply of free-range hamburgers should not be determined by demand. Rather, by law, government bureaucrats must do this for us, regardless of whether we personally want to eat organic, grass-fed beef.

Colorado congressman Scott McInnis reported that four firefighters burned to death in Washington state because bureaucrats took 10 hours to approve a water drop. The reason: using local river water is prohibited by the Endangered Species Act, on the grounds that it may threaten a certain kind of trout.

Further proof of the Sierra Club’s hatred of humanity can be found in its 1995 attempt to block an Animas River water diversion project, which was designed to bring water to Durango and the nearby Ute Indian Reservation.

Dams and irrigation are often life-and-death matters in the arid west, a fact of which Sierra is well aware. Thus, after successfully getting the project slashed by more than 70 percent, thereby depriving the Ute Reservation of much-needed water, the Sierra Club lawyers went for the jugular: they demanded the project be cut still more.

Fortunately for the rest of us, they overplayed their hand.

Their shady methods and motives prompted the following quote from Senator Ben Nighthorse Campbell:

“The enviros have never been interested in a compromise. They just simply want to stop development and growth. And the way you do that in the West is to stop water.”

From a chairwoman of the Ute Indian tribe: “The environmentalists don’t seem to care how we live.”

Greenpeace is the world’s largest and wealthiest environmental group. Of their co-founder Dave McTaggart, fellow co-founder Paul Watson said this:

“The secret to David McTaggart’s success is the secret to Greenpeace’s success: It doesn’t matter what is true, it only matters what people believe is true. You are what the media define you to be. Greenpeace became a myth, and a myth-generating machine.”

And since environmentalists believe that citing the source of funding, rather than addressing the actual data, is the only argument one ever needs to refute a counterargument, they should be extraordinarily persuaded by this very partial list of Greenpeace’s funding.

Most people have no inkling that throughout Greenpeace’s tireless campaign against “Frankenfood” (i.e. biotech food – “Frankenfood” is a word coined by Greenpeace campaign director Charles Margulis, who hates technology), the Third World has steadily perished from malnutrition and famine, as a direct result thereof.

Quoting Tanzania’s Doctor Michael Mbwille (of the non-profit Food Security Network):

“Greenpeace prints and circulates lies faster than the Code Red virus infected the world’s computers. If we were to apply Greenpeace’s scientifically illiterate standards [for soybeans] universally, there would be nothing left on our tables.”

(For an example of how to successfully expose Greenpeace’s lies, please read this relevant article.)

Candidly, I haven’t even begun.

And yet from this small sampling, you can probably get an idea of what an exceptionally gracious and non-politically motivated folk these environmentalists and environmental leaders are. Indeed, environmentalism is a benevolent and life-affirming philosophy, and the people who populate it are a kind, non-violent people, whose reasoning is sound and scrupulous, and who believe unreservedly in the individual’s inalienable right to life and property.

There is of course only one real problem with all that: these people are hypocrites, and environmentalism worships at the shrine of death.

The entire movement, replete, as it is, with its politicos and environmental politics, is not simply “wrong.” That would be too easy.

The environmental movement is criminal.

Reader, if you have even a vestigial love of freedom within you, you must denounce environmentalism with all your heart. You must see it for what it actually is: a statist philosophy of human-hatred and enslavement.

Environmentalism is neo-Marxism at its blackest.

More here on the toxicity of environmentalism.

Is Shakespeare All That?

A reader writes:

Dear Ray Harvey: Is Shakespeare all that?

— Slo Readuh

Dear Slo Readuh: No, he’s not all that. He’s all that and more.

It’s impossible to overstate Shakespeare’s genius. Forget that his plots were largely borrowed; forget that he never created a major character who didn’t have significant flaws. None of that is where Shakespeare’s genius lies. As Vladimir Nabokov — who was at times (not consistently) an insightful critic — once expressed it: “The verbal poetical texture of Shakespeare is the greatest the world has ever known, and is immensely superior to the structure of his plays as plays. With Shakespeare, it is the metaphor that’s the thing, not the play.”

Aristotle believed the creation of metaphor to be the highest mark of literary genius. Shakespeare’s metaphoric-poetic skill is virtually bottomless. Even if you were to take away his plays, he’d still rank as one of the greatest sonneteers of all time.

It is, I admit, a tragedy that the Elizabethan language has become to our modern eyes and ears so difficult. Much of Shakespeare does indeed require footnotes, which of course can make for some very rough going. I do understand that. Yet when you push past that — which is to say, when you begin at last to penetrate Shakespeare — you’ll glimpse something you can’t believe.

John Keats was not speaking hyperbolically when he said that Shakespeare “smacks of the divine.” Nor was Samuel Taylor Coleridge when he described Shakespeare as “myriad-minded.” Shakespeare’s words contain the kind of truth that seems otherworldly. Neither is it an accident that Shakespeare is quoted more widely than the Bible.

But it was Herman Melville, who thought Shakespeare “the profoundest of thinkers,” who captured the locus of his genius most appositely of all. Melville said that Shakespeare was “master at the art of telling truth even though it be covertly, and by snatches…. It is those deep far-away things that make Shakespeare, Shakespeare.”

Like Nabokov, Melville believed that it is neither the tragedies nor the comedies that make Shakespeare great: it’s the insights into humanity, which come at you constantly from among his plays and sonnets “like flashes of lightning illuminating the mysteries below.”

That’s not all: Shakespeare’s greatest living admirer (and arguably the world’s best-read human being) the critic Harold Bloom (not to be confused, as he so often is, with that hack Howard Bloom), honestly believes that in creating so many convincing characters, Shakespeare went far in creating our modern-day conception of humanity itself. It is an incredible statement, and yet I, for one, won’t argue it. In Harold Bloom’s own words:

“Shakespeare, who at the least changed our ways of presenting human nature, if not human nature itself, does not portray himself anywhere in his plays.”

Even more interestingly, perhaps, Mr. Bloom goes on to say this:

If I could question any dead author, it would be Shakespeare, and I would not waste my seconds by asking the identity of the Dark Lady or the precisely nuanced elements of homoeroticism in the relationship with Southampton (or another). Naively, I would blurt out: did it comfort you to have fashioned women and men more real than living men and women? (Harold Bloom, Genius, p. 18).

The profundity of his question is, I believe, the truest test of Bloom’s sincerity, for that question is almost frighteningly subtle — and as a writer of fiction, I can personally testify that there is much to that: I am deeply comforted by the women and men I fashion. But apart from the insight it provides us into Bloom’s own brain, it illustrates something even more significant:

It’s long been observed that one of the best measures of literature is when you can discuss the characters of a story or play as if those characters were real people: when you can talk about their personalities; when you can psychologize over them, their choice of careers, their deeds; when you can pick their brains and discuss their addictions and predispositions as if these characters were actual human beings. Many playwrights and novelists, and even many modern-day screenwriters, have created characters that meet this criterion. But no one — and I mean no one — has come close to creating the sheer number of these characters that Shakespeare did. Love these people or hate them, Shakespeare brought to life a gallery of women and men who are completely human — and he did it in a language whose prosody still astounds practitioners.

That is the real power of art, and of Shakespeare.

So no, gentle Slo Readuh, Shakespeare is not overrated. He is, if anything, vastly underrated.

How The American Healthcare Crisis Began


What is now termed modern medicine actually began in the early 1920s, when science — in particular, germ theory — advanced to the point where sickness and disease were at last being treated reliably.

It was then that doctors and hospitals got much better at the business of saving lives. This more highly developed service and expertise raised the value of their work, and they charged accordingly for their increased skill and labor.

And that, really, is when the situation started: when lives can be saved and health can be gained because of developments in technology, everyone suddenly believes that it’s his or her right to have that thing.

We see the same principle at work in, for example, the platitude “No one should go hungry when Americans are throwing away food.”

The error in both cases is the fraudulent notion that survival should be assured. This notion neglects the singular fact that abundance and technology are produced — and produced, moreover, by individuals.

No one has the right to the life and labor (i.e. production) of any individual, including the life and labor of doctors.

An easy way to demonstrate this truth is by asking the following: where was that right before these goods and services were produced or invented?

It is a fact that American medicine is already 50 percent socialized.

It is also a fact that there’s a clear correlation between rising healthcare costs and the socialization of medicine in this country. More government intervention will only compound the problem.

In the 1920s, when advancing healthcare became more expensive (though still very reasonable), the administrator of Baylor Hospital in Dallas, one Dr. Justin Ford Kimball, created a system called Blue Cross. The Blues (so-called) were nonprofit health insurers. They served local organizations like the Rebekahs and the Elks Club, and — please pay attention — they kept their premiums low in exchange for tax breaks.

Tax breaks are one of the main components to our current healthcare crisis. They’re what initially created the problem.

Blue Cross, you see, was successful only because of these tax breaks. Up until then, commercial insurers had always regarded medicine as a mediocre market, and therefore commercial insurers didn’t deal too much in medicine. But when commercial insurers saw that the Blues were making money, it convinced them to enter the medical field. This was not a problem, at first — until the 1940s, when employers increased their efforts to get around wartime wage controls, thus:

During World War II, Franklin Delano Roosevelt’s price-and-wage people, who didn’t generally permit wage increases or price increases (regardless of market forces) sanctioned a form of tax discrimination: specifically, they allowed employers to pay for employee medical insurance with pretax dollars.

This quickly became one of the few ways employers could attract new and better employees, since FDR had actually mandated that employers were no longer permitted to pay out higher wages. (How this ridiculous idea came about is another story, for another time.)

To this day, those who get employer-financed healthcare are purchasing their healthcare coverage with pretax dollars. On the other hand, those who buy their own healthcare are purchasing it with after-tax dollars.

This is a much bigger issue than you might at first realize.
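To make the asymmetry concrete, here is a hypothetical worked example in Python. The 25 percent marginal rate and the $6,000 premium are made-up round numbers, chosen only to illustrate the mechanism:

```python
# Pretax vs. after-tax purchase of the same health coverage.
# The tax rate and premium are illustrative assumptions, not real figures.

marginal_rate = 0.25   # assumed combined marginal tax rate
premium = 6000.0       # assumed annual cost of coverage

# Employer-paid coverage: paid with pretax dollars, so $6,000 of
# compensation buys $6,000 of coverage.
employer_route = premium

# Self-purchased coverage: you must first earn enough gross income
# to be left with $6,000 after taxes.
self_route = premium / (1 - marginal_rate)

print(self_route)                   # 8000.0 gross dollars needed
print(self_route - employer_route)  # 2000.0 -- the penalty for buying your own
```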

As far as the employer was (initially) concerned, this wasn’t any different from additional labor costs — which is to say, medical insurance was not, from the employer’s perspective, any different from a rise in wages, and yet FDR’s price-and-wage control people did not at all see it as a wage increase. They therefore allowed it, which may seem surprising in light of FDR’s desire to control the entire economy.

Likewise, the IRS bureaucrats under FDR did not regard this maneuver as a wage increase, and for this reason they didn’t slap a tax on it. Neither did the employees see it as a real raise in wages — a fact that is central to how this whole horrible precedent was set — because these costs are what economists call hidden costs.

The upshot: people didn’t and very often still don’t know that it is, after all, their own money paying for this prepaid medical coverage, and that medical coverage isn’t free.

In fact, health insurance today isn’t even really health insurance. It’s more properly called prepaid healthcare. But — and this is another crux — it gives the appearance of being free or substantially free to the user, and it therefore substantially increases the demand for it and therefore its cost.

Of course, the root of this whole problem is the misbegotten notion that healthcare is not a good and service to be traded on the open market, but a right.

Let us remember what insurance actually is:

Insurance, properly defined, is what you purchase in order to avoid financial ruin in the case of a rare emergency.

Under the dangerous system FDR created, employees came to regard their healthcare coverage as a kind of blessed phenomenon which came without cause or consequence. Quickly, this phenomenon was absorbed into the working culture and as quickly was taken for granted: employees got used to receiving free goods, which goods, however, were not actually free. Employees just could not see that they were paying for them, and paying for them, furthermore, with pretax dollars.

An uninsured family in the bottom fifth of the income distribution pays about $450 more in taxes than an insured family at the same income level. For uninsured families in the top fifth of the income distribution, the tax penalty is $1,780. On average, uninsured families pay about $1,018 more in federal taxes each year because they do not have employer-provided insurance. Collectively, the uninsured pay about $17.1 billion in extra taxes each year because they do not receive the same tax break as insured people with similar income. If state and local taxes are included, the extra taxes paid by the uninsured exceed $19 billion per year (“Are the Uninsured Freeloaders?” National Center for Policy Analysis, Brief Analysis No. 120).
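Those figures can be sanity-checked against one another. A minimal sketch, using only the numbers quoted above:

```python
# Rough consistency check on the NCPA figures quoted above,
# using only the numbers given in the quotation.

avg_extra_federal_tax = 1_018    # extra federal tax per uninsured family, per year
total_extra_federal = 17.1e9     # collective extra federal taxes, per year

implied_families = total_extra_federal / avg_extra_federal_tax
print(f"Implied uninsured families: {implied_families / 1e6:.1f} million")
# -> about 16.8 million uninsured families, which is the scale
#    the study's figures assume
```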

Among other things, this illustrates again why entitlements are such a deadly precedent: once they’re entrenched, it’s virtually impossible to retrogress. Why? Because people acclimate to entitlements and in no time cannot imagine life without them.

WD-40

WD-40 is a uniquely American invention, created in 1953 by three technicians at the San Diego Rocket Chemical Company. The name WD-40 derives from a project the goal of which was to find a water displacement compound. It took them 40 tries. WD-40 stands for Water Displacement #40.

Initially, the main purpose of WD-40 was as a rust-preventative solvent and degreaser to protect missile parts. The Convair Company bought it in bulk to protect their Atlas missile parts.

Ken East, one of the original founders, maintains to this very day that there is nothing in WD-40 that would hurt you if ingested.

Here are some other uses of WD-40:

1. Protects silver from tarnishing.

2. Removes road tar and grime from cars.

3. Cleans and lubricates guitar strings.

4. Gives floors that just-waxed sheen without making them slippery.

5. Keeps flies off cows.

6. Restores and cleans chalkboards.

7. Removes lipstick stains.

8. Loosens stubborn zippers.

9. Untangles jewelry chains.

10. Removes stains from stainless steel sinks.

11. Removes dirt and grime from the barbecue grill.

12. Keeps ceramic/terra cotta garden pots from oxidizing.

13. Removes tomato stains from clothing.

14. Keeps glass shower doors free of water spots.

15. Camouflages scratches in ceramic and marble floors.

16. Keeps scissors working smoothly.

17. Lubricates noisy door hinges on vehicles and doors in homes.

18. Removes black scuff marks from the kitchen floor. Use WD-40 for those nasty tar and scuff marks on flooring; it doesn’t seem to harm the finish, and you won’t have to scrub nearly as hard to get them off. Just remember to open some windows if you have a lot of marks.

19. Bug guts will eat away the finish on your car if not removed quickly. Use WD-40.

20. Gives a children’s playground gym slide a shine for a super fast slide.

21. Lubricates gear shift and mower deck lever for ease of handling on riding mowers.

22. Rids kids’ rocking chairs and swings of squeaky noises.

23. Lubricates tracks in sticking home windows and makes them easier to open.

24. Spraying an umbrella stem makes it easier to open and close.

25. Restores and cleans padded leather dashboards in vehicles, as well as vinyl bumpers.

26. Restores and cleans roof racks on vehicles.

27. Lubricates and stops squeaks in electric fans.

28. Lubricates wheel sprockets on tricycles, wagons, and bicycles for easy handling.

29. Lubricates fan belts on washers and dryers and keeps them running smoothly.

30. Keeps rust from forming on saws and saw blades, and other tools.

31. Removes splattered grease on stove.

32. Keeps bathroom mirror from fogging.

33. Lubricates prosthetic limbs.

34. Keeps pigeons off the balcony (they hate the smell).

35. Removes all traces of duct tape.

36. Folks even spray it on their arms, hands, and knees to relieve arthritis pain.

37. Florida’s favorite use: ‘cleans and removes love bugs from grills and bumpers.’

38. The favorite use in the state of New York: protecting the Statue of Liberty from the elements.

39. WD-40 attracts fish. Spray a little on live bait or lures and you will be catching the big one in no time. Also, it’s a lot cheaper than the chemical attractants that are made for just that purpose. Keep in mind, though, that chemically laced baits or lures are not allowed in some locations.

40. Use it for fire ant bites. It takes the sting away immediately and stops the itch.

41. WD-40 is great for removing crayon from walls. Spray on the mark and wipe with a clean rag.

42. Also, if you’ve discovered that your teenage daughter has washed and dried a tube of lipstick with a load of laundry, saturate the lipstick spots with WD-40 and rewash. Presto. The lipstick is gone.

43. If your car won’t start on a damp morning, spray WD-40 on the distributor cap; it will displace the moisture and allow the car to start.

As you can see, WD-40 makes a fine, fine lubricant, though it is not necessarily recommended that you use it to lube up a vagina, or whatever.

The basic ingredient in WD-40?

Fish oil.

The Great Abraham Lincoln Myth

Abraham Lincoln, the sixteenth President of the United States, who on February 12th turned 201 years old, was a devoted and lifelong white supremacist, and remained so up until the day he died.

Nor did he waver in his staunch advocacy of colonization — which is the deportation of black people from the United States.

As Lincoln himself expressed it:

“Negroes have natural rights, however, as other men have, although they cannot enjoy them here … no sane man will attempt to deny that the African upon his own soil has all the natural rights that instrument vouchsafes to all mankind.”

In his 1858 debate with Judge Stephen A. Douglas, Abraham Lincoln stated:

“Judge Douglas has said to you that he has not been able to get from me an answer to the question whether I am in favor of Negro citizenship…. I tell him very frankly that I am not in favor of Negro citizenship.”

And later in the debate:

There is a physical difference between the white and black races which I believe will for ever forbid the two races living together on terms of social and political equality. And inasmuch as they cannot so live, while they do remain together there must be the position of superior and inferior, and I as much as any other man am in favor of having the superior position assigned to the white race.

Don’t believe it? Then don’t read Lerone Bennett Jr.’s 662-page book on the subject, nor the excellent review of that book by Thomas DiLorenzo:

Bennett is incensed by the fact that Lincoln never opposed Southern slavery but only its extension into the territories. Indeed, in his first inaugural address [Lincoln] pledged his everlasting support for Southern slavery by making it explicitly constitutional with the “Corwin Amendment,” that had already passed the U.S. House and Senate.

The reason Lincoln gave for opposing the extension of slavery was, in Lincoln’s own words, that he didn’t want the territories to “become an asylum for slavery and [N-word, plural].” He also said that he didn’t want the white worker to be “elbowed from his plow or his anvil by slave [N-word, plural].” It was all economics and politics, in other words, and not humanitarianism or the desire to “pick the low-hanging fruit” by stopping slavery in the territories.

Lincoln not only talked like a white supremacist; as a state legislator he supported myriad laws and regulations in Illinois that deprived the small number of free blacks in the state of any semblance of citizenship. Bennett gives us chapter and verse of how he supported a law that “kept pure from contamination” the electoral franchise by prohibiting “the admission of colored votes.” He supported the notorious Illinois Black Codes that made it all but impossible for free blacks to earn a living; and he was a “manager” of the Illinois Colonization Society that sought to use state tax revenues to deport blacks out of the state. He also supported the 1848 amendment to the Illinois constitution that prohibited the immigration of blacks into the state. As president, he vigorously supported the Fugitive Slave Act that forced Northerners to hunt down runaway slaves and return them to slavery for a bounty. Lincoln knew that this law had led to the kidnapping of an untold number of free blacks who were thrown into slavery.

Quoting the man whom Lincoln himself put in charge of “Negro emigration” (i.e. deportation):

“[Abraham Lincoln] remained a colonizationist and racist until his death.”

What Causes Such Shocking Poverty?

Did you know there’s never been a real famine in the United States?

One thing alone is responsible for that fact, and that one thing is this:

Private property rights.

It is the absence of fully protected property that creates poverty.

As the brilliant Peruvian economist Hernando de Soto puts it in his book — which I highly recommend — The Mystery of Capital:

Many of the poorest countries in the world possess enormous amounts of capital, but their ownership is insecure because of faulty or nonexistent property law or property rights protection. The value of private savings in the ‘poor’ countries of the world is forty times the amount of foreign aid they have received since 1945. [The citizens of poorer countries] hold these resources in defective forms: houses built on land whose ownership rights are not adequately recorded, unincorporated businesses with undefined liability, industries located where financiers and investors cannot see them. Because the rights to these possessions are not adequately documented, these assets cannot readily be turned into capital, cannot be traded outside of narrow local circles, cannot be used as collateral…

(This, incidentally, is also the fundamental reason that the Native American Indian Reservations exist in such a horrific state of grinding poverty: our good progressive government — right and left — doesn’t allow Native Americans to own property; i.e. they exist by governmental permission.)

Compare that to property laws in the west where, says de Soto, “every parcel of land, every building, every piece of equipment, or store of inventories is represented in a property document that is the visible sign of a vast hidden process that connects all these assets to the rest of the economy” (Ibid).

Private property is the crux of prosperity.

Please make no mistake about that.

And property, never forget, is nothing more, or less, than an extension of person.

The cornerstone of all socialist-Marxist theory, on the other hand, is, as Karl Marx himself famously put it, “the abolition of private property.”

When will this monstrous ideology and its legions of proponents and practitioners at last be held accountable for creating the shocking poverty we see in the photo above?

When?

[Laissez-faire] stands alone as the only feasible way to rationally organize a modern economy. At this moment in history, no responsible nation has a choice (ibid).

The Great Overpopulation Myth

The population of the entire world could fit shoulder-to-shoulder in a space about the size of Jacksonville, Florida.

Ninety-seven percent of the earth’s land surface is empty.

If you allotted to each person 1,250 square feet (which is quite a bit), all the people in the world would fit into the state of Texas.
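Both of those claims are easy to check with grade-school arithmetic. A minimal sketch; the land areas (Texas at roughly 268,600 square miles, Jacksonville at roughly 875), the world population of about 6.8 billion, and the 2.5 square feet per standing person are all assumed figures:

```python
# Sanity-checking the land-area claims above. All inputs are
# approximate figures, assumed for illustration.

SQFT_PER_SQMI = 5_280 ** 2    # square feet per square mile

world_pop = 6.8e9             # assumed world population
texas_sqmi = 268_600          # approximate land area of Texas
jacksonville_sqmi = 875       # approximate area of Jacksonville, FL

# Texas claim: 1,250 square feet allotted to each person.
texas_capacity = texas_sqmi * SQFT_PER_SQMI / 1_250
print(f"Texas holds {texas_capacity / 1e9:.1f} billion people at 1,250 sq ft each")

# Jacksonville claim: shoulder-to-shoulder, ~2.5 sq ft per person.
standing_room_sqmi = world_pop * 2.5 / SQFT_PER_SQMI
print(f"Standing room for everyone: {standing_room_sqmi:,.0f} sq mi "
      f"(Jacksonville covers about {jacksonville_sqmi})")
```

The Texas figure comes out to about six billion people, so that claim dates from a six-billion world; the standing-room figure comes out to roughly 600 square miles, comfortably inside Jacksonville.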

According to the Food and Agriculture Organization, world food supplies exceed requirements in all world areas, amounting to a surplus approaching 50% in 1990 in the developed countries, and 17% in the developing regions.

Problems commonly blamed on ‘overpopulation’ are the result of bad economic policy. For example, Western journalists blamed the Ethiopian famine on ‘overpopulation,’ but that was simply not true. The Ethiopian government caused it by confiscating the food stocks of traders and farmers and exporting them to buy arms. That country’s leftist regime, not its population, caused the tragedy. In fact, Africa, beset with problems often blamed on ‘overpopulation,’ has only one-fifth the population density of Europe, and has an unexploited food-raising potential that could feed twice the present population of the world, according to estimates by Roger Revelle of Harvard and the University of San Diego. Economists writing for the International Monetary Fund in 1994 said that African economic problems result from excessive government spending, high taxes on farmers, inflation, restrictions on trade, too much government ownership, and over-regulation of private economic activity. There was no mention of overpopulation.

The government of the Philippines relies on foreign aid to control population growth, but protects monopolies which buy farmers’ outputs at artificially low prices, and sell them inputs at artificially high prices, causing widespread poverty. Advocates of population control blame “overpopulation” for poverty in Bangladesh. But the government dominates the buying and processing of jute, the major cash crop, so that farmers receive less for their efforts than they would in a free market. Impoverished farmers flee to the city, but the government owns 40% of industry and regulates the rest with price controls, high taxes and unpublished rules administered by a huge, corrupt, foreign-aid dependent bureaucracy (Dr. Jacqueline R. Kasun).

The world’s population is expected to max out at around 8 billion by 2050. Then it starts to decline.

That’s when the real trouble begins.

How Did Slavery Ever Become A Legal Institution?

In the beginning, and for several decades afterward, slavery was not primarily a governmental institution, neither in Europe nor in the United States.

Initially, the enslavement of Africans was almost all done privately.

There were, to be sure, a handful of governmental charters, but in the early days the preponderance of slaves were traded by private entrepreneurs, who exchanged rum, spices, and other items with tribal chiefs for Africans whom those same chiefs had already enslaved. In essence, these slaves were relocated.

Make no mistake, however: the European traders were indeed responsible for perpetuating that barbaric institution; but they were not the people responsible for “enslaving the tribe that had lost a war or the man who had fallen into debt or the child sold by the family,” as historian Roger McGrath put it. That blame goes directly to the tribal African chiefs.

In fact, for a very long time slavery was not recognized as a legal institution in the colonies of this country. Thus, the first Africans weren’t, strictly speaking, slaves but rather indentured servants.

The fact of slavery becoming a legalized institution in the United States was actually brought about by a black man named Anthony Johnson, himself an erstwhile slave back in Africa and then an indentured servant in the American colonies. After his indentured servitude had expired, Mr. Johnson was granted land in Virginia, where he subsequently acquired several indentured servants of his own, among them one John Castor, an African who had been sold to him while already in the American colonies.

It was these same men, John Castor and Anthony Johnson, both black, who were initially responsible for the institution of slavery becoming recognized legally in this country.

When John Castor’s years of indentured servitude were finished, he was not immediately granted his freedom. And so he sued for it, as well he should have, as you and I would have too.

But Anthony Johnson, his owner, fought back, alleging in court that John Castor had never entered into what was called a “contract of indenture” but had been bought outright as a slave in Africa. In a landmark decision, in 1654, the high court of the colony of Virginia found in Anthony Johnson’s favor, pronouncing that “John Castor was a servant for life.”

Chilling words, which no human should ever have to hear.

This was a monumental and precedent-setting case, later cited to weariness by the Southern colonies, so that slavery was soon officially institutionalized.

The fact that two black men are in large part the authors of American slavery is a piece of American history well worth teaching, no matter how postmodern the curriculum.

It is also a fact that black Americans held slaves all throughout the Civil War.

“In 1860, some 3,000 blacks owned nearly 20,000 black slaves. In South Carolina alone, more than 10,000 blacks were owned by black slaveholders. Born a slave in 1790, William Ellison owned 63 slaves by 1860, making him one of Charleston’s leading slaveholders. In the 1850 census for Charleston City, the port of Charleston, there were 68 black men and 123 black women who owned slaves. In Louisiana’s St. Landry Parish, according to the 1860 census, black planter Auguste Donatto owned 70 slaves and farmed 500 acres of cotton fields” (“Slavery’s Inconvenient Facts,” Chronicles, November 2001).

In terms of total population, the majority of people of either color did not own slaves in the South. In fact, “75 percent of Southerners neither owned slaves themselves nor were members of families who did” (Ibid).

Logical Fallacies

A reader writes:

Dear Ray: I’ve always been told it’s better to be shot at and missed than shit at and hit. While getting shit on obviously does suck, getting shot at means someone doesn’t like you enough to want to shoot at you in the first place. So is it really better?

Scatman

Dear Scatman: I’m afraid your question contains a logical fallacy which I cannot let pass by without at least partially fleshing out. But that doesn’t make it a total waste. You, sir, have committed the fallacy of insufficient feculence — not nearly as egregious as, for example, equivocating on the critical issue of pulling out.

I pray, sir, that this doesn’t sound like a load of crap to you, and please don’t cut me off before we finish our business here, but you simply cannot reasonably infer that “getting shot at [and missed] means that someone doesn’t like you enough to want to shoot at you in the first place.” That’s just BS. It’s also hasty. And only an adversary epistemology advocates haste. In fact, the person shooting at you may very much want to put a slug in your guts, but he may just be a bad shot — for instance, because he has no stool upon which to rest his gun, or perhaps there’s too much movement in other ways.

In any case, the answer to your loaded question is unequivocal: it is indeed far better to be shot at and missed. And that’s no shit.

Follow up question:

Dear Ray: I read your response to Scatman, and I thought it was rock-solid advice. So I thought I’d write in with a question of my own, along somewhat similar lines:

Is it OK to put Germ X (or Purell) on my anus?

Red Button

Dear Red Button: Man, what is with you assholes? Your question is ambiguous — another logical fallacy. The answer depends upon what you mean by “OK.” If by “OK” you mean peculiar, then, yes, it is definitely “OK.” And if by “OK” you mean potentially pathophobic, mysophobic, or otherwise inordinately concerned with personal hygiene, then, yes, it is definitely “OK.” But if by “OK” you mean perfectly safe, we run aground.

You see, ethyl alcohol, which is what these hand sanitizers use to kill germs, has indeed been known to cause problems: namely, the problem of pruritus, which, like writing a symphony (according to Brahms), “is no joke.” Pruritus is a rare side-effect, however, and so I imagine that your anus (insofar as I’m able to imagine your anus at all, which isn’t, thank heavens, much) will probably be, as you say, “OK.” If you do go that route, though, Ray recommends using an aloe-vitamin-e-moisturizing variety of sanitizer, thereby killing two birds with one stone. Why not?