Postmodernism: The Destruction of Thought

[Note: The following appeared, in slightly altered form, in a previous article, but I’ve added a new beginning.]

The only real way that knowledge and human progress can be derailed is by the systematic rejection of inductive reasoning, which forms the underpinnings not just of all science and the scientific method, but of the entirety of human apprehension.

No scientist — whether researcher or practitioner or both; whether biologist, chemist, physicist, geologist, climate scientist, or any other — can pursue knowledge without first having a view of what knowledge is and how that knowledge is acquired.

All scientists, therefore, whether they know it explicitly or not, need a theory of knowledge.

This theory must come from the most fundamental science: the science of philosophy.

The science of knowledge specifically belongs to that branch of philosophy called epistemology.

Epistemology — from the Greek word episteme, which means “knowledge” — is an extraordinarily complicated discipline that begins with three simple words: consciousness is awareness.

All scientists, I repeat, need a theory of knowledge, and this theory of knowledge subsequently affects every aspect of a scientist’s approach to her research — from the questions she asks, to the answers she finds, to the hypotheses and theories she then develops and builds upon.

Very rare geniuses like Galileo and Newton and perhaps even Kepler (who, for all his mathematical brilliance and tireless work, held to a deeply flawed metaphysical viewpoint) were ferociously innovative in epistemology as well as physics — specifically, in systematizing and codifying the core principles of the inductive method, which all three came to through their scrupulous use of scientific experiment.

Induction more than anything else — including deduction — is the method of reason and the key to human progress.

A proper epistemology teaches a scientist, as it teaches everyone else concerned with comprehension and actual learning, how to exercise the full power of the human mind — which is to say, how to reach the widest abstractions while not losing sight of the specifics or, if you prefer, concretes.

A proper epistemology teaches how to integrate sensory data into a step-by-step pyramid of knowledge, culminating in the grasp of fundamental truths whose context applies to the whole universe. Galileo’s laws of motion and Newton’s laws of optics, as well as his laws of gravity, are examples of this. If humans were one day to travel to a sector of the universe where these laws did not hold true, that still wouldn’t invalidate them here. The context here remains. In this way, knowledge expands as context grows. The fact that all truths are by definition contextual does not thereby invalidate absolute truth and knowledge, but just the opposite: context is how we measure and validate truth.

Epistemologically, postmodernism is the rejection of this entire process.

Postmodernism, in all its vicious variations, is a term devoid of any real content, and for this reason general dictionaries and philosophical dictionaries alike offer very little help in defining it.

And yet postmodernism has today become almost universally embraced as the dominant philosophy of science — which is the primary reason that science crumbles before our eyes under its corrupt and carious epistemology.

Postmodernism, like everything else, is a philosophical issue. Accordingly, postmodernism’s tentacles have extended into every major branch of philosophy — from metaphysics, to epistemology, to esthetics, to ethics, to politics, to economics.

In order to get any kind of grasp on postmodernism, one must first grasp that postmodernism doesn’t want to be defined. Its distinguishing characteristic is its dispensing with all definitions — because definitions presuppose a firm and comprehensible universe. Accurate definitions are guardians of the human mind against the chaos of psychological disintegration.

One must understand next that postmodernism is a revolt against the philosophical movement that immediately preceded it: Modernism.

We’re told by postmodernists today that modernism and everything it stands for is dead.

Thus, whereas modernism preached the existence of independent reality, postmodernism preaches anti-realism, solipsism, and “reality” as a term that always requires quotation marks.

Whereas modernism preached reason and science, postmodernism preaches social subjectivism and knowledge by consensus.

Whereas modernism preached free will and self-governance, postmodernism preaches determinism and the rule of the collective.

Whereas modernism preached the freedom of each and every individual, postmodernism preaches multiculturalism, environmentalism, coercive egalitarianism, and social justice.

Whereas modernism preached free markets and free exchange, postmodernism preaches Marxism and its little bitch: statism.

Whereas modernism preached objective meaning and knowledge, postmodernism preaches deconstruction and no-knowledge — or, if there is any meaning at all (and there’s not), it’s subjective and ultimately unverifiable.

In the words of one of postmodernism’s high priests, Michel Foucault: “It is meaningless to speak in the name of — or against — Reason, Truth, or Knowledge.”

Why?

Because according to Mr. Foucault again: “Reason is the ultimate language of madness.”

We can thus define postmodernism as follows:

It is the philosophy of absolute agnosticism — agnosticism in the literal sense of the word — meaning: a philosophy that preaches the impossibility of human knowledge.

What this translates to in day-to-day life is pure subjectivism, the ramifications of which, in the area of literature, for example, are: no meaning, completely open interpretation, unintelligibility.

Othello, therefore, is as much about racism and affirmative action as it is about jealousy.

Since there is no objective meaning in art, all interpretations are equally valid.

Postmodernism is anti-reason, anti-logic, anti-intelligibility.

Politically, it is anti-freedom. It explicitly advocates leftist, collectivist neo-Marxism and the deconstruction of industry, as well as the dispensing with inalienable rights to property and person.

There is, however, a profound and fatal flaw built into the very premise of postmodernism, a flaw that makes postmodernism impossible to take seriously and very easy to reject:

If reason and logic are invalid and no objective knowledge is possible, then the whole pseudo-philosophy of postmodernism is also invalidated.

One can’t use reason and the reasoning process, even in a flawed form, to prove that reason is false.

Free Will

A reader writes:

Dear Ray Harvey: Can you prove that humans possess the faculty of choice?

— Waffling

Dear Waffling: Yes, I can. And so can you. But first let me point something out: in the same way that you could never conceive of dreaming if you were never awake, so you could never conceive of choice if you were never actually able to choose. In other words, the fact that you’re able to conceive of volition at all goes a very long way toward proving it. Choice is an inherent — and definitional — part of the rational faculty, which humans alone possess.

While you’re at your computer right now, think of some small and inconsequential task you could perform — moving your mouse arrow to a certain quadrant of the screen, for instance, or tapping your spacebar once — but do not actually do it. Whatever the small task is that you conceive of, fix it in your mind for a moment. Observe yourself. Observe that in this moment you can choose to perform that small task, or you can choose not to perform it. You have an alternative, and your will alone determines the outcome. Observe that what determines your choice is your decision to do it or not. That decision is your freedom of will. Quoting (again) the philosophical psychologist Rollo May:

“When we analyze will with all the tools that modern psychology brings us, we shall find ourselves pushed back to the level of attention or inattention as the seat of will. The effort which goes into the exercise of will is really effort of attention; the strain in willing is the effort to keep the consciousness clear, i.e. the strain of keeping attention focused” (Rollo May, Love and Will, 1969).

Now, Waffling, decide, one way or the other, and then follow through with your decision.

That’s all the proof you need: direct observation. All knowledge starts and ends with observation.

Logical Fallacies

A reader writes:

Dear Ray: I’ve always been told it’s better to be shot at and missed than shit at and hit. While getting shit on obviously does suck, getting shot at means someone doesn’t like you enough to want to shoot at you in the first place. So is it really better?

Scatman

Dear Scatman: I’m afraid your question contains a logical fallacy which I cannot let pass by without at least partially fleshing out. But that doesn’t make it a total waste. You, sir, have committed the fallacy of insufficient feculence — not nearly as egregious as, for example, equivocating on the critical issue of pulling out.

I pray, sir, that this doesn’t sound like a load of crap to you, and please don’t cut me off before we finish our business here, but you simply cannot reasonably infer that “getting shot at [and missed] means that someone doesn’t like you enough to want to shoot at you in the first place.” That’s just BS. It’s also hasty. And only an adversarial epistemology advocates haste. In fact, the person shooting at you may very much want to put a slug in your guts, but he may just be a bad shot — for instance, because he has no stool upon which to rest his gun, or perhaps there’s too much movement in other ways.

In any case, the answer to your loaded question is unequivocal: it is indeed far better to be shot at and missed. And that’s no shit.

Follow-up question:

Dear Ray: I read your response to Scatman, and I thought it was rock-solid advice. So I thought I’d write in with a question of my own, along somewhat similar lines:

Is it OK to put Germ X (or Purell) on my anus?

Red Button

Dear Red Button: Man, what is with you assholes? Your question is ambiguous — another logical fallacy. The answer depends upon what you mean by “OK.” If by “OK” you mean peculiar, then, yes, it is definitely “OK.” And if by “OK” you mean potentially pathophobic, mysophobic, or otherwise inordinately concerned with personal hygiene, then, yes, it is definitely “OK.” But if by “OK” you mean perfectly safe, we run aground.

You see, ethyl alcohol, which is what these hand sanitizers use to kill germs, has indeed been known to cause problems: namely, the problem of pruritus, which, like writing a symphony (according to Brahms), “is no joke.” Pruritus is a rare side-effect, however, and so I imagine that your anus (insofar as I’m able to imagine your anus at all, which isn’t, thank heavens, much) will probably be, as you say, “OK.” If you do go that route, though, Ray recommends using an aloe and vitamin E moisturizing variety of sanitizer, thereby killing two birds with one stone. Why not?

Proof Of God?

A reader writes:

Dear Harvey Ray: Is there proof of God? Can science prove that God doesn’t exist?

Signed,

Hopelessly Devoted

Dear Hopelessly Devoted: No, science cannot. In fact, nothing can. Yet we can be certain that God doesn’t exist — by virtue of the very nature of proof.

The meaning of proof precludes proving something for which there is no evidence.

God is primarily of metaphysical and ethical import. Proof, on the other hand, is epistemological.

Proof is an overwhelming preponderance of evidence that admits no alternative.

Proof, by definition, requires evidence. Indeed, proof is evidence.

For this reason, the attempt to prove something for which there is no evidence is a contradiction in terms. The philosophy of science presupposes this principle, but historically, up to the present day, it’s been poorly defended.

You’ve no doubt heard the platitude: “You can’t prove a negative.”

The reason this statement contains a kernel of truth is that proof requires data, as opposed to an absence of data. And that is why the burden of proof falls upon the person making the claim.

If, for example, you claim that little green men exist inside the human brain, and that these green men are responsible, through an intricate process of lever-pulling, for human consciousness, it is you who must prove this — by providing data — and not us who must disprove it.

What you’re really referring to in your excellent question is a thing epistemologists call evidentialism, or the law of the arbitrary.

If the onus of proof were on me, for instance, to prove that these little green men didn’t exist, what, may I ask, do you think that would entail?

I’ll tell you:

Among other things, it would entail that anyone could say whatever he wanted about anything, regardless of data, and I’d have to spend the rest of my life trying to prove him wrong without any data, while he sat back and fabricated more arbitrary claims. And, indeed, many people do just that.

Fortunately for the human race, this is not how the reasoning process actually works.

The proper response to these claims is simply to dismiss them categorically for what they actually are: neither true nor false, but whimsical — that is, arbitrary — until some hard evidence is put forth. But the evidence must come first, before the claim.

That is what you must always remember.

Evidence constitutes proof.

Merely claiming does not constitute evidence; that’s too easy.

Thus, if you claim God or if you claim green men, it is you, not me, who must produce the data.

Epistemologically, there’s no significant difference between green men, God, the Great Spirit, or, for that matter, Grendel.

Which is why, for the mystically inclined, fideism is the best bet, although fideism too runs spectacularly aground, in ways less epistemological, perhaps, but clearly more dramatic.

Global Warming

Politically, global warming and climate change have little if anything to do with climate science, and the fact that this subject has become such an overwhelming political issue is a fine testament to how poorly the world understands the legitimate functions of government, and why those functions are legitimate.

Indeed, it turns out that the whole anthropogenic global warming (AGW) position can be easily defused without any reference to science at all, because the error, at root, is epistemological.

The truth about global warming that many don’t want to hear is that it’s become so polarized only because it’s turned political. The essentials of the subject have thereby been swallowed up in a murky ocean of misinformation, equivocation, and propaganda.

Let us start by defining terms:

Statism is concentrated state authority; it refers to a government that believes it has legitimate power to any extent over individual rights and freedom of trade.

Opposition to laissez-faire capitalism derives in part from ethics, but even more fundamentally from the science of epistemology.

Ethically, the fundamental political question is this: are humans free by nature?

The answer to that depends upon the answer to an even deeper question: why (if at all) are humans free by nature?

And the answer to that is epistemological.

The human brain – to address the latter query first – is individuated and rational by nature; because of this, man by nature possesses the faculty of choice.

Rationality is choice.

And choice presupposes the freedom to choose. This is the locus of the inseparable, indivisible link between reason and rights. Ultimately it is only the individual who can exercise the power of volition, or not. Government bureaus cannot. The state cannot. The collective cannot. Only the individuals who make up these entities can.

If humans did not possess the faculty of choice, humans would be neither moral nor immoral but amoral, just as animals for this very reason are amoral.

But human action is chosen.

This, then, is what finally gives rise to the fact of human freedom as an epistemological necessity.

It’s also what it means to say that humans are free by nature: we are born with a cognitive faculty that gives us the power of choice; since this faculty is the primary method by which we thrive and keep ourselves alive, we must (therefore) be left free to exercise that faculty — and leave others likewise free.

This is a form of contractarianism.

Please note that this is not just some esoteric theory on how human freedom could conceivably be defended: the rights of each individual are demonstrably rooted in man’s cognitive quiddity – and for this precise reason, without an accurate and thorough understanding of man’s epistemological nature, human freedom can never be fully understood.
Or defended.

In the words of Samuel Adams:

“Rights are evident branches of, rather than deductions from, the duty of self-preservation, commonly called the first law of nature.”

And Claude-Frédéric Bastiat:

“For what are our faculties but the extension of our individuality? And what is property but an extension of our faculties? … Man can live and satisfy his wants only by ceaseless labor, and by the ceaseless application of his faculties to natural resources.”

It is precisely the lack of epistemological grounding that has made rights and therefore human freedom vulnerable throughout all of history.

The evolution of the human brain created rights; it happened at the exact moment when this same evolution created a rational animal called a human being – which is to say, when nature created the capacity of free will.

Philosophy, then, being the most general science, unifies facts from all disciplines into an indivisible whole.

Thus, without proper philosophical underpinnings, scientific facts, no matter how airtight they are, remain unincorporated.

It is this point that provides us with the real and final connection between global warming and individual rights; for the provenance of rights, including private property rights and the freedom to trade that property, is found ultimately in man’s freedom of will, and it is only statist politics – also known as coercive government – that can with impunity negate the individual’s natural rights.

It does so through force, either directly (as in physical expropriation or imprisonment), or indirectly (as in compulsory taxation or fines).

The statist politics that the AGW position explicitly calls for are in this way antithetical to the methods by which the human brain and the human species properly function and flourish.

That is the fundamental argument against statism, in any of its multifarious guises. It is a foolproof argument, and it is the first and strongest line of defense: because each and every individual is free by nature, we are free to, in Adam Smith’s words, “truck, barter, and exchange.”

But there’s much more to it than this.

It must first of all never be forgotten that the philosophy of science is only a species of philosophy proper.

This has crucial ramifications.

Science is the systematic gathering of data through observation and reason.

Science is built upon knowledge, and knowledge is built upon reason.

Reason derives from the nature of the human mind, for man is the rational animal.

Epistemology – one of the two main branches of philosophy – is the science of knowledge.
Epistemology, therefore, studies the nature of reason.

In this way, all science is hierarchically dependent upon epistemology.

In the realm of human conviction, there exist at any given time only three primary alternatives: possible, probable, and certain.

Possible is when some evidence exists, but not much.

Probable is when a lot of evidence exists, but not all.

Certain is when the evidence is so overwhelming that no other conclusion is possible.

Obviously, then, what constitutes possible, probable, or certain is the amount of evidence and the context of knowledge within which that evidence is found.

To conclude “certain,” or even “over 99 percent certain” (to quote James Hansen of NASA), requires a sufficient knowledge of all relevant data and all potentially relevant data.

This is as true in a scientific laboratory as it is in a court of law.

It means that nothing – the complexity of clouds, for instance, or aerosols, deep ocean currents, cosmic rays, sunspots, et cetera – nothing is poorly understood, or insufficiently understood.

It means that the science has matured to such a degree that our knowledge of it is complete or near-complete – so much so, at any rate, that there is essentially very little left to learn.

It means that because the evidence is so great, the conclusion admits no doubt.

It means, moreover, that the data-gathering process is not biased or influenced in any way by anything extracurricular, like activism.

Such is the nature of certainty.

From an epistemological standpoint, certainty means absolute.

And yet it’s many of these same AGW scientists who, today, under the insidious influence of postmodernism, assure us that there are no absolutes in science – “science doesn’t deal in truth, but only likelihood,” to quote another NASA scientist, Gavin Schmidt.

Truth is only relative, you see.

Quantum physics and thermodynamics have “proven” that the only certainty is that nothing is certain; definitions are purely a question of semantics; a unified philosophy is “circular reasoning” (or, at best, “system-building”); all moral law and all social law are subjective and unprovable.

The mind, in short, cannot know anything for certain. Yet AGW is virtually certain.
These are all epistemological assertions.

Syllogistically, the entire anthropogenic global warming position can be recapitulated in this way:

Global warming is man-made. Man is ruled by governments. Therefore, government bureaus, centralized planning committees, and more laws are the only solution.

In philosophy, this is called a non sequitur.

It does not follow.

It’s far too hasty.

Please read Chapter 15 of my book to find out why.

The Difference Between a Cynic and a Skeptic

The difference between the cynic and the skeptic is the difference between ethics and epistemology. It is the difference between body and brain.

Skepticism is an epistemological word. Cynicism is ethical.

Epistemology is the branch of philosophy that deals with knowledge.

Ethics is the branch of philosophy that deals with morality.

The Greek word skopein – from which the English word scope derives – means “to observe, aim at, examine.” It is related to the Greek skeptesthai, which means “to look out.” Skepsis and skeptikos are also both Greek and mean “to look; to enquire; to aim.” Those are the etymological roots of the word sceptic.

Sceptic – or, if you’re in the United States, skeptic, the difference being purely one of form and not substance – has its origins in the Ancient Greek thinkers who developed arguments which purport to show that knowledge is either impossible (Academic Scepticism) or that there is never sufficient data to tell if knowledge is possible (Pyrrhonian Scepticism).

Academic Scepticism rejects certainty but accepts degrees of probability. In this sense, Academic Scepticism anticipates elements of present-day quantum theory. The Academic Sceptics rejected certainty on the grounds that our senses (from which all knowledge ultimately derives) are unreliable, and that reason is therefore unreliable as well, since we can find no guaranteed standard by which to gauge whether our convictions are true. This claim rests upon the notion that humans can never know with certainty whether anything is absolutely true or false.

The roots of Academic Scepticism are found in Socrates’ famous apothegm: “All I know is that I know nothing.” The word “Academic” in “Academic Scepticism” refers to Plato’s Academy, third century B.C.

At around this same time, a fellow by the name of Pyrrho of Elis (c.360-275 B.C.), who was connected with the Methodic School of Medicine in Alexandria, founded a school whose philosophy soon came to be known as Pyrrhonian Scepticism. Pyrrho’s followers – most notably a loyal student named Timon (c.315-225 B.C.) – were called Pyrrhonists. None of Pyrrho’s actual writings have survived, and the theoretical formulation of his philosophy comes mainly from a man named Aenesidemus (c.100-40 B.C.).

The essential difference between these two schools of Ancient Greek scepticism is this:

The Pyrrhonists regarded even the claim “I know only that I know nothing” as claiming too much knowledge. There’s even a legend that Pyrrho himself refused to make a definitive judgment of knowledge even if “chariots were about to strike him dead,” and his students purportedly had to rescue him a number of times because he refused to commit to any judgment.

To this day the term Pyrrhonist is synonymous with the term sceptic, which is also synonymous with the term agnostic (a meaning “without”; gnosis meaning “knowledge”).

It’s perhaps worth pointing out as well that the word agnostic in this context was, according to the Oxford English Dictionary, coined by Thomas Henry Huxley, in the spring of 1869, at a party at which there was reportedly “much licking and sucking.” According to R. H. Hutton, who was there: “Huxley took it from St. Paul’s mention of the altar to ‘the Unknown God.’”

In truth, however, the word agnostic was most likely first used by a woman named Isabel Arundell, in a letter to Huxley. Huxley stole it from her and gave her no credit.

The Oxford English Dictionary (Unabridged, 2004) lists four meanings of the term sceptic, which are as follows:

1. one who, like Pyrrho and his followers in Greek antiquity, doubts the possibility of real knowledge of any kind; one who holds that there are no adequate grounds for certainty. Example: “I am apt to think there never yet has really been such a monster in the world as a sceptic” (Tucker, 1768).

2. one who doubts the validity of what claims to be knowledge … popularly, one who maintains a doubting attitude with reference to some particular question or statement; one who is habitually inclined to doubt rather than to believe any assertion or apparent fact that comes before him. Example: “If every sceptic in Theology may teach his follies, there can be no religion” (Samuel Johnson, 1779).

3. one who doubts without absolutely denying the truth of the Christian religion or important parts of it; loosely, an unbeliever in Christianity. Example: “In listening to the arguments of a sceptic, you are breathing a poisonous air” (R.B. Girdlestone, 1863).

4. occasionally, from its etymological sense: a truth seeker; an inquirer who has not yet arrived at definite conclusions. Example: “A sceptic, then, is one who shades his eyes in order to look steadfastly at a thing.” (M.D. Conway, 1870).

The anthropogenic global warming debate has catapulted this latter definition to the forefront, yet many purists, who know the philosophical roots of the word scepticism, are not always comfortable using it in this way — mainly because it’s so at odds with the philosophical meaning of the term. Scepticism has over 2,000 years of heavy philosophical baggage, and to call yourself a sceptic in the philosophical sense entails much more than one “who shades his eyes in order to look steadfastly at a thing.”

Language, however, as everyone knows, is a living, breathing organism which will and properly should evolve, and it would be very bad to say that sceptic in this latter sense is incorrect. And yet there is another word, more precise and less laden: Evidentialism.

True scepticism — which is to say, agnosticism, which is to say, Pyrrhonism — rejects the possibility of all knowledge, and yet it is precisely this that the scientist seeks, and finds: knowledge. What is knowledge?

Knowledge is the apprehension of reality based upon observation and reason; reason is the uniquely human faculty of awareness, the apparatus of identification, differentiation, and incorporation. Knowledge is truth, and truth is the accurate identification of reality. Veritas est adaequatio rei et intellectus. Truth is the equation of thing and intellect.

For example, when the child grasps that 1 unit combined with 2 other units makes a total of 3 units, that child has discovered a truth. She has gained knowledge. The philosophical sceptic rejects this elementary fact.

The philosophical sceptic is defined by three words: “I don’t know.”

The scientific sceptic, on the other hand, is defined by rational inquiry — someone who investigates with a disposition to be persuaded and yet does not (in the words of perhaps the most famous sceptical inquirer of them all) “insensibly twist facts to fit theories, instead of twisting theories to fit facts.”

A cynic, on the other hand, is someone who doesn’t believe goodness is possible.

Cynicism is a moral concept, not an epistemological one.

The word originated with a Greek fellow by the name of Antisthenes (not to be confused with Antihistamines, which are something else), who was once a student of Socrates.

Antisthenes had a notorious contempt for human merit and human pleasure, and that is why to this day the word cynic denotes a sneer.

The cynic rejects goodness; the skeptic rejects knowledge.

Both words, it should be noted, do have one very important thing in common: from a philosophical standpoint, they’re each stupendously incorrect.

This article first appeared, in slightly different form, at Dr. Jennifer Marohasy’s website.

The comments there are well worth reading.

Epistemology: The Science Of Thought

Epistemology is the science of knowledge. The word derives from the Greek episteme, which means knowledge.

Epistemology proper didn’t actually begin until René Descartes (1596-1650), but the stuff of epistemology — logic, reason, deduction, induction, et cetera — has been with us since the Ancient Greeks.

Epistemology is an extraordinarily complicated discipline that starts with three simple words:

Consciousness is awareness.

That is an epistemological axiom which cannot be refuted or denied: anyone who purports to refute the proposition that consciousness is awareness must rely on the awareness of his own consciousness to do the refuting.

First there exists the external world, and then there exists the awareness of it.

These two things are separate, but not equal: by definition, existence comes first, before there can be an awareness of it.

In the words of the philosopher Douglas B. Rasmussen:

“Consciousness is ultimately of or about something other than itself — it is ultimately relational.”

The argument that one cannot prove anything beyond one’s own consciousness was, contrary to what you may have heard, refuted long ago, and thoroughly so, by Thomas Aquinas, when he wrote the following:

“No one perceives that he understands except from this, that he understands something: because he must first know something before he knows that he knows.”

This insight was elaborated upon by the neo-Thomist priest Celestine Bittle, in his 1945 textbook The Whole Man:

“Consciousness,” says Father Bittle, “is irreducible [because] consciousness can’t be reduced to other facts or broken into component parts.”

Father Bittle goes on to describe consciousness as “an ultimate datum of experience … at the very root of all mental activity.”

This is called by neo-Thomists “the reflexive nature of consciousness,” which means that consciousness, by its very definition — by the nature of what it is — cannot be conscious only of itself, since consciousness is awareness, which by definition means that it must first be aware of some thing.

In other words, “I’m only aware of my faculty of awareness” is a meaningless statement.

Why?

Quoting another erudite neo-Thomist epistemologist, Jacques Maritain:

“The first thing thought about is being independent of the mind…. We do not eat what has been eaten; we eat bread. To separate object from thing is to violate the nature of intellect” (Jacques Maritain, The Degrees of Knowledge, 1938).

The ramifications of all this may be summed up thus:

The existence of the external world (i.e. reality) and the awareness of it (which is to say, consciousness) form the very underpinnings of all knowledge.

Whether scientists know it or not, and whether they like it or not, every field of every scientific endeavor, and every part of every field of every scientific endeavor, no matter how postmodernistic the curriculum and no matter how relativistic the agenda, assumes the following:

There exists an external universe, which human consciousness does not in any way create but rather apprehends and measures.

That is the proper starting point of any philosophy of science, as well as the rest of learning.