
Why Chance Depends Upon Continuity

In his later writings, Charles Sanders Peirce often incorporated chance as an essential part of his theory of the universe.  Chance, as the “absence of cause”, he says (mistaking this for Aristotle’s definition), is what allows for the evolutionary development of law and of habit-taking.  This necessity of “spontaneity, chance, play” is what he calls tychism.  I do not see how chance can in any way be primary, except as a consequence of the principle of potency, and specifically, in our experience and experimental/observational capacities, of the potency coextensive with the materiality of something.

Chance is not a something in its own right, but a derivative following upon the variability of material determinations.  I.e., as the potency of matter is determined through this or that act, different chances arise.  Determination of potency is a prerequisite to the occurrence of chance, inasmuch as what we mean by any chance event is not the absence of a causal event, but the interruption of one usual chain of causality, following its predetermined ordination, by the intersection of some usually unrelated chain of causation, following its own predetermined ordination.

This allows both for the variation evident in the evolutionary process–inasmuch as the abnormality occasioned in the intersection need not be an abnormal occurrence, but only abnormal to the object(s) involved–and for “free will”, since the freedom of the will consists in the ability to choose lesser known-goods over higher known-goods.  If by “chance” Peirce intends the inclusion of what is simply non-deterministic and therefore “spontaneous”, this non-determined and spontaneous event occurs in the election of a lesser over a higher; for it is characteristic of the determinate that it always follows from a kind of brute actuality, the kind of brute actuality which follows from feeling (as opposed to “reason” or the species-specifically human semiotic process).

Therefore, chance can be in fact an essential element to the cosmological progress of the universe, but in the same way as relation is equiprimordial with substance, so too chance (as the consequence of matter or potency) is equiprimordial with determinacy (the consequence of form or act).  The degree to which a being is determined in actuality diminishes the degree to which it is open to chance interactions; just as the degree to which a being is in act substantially diminishes the degree to which it can be really related to other beings.  How does this relate to the theory of entropy?  Does the entropic finality preclude further action because of homogeneity of act, or because of homogeneity of potency?  If the preclusion is based in the homogeneity of act, then it is faulty; for pervasive actuality does not prevent further act, it only prevents change in act.  We view change as a mechanism of good, because we become bored with this or that good; this or that good being unfulfilling on account of its finitude.  Would not perfect act be perfect good, and therefore perfectly satisfactory?  If entropy, contrariwise, brings an end to action because nothing is in act enough to change something from a state of potency to one of act, then it coheres with the equiprimordiality of chance and determinacy.

Chance being an equiprimordial albeit dependent element in the evolutionary development introduces a difficulty, however; for inasmuch as everything is undetermined and thereby subject to chance alterations, likewise, therefore, those things contain an unintelligible element, and, through that unintelligible element, the possibility of unpredictable change.  Such changes, once occasioned, can indicate the development of a new nature, as well as alter the proprietal consequences proper to a certain nature; or they may illumine for us the inessential quality of attributes previously considered essential.  The pervasive unintelligible possibilities of change make sure prediction, especially long-term prediction, nearly impossible.

Peirce: hipster before it was cool.

Simultaneously with his theory of tychism, Peirce develops a theory of synechism, that is, that all things are continuous; that neither the objects of experience nor the realities unexperienced are divisible into absolutely separate spheres.  As a part of this synechist theory, he notes that consciousness can be considered as individual, social, or spiritual.  His discussion of consciousness is, however, unhelpful, inasmuch as he approaches it from an aggregationist perspective, thinking about it through the continuity of “moments”: if we are conscious of a before, a now, and an after, we realize that the second becomes the third, following from the first; the afterwards is continuous with the now, adjacent to it.  There is something often Humean about Peirce’s conceptions of the connection of ideas (contiguity, association, “relation” broadened in a Kantian sense).  This Kantian-Humean influence is unfortunate.

Despite its flaws, Peirce’s notion of synechism is of great use in “understanding the riddle”, indeed, even for reconciling tychism into an intelligible framework.  As aforementioned, chance is equiprimordial but dependent; the continuity of the objects of experience, the continuity of the things themselves, allows for a unification of the evolutionary, indeterminate, unpredictable future.  That is, the break with the past introduced by a tychic event is not a true break with things themselves, but only with the objectivizing structure–the structure of consciousness whereby a human being makes of a thing an object of consideration; and since the objectivizing structure is dependent on, derivative from, the things themselves, the continuity can be re-established also in the objectivizing structure, in the objective presence to a knower.

Consequently, it becomes a question as to what we really mean by “chance”.  That is: is chance really, truly, honestly something random, spontaneous?  Or is it not, rather, that it appears to us as random because we do not know the antecedent principles, the antecedent actualities, which have not yet occurred in our experience, such that we could discern the nature of the forms operating as causes?  To admit chance as anything else is to make the universe fundamentally unintelligible; for to be utterly random is to be, in fact, without a cause; which is to say, without a reason, which is, without intelligibility.

Could it, in fact, be the case, as Peirce claims, that the laws which govern the behavior of the universe are developed, the products of chance, the results of tychic development?  Would they not need to occur on the basis of prior, more fundamental laws?  This is the point he seems to miss: in order for anything to move, it needs a relative unmoved.  Change requires the unchanging.  Even in Peirce’s own system, becoming law and taking habit, the second and third to the first of chance, are themselves kinds of law, kinds of regularity, which need to be present for the governance of the irregular.

Chance is not the random; it is the unanticipated.  It is not contrary to the things effected; it is contrary only to our presuppositions about how the things “should” have been effected.


Why Information Differs from Knowledge

You hear it all the time: the internet puts a world of knowledge at your fingertips (and you use it for cat videos and porn).  Parenthetical clause aside, this claim is false.  The internet puts a world of information at your fingertips; but information and knowledge, as we actually use these terms today, do not signify one and the same thing.  Some people consider “knowledge” to be a very highly evolved form of information, or information in action; others hold simply that “knowledge” and “information” are synonymous.  These supposed likenesses derive from a broadly computational theory of mind–that, as Steven Pinker (in)famously put it, “the mind is what the brain does.”

Information is organized data.  When data are collated, sorted, and structured so as to present an intelligible object, they become information.  The light hitting our eyes is a stream of data; when the eyes, nerves, and brain sort it out into what we know, based on previous experience, as a lamp, we have information.  But the knowledge, “This is a lamp”, differs from the information by means of which we make the judgment.  A dog could have nearly the same data presented to it from the lamp, and yet despite this, despite even having a similar informational basis, it will never know a lamp as a lamp; just as the dog and the human, given the same informational basis, will both regard the roast beef as food, but only the human will know it as food.

Is this mere hair-splitting?  The difficulty stems largely from the English word “knowledge”; the origins of the word dwell in the murk of Middle English, but its history intertwines with its usage to translate two Latin words: scientia and cognitio.  The former word, evident in its transliteration to science, has the precise meaning of logical knowledge, knowledge achieved through explicitly conceptual consideration.  Cognitio, in contrast, covers the broad signification of any mental activity whatsoever.

The dog and the human alike cognize–as would any machines capable of emulating the neurological processing characteristic of higher animal life, machines truly capable of “learning”.  We could, therefore, speak of “artificial cognition”.  But the human alone possesses scientia.  The commonality to all beings, whereby information is processed and either evokes a response, or is deliberately responded to, is known as semiosis.  The term derives from the Greek semeion, meaning “sign”.  A sign, to steal a 17th century Portuguese philosopher’s definition, is that whose whole being consists in bringing something else to mind.  Precisely as a sign, this is all a sign does.  The significative function of a stop sign, for instance, consists in the command which it brings to mind; the significative function of the thermometer is to tell us something’s temperature; the word “Jupiter”, though composed of phones and/or letters (whether spoken or written, though the written is notably, in this case, a sign of the spoken), signifies, depending upon context, either the head of the Roman pantheon or the local gas giant planet (or in a roundabout way, perhaps both).

Likewise, the odor of the roast beef signifies to dog and human alike the presence of food.  All information is semiosic; what we receive, as information, serves literally to inform, to give over a definite structure conveying something other than itself.  When we process this information, i.e., interpret it, we arrive back at a consideration of what the information signifies (the cognitive reception of information, incidentally, does not entail an introspective interpretation, as a “looking inside the mind”, but [and this is most important] always has us looking back towards the source of the cognitively-received information).

Information, as the unit of semiosis, forms the fundamental basis of all knowledge.  But the distinguishing characteristic of knowledge is this: grasping the meaning of what the information signifies independently of context or environment.  Effective knowledge is always reincorporated into the context; but the de-contextualization provides crucial insight into the nature of the things considered.

What distinguishes the human being among the animals on earth is quite simple, yet was never fully grasped before modern times had reached the state of Latin times in the age of Galileo.  While every animal of necessity makes use of signs, yet because signs themselves consist in relations, and because every relation, real or unreal, is as relation – as a suprasubjective orientation toward something other than the one oriented, be that “other” purely objective or subjective as well – invisible to sense (and hence can be directly understood in its difference from related objects or things, but can never be directly perceived as such), what distinguishes the human being from the other animals is that only human animals come to realize that there are signs distinct from and superordinate to every particular thing that serves to constitute an individual (including the material structure of an individual sign-vehicle) in its distinctness from its surroundings.
-John Deely, The Semiotic Animal

For further reading about semiotics, I recommend the Routledge Companion to Semiotics, Thomas Sebeok’s Signs: an Introduction to Semiotics, and, why not, John Deely’s Basics of Semiotics; and, of course, for the deeply interested, the entire oeuvre of Charles S. Peirce.


Sex in a Vacuum

Offering thoughts concerning the nature of sex over the internet–other than “do whatever feels good”, “5 tips to make her scream”, “mastering the female orgasm”, or something along such lines–sometimes seems like standing on the street corner and telling people about the Hellfire and Eternal, Everlasting Damnation awaiting them if they fail to change their awful, sinful ways.  Whether or not we have actually ever listened to what things the street preacher might have to say, we are already tired of hearing them–without ever actually having heard them.  Being told what is “right” and what is “wrong” with regards to sexual behavior infringes upon the same supposed personal sovereignty at stake when someone tells us what beliefs we ought to hold concerning matters of religious conviction.  Sex and religion are personal matters of opinion and preference, and nobody’s business but our own–so anyone publicly preaching against what we hold comes across as a crazed, bigoted, ideologically imperialist ass.

So it is at the risk of being categorized with the crazy street preachers that I write and publish this commentary; though my focus falls not upon “right” and “wrong”, but rather “good” and “bad”, giving not a moral commentary based upon belief in revelation, but rather the provision of a philosophical perspective on something often philosophically neglected.

The Principle of Pleasure
“What if We Admitted to Children That Sex Is Primarily About Pleasure?” by Alice Dreger, 16 May 2014

Though over two years old, this article only recently attracted my attention.  The majority of it consists of anecdotes concerning Dreger’s unrelenting directness in discussing sex with her son.  She and her partner are uncompromisingly forthright in answering all of his questions, for the sake of educating him, with a seeming scientific detachment.  It comes as a shock to her when, explaining the meaning of “accidental pregnancy”, she realizes that she had not yet conveyed to her son that sex is most often pursued for the sake of pleasure, and not procreation.

Consequently–given, at least, the limited exposure to the conversation with her son presented in the article–she goes on to muddy the waters: Dreger points out to her son that the pleasurable sensations which accompany sexual behavior are part of an evolutionary drive to procreate, and then posits that the primary reason people pursue sex is not for the sake of procreation, but for pleasure.  In other words, she divorces the conscious reason from the subconscious drive.

That is: procreation is the reason that sex has evolved into a pleasurable activity; but pleasure is the reason that sex is had.  On her analysis, pleasure is (from a biological standpoint) for the sake of procreation.  But most often people have sex for the sake of pleasure and not procreation.  A cultural development arises, artificial birth control, whereby the two can be easily separated.  Culture becomes systematically opposed to nature, to the biological.  The pursuit of sex remains grounded in a biological urge, but its practice becomes culturally detached from the reason for that urge.

In the final paragraph, Dreger writes: “How funny that we can’t bring ourselves to tell our children the most fundamental truth about sex, that most of the time we have sex, we have it for pleasure.”  She adds that the single act of sex responsible for the birth of her son has brought her pleasure for years: a touchingly sentimental note, though one which perhaps unintentionally implies that the sole or primary purpose of children is to provide oneself with pleasure.

A subtle chasm underlies the verbal veneer Dreger lays over her article: ostensibly united, the scientific detachment of an “objective” sexual education and the intimate psychological subjectivization of its purpose in the sensations of pleasure stand schismatically opposed.  Hers is not an uncommon opinion or position–precariously held though it is.  It is a position which attempts to subjugate the so-called “objective” realities of biological constitution to the attainment of pleasure according to the psychological subjectivity of the individual person: a position characteristic of many who have received a higher education; a position which finds its archetypal character in Rodion Romanovitch Raskolnikov.

The Meaninglessness of Sex’s Meaninglessness
“The Meaning of Sex”, by Marty Klein [Commentary on this article is of necessity very selective; the number of disputable claims is an analytical nightmare.  That said, the weakest points–basic faults of logical reasoning–are ignored, and the strongest arguments–weak though they are–are taken up for consideration]

What is the meaning of sex?  Intrinsically, Klein asserts, it has none.  Whatever meaning it has been given stems from cultural or personal imposition; the natural, physical, corporeal action itself maintains no connection to the rationales proffered by speculative thought.  In consequence, the only “meaning” which can be ascribed to sex is the purely contingent evaluation of the act which emerges post facto.

“Most people need sex to have [intrinsic] meaning because the alternative is too frightening: having sex in an existential vacuum. Sex without [intrinsic] meaning would require participants to float freely in sexual experience, rather than being snugly anchored in a cognitive framework, an explanation.”

Fear motivates ascription of an intrinsic and therefore universal, “objective”, rather than emergent, contingent, subjective, meaning to sex.  A “cognitive framework, an explanation” gives a snug anchoring, safety.  Could we not say just the opposite: that fear motivates the banishment of sex to the purely subjective?  That dispensing with all standards for justifying one’s sexual behavior is itself an attempt at justification, motivated by fear of being rejected, stigmatized?  Which is more fearful: the impossibility of being moral, or the possibility of being morally wrong?

The accusation of fearfulness Klein levels at “sexually repressive institutions” rings of dime-store existentialism and the cheap trick of small-minded propagandists: when one cannot argue with the opposition, invent an underlying fear or weakness as their true motivation.  Seeking a “cognitive framework, an explanation” is not cowardice, but the inverse; to shrug and justify behavior “because I like it” is not an act of courage, but is to quit; not to live fully in accord with human inclination, but to repress its most distinctive characteristic, the ability (and desire) to have explanations at all.

Claiming that no justification exists for sexual preference or behavior is claiming that all preference and behavior is (in principle) justified (and only circumstantially to be prevented or altered, such as in the case that one’s inclinations result in harm to another).  Disavowing the possibility of justification itself justifies.  Klein can only attempt to dispense with explanation by means of explanation.

“So is sex meaningless? Yes and no. It is meaningless in the objective or philosophic sense. But, for better or worse, it is meaningful on the personal, experiential level.”

The separation of “objective” and “personal”, of “philosophic” from “experiential”: the chasm hidden beneath the surface of Dreger’s piece, Klein leaves naked.  Positing that something has personal, subjective, experiential meaning and yet no meaning visible to an “objective” or “philosophic” perspective withstands none but the most vapid of analyses.  “Meaning” by its very nature is “objective” (i.e., accessible to intellectual consideration); the domain of meaning belongs to philosophical inquiry.  I may not have the insight to say what meaning someone else has found in sexual behavior, inasmuch as they have not shared it; but I can say that sexual behavior has this or that meaning.

Two threads weave through Klein’s article: one, the presumption that pleasure–reductively ascribed to the corporeal mechanism–is the central aspect of sexuality; two, the claim that the plurality of means whereby people find pleasure in sex disproves the existence of hierarchically “better” and “worse” ways to have sex.  These two threads stem from the common spool of sexual decontextualization.  In other words, sex does not ever occur in an existential vacuum, except in the abstraction of discussion.  To treat sex through such abstraction is not to treat of sex, but to treat, poorly, of one’s abstract conceptualization of sex.

What Is, What Was, What Will Be

Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
-T.S. Eliot, “Burnt Norton”, first of the Four Quartets.

The two articles here considered, Dreger’s and Klein’s, present a common opposition of nature and culture, wherein the goal of culture is to overcome the obstacles nature throws into our pursuit of pleasure.  Martin Heidegger–especially in his later works, such as the 1947 “Letter on ‘Humanism’” and the 1953 “Question Concerning Technology”, though elements appear as early as his 1925 lecture on Plato’s Sophist and again in the 1939 lecture on φύσις (nature) in Aristotle–described this attitude towards nature as “technological thinking”: a kind of thinking concerned not with understanding what things really are, with gaining true knowledge or insight, but with discovering what things are only insofar as they may be used in accord with some predetermined plan of our own devising.

The opposition of nature and culture, and the attempt of culture to dominate nature, as well as the separation of the cultural from the natural as intrinsically opposed forces, are not only epistemically unjustifiable positions, but positions which undermine coherent understanding of the human being.  The end result is one of human fragmentation.

Dreger, admirably though misguidedly, appears to be seeking something like unity.  Her piece desires a coherent account of sex and its place in human life.  The difficulty is that, in saying sex is primarily about pleasure, she makes a teleological (ends-guided/oriented) claim.  To speak of the end of an act, or a kind of act, requires an account of the contextual whole.

Klein, by contrast, removes sex from the integral context of human life and displaces it into a meaningless void.  His pretense at postmodernism ultimately succeeds only in being ultramodernism.

“Let us go then, you and I…”

Is human life itself meaningless–is all meaning invented, fabricated; the product of experience re-shaped into whatever we decide we ought to give it?  Could one aspect–particularly one so integral, central, both biologically and personally as sex–be itself alone meaningless within a meaningful whole?  My answer to these questions is no; I have explained why in other posts (“Why We Struggle with Meaning”, “Why We Know Reality”, “Why Belief Matters”, among others–in a sense, all of my posts strive to answer these questions).

Rather, meaning is discovered through experience and subsequently interpreted.  Those interpretations may be good or bad, accurate or inaccurate; but not “right” or “wrong” as “completely correct” or “completely incorrect”, as a black and white, yes or no, binary system.

We do not, for instance, create pleasure for ourselves.  Rather, pleasure becomes known through the experience of an activity found pleasurable; pleasure is a consequence of action.  The pleasure of sex–at its most intense, a pleasure which submerges our capacity for structured, rational thinking under waves of arousal and satisfaction–is experienced, to be sure; we discover its meaning through rational reflection.  If the sexual behavior we habitually engage in makes us less capable of rational thinking, we should find, upon reflection, that our pleasures are misguided, as we should with any activity which lessens our humanity.

Who and what we are does indeed derive from our experiences; but among those experiences is that of being human, with a human nature; an experience which does not consist in a given moment of given biological limitations, but which permeates the entirety of our existence.  We are bound by an intimate, inescapable connection between who we are, who we were, and who we will be.  If we want to understand sex, therefore, as when we want to understand anything within human life, it must be placed not within the vacuum of abstraction, but within the living context of the whole.

 


Why the Majority of the Educated, Aren’t

…anything that disturbs the habitual somnolence of prevailing opinion is automatically registered as a despicable contradiction.

Martin Heidegger, Letter on “Humanism”

Under the light of the public eye, many well-educated people–professors, graduate students, researchers, Ivy League graduates, etc.–proudly display their pedigree, but behind a thin veil of personal humility.  Demonstrating the merit of one’s education is difficult; subtle name-dropping is easy.  Someone may drop, in casual conversation, the phrase “While I was studying at Harvard…” or “During my internship with [famous researcher]”, wear or use branded paraphernalia (“That’s a nice folio–did you go to Princeton?”), or, depending on the context in which he or she looks to impress another, simply trust in Google; i.e., that LinkedIn, Facebook, or whatever social media profile will take care of the bragging.

Regardless of how: we often want others to know our educational history.  It often serves as a cornerstone of our personal development, and thereby becomes an element in our personal identity.  Curiously, the association often takes precedence over its effect.  That is, a graduate of Yale is more inclined to announce that he is a graduate of Yale than to assert that he learned great things and became a better thinker in college.

This thin veil of personal humility serves two purposes.  First, the audience, expected to know the school, immediately receives a sign of socially-constituted merit.  Second, the speaker can play the game of admitting a personal intellectual weakness, while associating himself with a presumed collective intellectual strength.  To highlight one’s own achievements (directly, at least), is gauche vanity; but intertwining those achievements with a Great Institution or Endeavor elevates oneself in the eyes of society.

To erase your doubt in this, look at any conventionally “well-written” resume or C.V.

The point here is not to bang the drum against so-called elite institutions.  Most would acknowledge that they do not guarantee a quality graduate–despite presuming quality upon meeting one.  Rather, this drum bangs against the belief that a collective intellectual endeavor is necessarily strong.

Anyone who has observed monkeys for long enough knows that the claim that an infinite number of monkeys banging on an infinite number of typewriters for an infinite period of time will eventually produce the works of Shakespeare is bunk (as does anyone who has taken a basic statistics class, for that matter).  Human beings are not monkeys; the supposed necessity of intellectual progress through group or collective endeavor, however, falls into the same category as the non-Bard primates.  The human intellect, though capable of arriving at the truth, has no guarantee of it; an infinite number of scientists and philosophers, thinking for an infinite amount of time, is under no necessity of arriving at the correct understanding of the universe.

The astute student of history–not the historical revisionist–will understand that this lie emerges through a narrow consideration of the so-called scientific revolution, the Renaissance, the Enlightenment, and the rapid progress of discovery and engineering which accelerated the means of living in the twentieth century.  Narrow, I say, because this progress is the progress of practical possibility and not genuine intellectual comprehension.  It is true that a collective intellectual endeavor can (though it does not do so with necessity) enable us to do more; so much more, in fact, that we may annihilate ourselves.

When it comes to understanding, however, collective intellectual endeavors often undermine it.  The phenomenon of the echo-chamber has become commonly known: that groups–today, especially, far-left progressive college groups–exclude all opinions and viewpoints contrary to their own and thereby hear nothing but the echoes of their own thoughts.  But echo-chambers are not always so localized or provincial.  Whole universities, academic fields, and society at large can become echo-chambers.  Subtle societal diffusion of a particular perspective possesses obvious rhetorical power: of course the gender binary is socially-constructed; of course religion is outdated; of course–and so on.  When the view of Thomas Kuhn–who noted that scientific discovery always advances in concert with a “paradigm shift”, the upheaval of one approach or perspective for something new–became common, and scientists began striving to out-do one another with the discovery of new methods, approaches, or interpretations, the paradigm shift itself became a paradigm.

Our thinking gravitates towards patterns.  The tendency is innate and necessary for practical accomplishment; the particular patterns themselves are not.  But when these patterns dominate our theoretical endeavors, theoretical thinking suffocates.  One such pattern is the common striving for “objectivity”: that is, a kind of “God’s eye perspective”, i.e., seeing reality completely independently of our own subjectivizing tendencies.  This pursuit of objectivity, however, stifles truly original thinking:

Language thereby falls into the service of expediting communication along routes where objectification — the uniform accessibility of everything to everyone — branches out and disregards all limits.  In this way language comes under the dictatorship of the public realm, which decides in advance what is intelligible and what must be rejected as unintelligible.

Martin Heidegger, Letter on “Humanism”

By the “dictatorship of the public realm”, Heidegger does not mean the tyranny of the mob; rather, he means the uncritical presumption of truth and meaning.  This uncritical presumption lays down intransigent lines of what is or is not acceptable thinking.  The political chasm between “left” and “right” today serves as an instance; each is regarded by the other as illogical and their positions unintelligible.  Each presumes itself as the unquestioned champion of what is objectively true.

That this occurs at the lowest level of today’s thinking, politics, does not preclude it from the highest.  The justifications for exclusionary thinking on the basis of a supposed objectivity become more sophisticated, but both the base political clamor and the refined scientific, educated, nuanced theories of the supposed intellectual elite fall victim to the same hubris.  This is not to say that we should disbelieve the truth of our own convictions, but rather that we are not, any one of us, or any group of us, the sole possessors of the whole truth.

When it comes to genuine personal humility, the educated ultimately exhibit this hubris with much more violence to dialogue than do the uneducated.  The latter may not listen to others, but are more likely to admit their ignorance.  The educated will “listen”, but through the filters of their own education.  Whatever does not fit through the sieve of their experience, personal and intellectual alike, is disregarded, rather than examined anew.  As Heidegger says elsewhere, we hold ourselves towards beings, towards reality, with a kind of obstinate “in-sistency”: that is, we stand firmly in a particular and narrow interpretation of a complex and multifaceted reality.

Hence: the majority of the educated, are not.  Educate – ex ducere – to lead from.  From what, to what?  From the darkness of ignorance to the light of knowledge.  Anyone who has received an education has experienced this: the dawning of insight, the moment of intuition in which the meaning of a concept is grasped.  But simultaneously, knowledge is not something pure and independent of ourselves; knowledge can only be grasped according to the ability of the one grasping; it must always, in some way, conform to the “shape” of the knower.  The light of that knowledge, if captured and confined, if selectively used to illuminate, does not fulfill its purpose.

The majority of the educated are not because education is a ceaseless endeavor.  When we believe ourselves to have intellectually conquered the reality studied, we have fallen into a new pit of ignorance; we shut off the possibility of further discovery, further knowledge, further truth, further and greater illumination.  The majority of the educated are not because they think themselves to be.

M. Heidegger

With the assistance of logic and ratio often invoked, people come to believe that whatever is not positive is negative and thus that it seeks to degrade reason and therefore deserves to be branded as depravity.  We are so filled with ‘logic’ that anything that disturbs the habitual somnolence of prevailing opinion is automatically registered as a despicable contradiction.  We pitch everything that does not stay close to the familiar and beloved positive into the previously excavated pit of pure negation, which negates everything, ends in nothing, and so consummates nihilism.  Following this logical course we let everything expire in a nihilism we invented for ourselves with the aid of logic.


Why Academia Is Gloomy

A brooding disposition permeates the hallways of academia unlike any other corner of the social world.  Questioning whether this gloom stems from the persons who constitute academia or whether it somehow exudes from the environment itself becomes a classic game of the chicken or the egg.  Likely, it is a mutual, reciprocal relationship: academic work, especially in the humanities, social sciences, and, in my experience, philosophy, draws persons who incline to brooding, a tendency which receives constant nourishment not only from fellow academics, but from the structure of academia itself.

Not every academic, of course, is a dark and dreary personality; there are bright and bubbly, happy, socially-outgoing, friendly persons wielding doctoral degrees with all the charm of an over-the-top anime character.  Brooders–who may not even constitute the majority of academics, but certainly form a disproportionately larger percentage within academia than within the general populace–tend to dislike these cheerful guppies.  Or loathe them.  This distaste is natural: how else would someone respond to seeing another gleefully engage in that which causes oneself deep psychological agony?

Regardless: why?  What is it about academia, the life which unfolds within the halls of universities, in libraries, in offices, in classrooms, that engenders this common dissolute and agonized worldview?

For one, the work of an academic–especially the doctoral dissertation–is a tremendous and solitary undertaking.  Friends, family, colleagues, advisers, directors, and psychiatrists can give emotional, moral, intellectual, and financial support; but no one can step in or pick up the slack on an off day or an off week, especially when it comes to the lifeblood of scholarly work: writing.  No one can step in and write your dissertation, that article, that book, that review.  Someone else may teach your class, but never really teach your course; and certainly not develop it.  Every academic, though he or she may partake in joint efforts from time to time, primarily engages in an essentially solitary labor.

It could not be any other way.  For all momentous intellectual endeavors, while not essentially private (as though done “in one’s head”), require a complex informational confluence; a convergence sui generis.  While every individual human being, academic or not, has such a sui generis experience–every life unfolds in a manner unique and unrepeatable–the intellectual specialist’s experience is constituted by an enormous store of rare, atypical information and the thoughts stemming therefrom.  This makes the intellectual’s pursuits inherently socially alienating.

The value of an academic’s work is evident to the one doing it; but communicating the value of that work to others, to those in the general populace, those in other fields, and even those within one’s own field, often seems nearly impossible.  For philosophers and those working in the social sciences, this can be particularly frustrating, for typically personal advancement and knowledge alone were neither the goal nor the reason for entering into academia.  Effecting change in society was the motivation; now, perched high enough upon the ivory tower to see accurately society’s ills, the only ears upon which our words fall are one another’s.  It takes years of intense study for the doctoral student to gain his or her perspective; how can the discovery of those years be shared with the world?

 To make matters worse, even succeeding as an academic–writing and defending a quality dissertation, being a good professor, teacher, mentor to students–does not guarantee success in the world; sufficient pecuniary remuneration cannot be assured.  Possibly because we annoy the hell out of people by using unnecessarily complex phrases like “pecuniary remuneration”.  Many academics are “failed academics”–not because they failed in academia itself, but because academia failed them.

So it is little wonder that the typical academic broods.

Assuredly, those who make it through, are hired for jobs with sufficient pay, and are able to make some impact on the world through their work, find it ultimately rewarding.

But it is a long, dark, gloomy road.  So be nice to our self-tortured souls, dammit.


Why Belief Matters

Go, go, go, said the bird: human kind
Cannot bear very much reality.
Time past and time future
What might have been and what has been
Point to one end, which is always present.
– T.S. Eliot, Burnt Norton

The word “belief”, when spoken among the supposed intellectual elite of Western society, often carries with it either a tone of condescension–or worse, Fremdschämen.  “Belief”: habitually accepting, trusting or having faith in something; not having proof, facts, data, or experimental verification.  Poor country rubes.  The savages in Brave New World, being ceremoniously beaten bloody.  Tim Tebow.  Rubbing beads and muttering words half-consciously.  Old men in silly hats speaking to imaginary friends and waving around incense.  Eating a cracker and thinking it will provide eternal life.

Ridicule for the ridiculous.  Outdated rituals for superstitious, uneducated, scientifically-illiterate people.

The only reason to have faith–belief–is because you do not have a reasoned explanation.  The business of scientific progress is explanation; so why trust in a faith, why have belief, when we can trust in science?  Wait, “trust” is used in the definition of belief.  Rely on science.  *Looks up “rely”, sees “trust” in the definition.*

Dammit.

The Meaning of Belief

The words used to signify some phenomenon often cloud our understanding with vague and misleading connotations.  For instance, the word “discrimination” now commonly bears the implication of injustice–even though discrimination is simply the act of distinguishing one thing from another and setting out their proper order.  Likewise, the word “belief” carries the connotations of naivety and superstition: children believe–in Santa Claus, the Easter Bunny, the Tooth Fairy, in magic, in monsters.  Belief is for children and ignorant hicks.

But just as someone can discriminate without being unjust, so too someone can believe without being ignorant.  “Belief” does not mean “blind faith”–it means accepting something as true for the sake of guiding our actions.  Even someone with mounds of evidence holds his or her claim as a belief.  That the earth revolves around the sun, that gravity is a force affecting massive bodies across distances according to an inverse square law, that water is wet, that Jesus is God, that outer space is cold, that Miller Lite is less disgusting than Bud Light–any claim which can be formed into a coherent propositional form is a possible belief.  Some beliefs are better established and less contingent upon subjective dispositions than others: for instance, the revolution of the earth relative to the sun in contrast to one’s preference when it comes to bad beer.  But both fall under the category of belief.

“belief consists mainly in being deliberately prepared to adopt the formula believed in as the guide to action.” – C.S. Peirce, “The Maxim of Pragmatism”

For the opposite of belief is not knowledge, but doubt.  Doubt is the dissatisfaction with our current understanding of something which drives us to seek the truth.  Perpetually unresolved doubt, therefore, would not serve even its own purpose; indeed, doubt desires belief.

Consequently, when we have belief–true, earnest belief, conviction–this belief determines and guides our higher intentions: i.e., our reasoned goals.  A man believes he loves a woman–that they, to one another, are complementary human beings who through their actions provide great benefit, that they make one another better, that they possess and can develop an intimacy which will aid one another in attaining a true and therefore lasting happiness–and so he intends a lifelong, exclusive commitment to her.  This commitment is his goal.

The process whereby an individual’s beliefs come to be determined in the first place is undoubtedly a complex one, from the psychological standpoint.  Human lives are fraught with possible influences which could have helped to shape the holding of this belief rather than that one.  Historical analysis of one’s own formation may be of introspective, private, personal interest; in the rare case, it will be of interest to others, and we hope those people write autobiographies.  What is universally interesting, however, is the question of why we hold to any given belief at any particular moment in time–especially beliefs which may be challenged.

The chief culprit in holding a belief, particularly a deeply unreasoned belief, is none other than comfort.  When a belief is shaken, it incites doubt.  Doubt causes mental agitation and therefore discomfort.  If a doubt possesses us, we constantly dwell on it, think about it, pursue its resolution.  The more important the subject of our doubt, the more intensely our duties, pleasures, and appetites are overshadowed.  Every conversion story you might find–from, say, Muslim to Atheist, Republican to Independent, Anglican to Catholic, Psychology major to Philosophy, Atheist to Christian–is a story of doubt seeking belief, and a story of discomfort.

Holding on to our beliefs is a way of holding on to our comfort.  When a belief becomes societally unpopular, holding on to that belief is often less comfortable–hence the declining Christian population–than capitulating to something deemed more societally-acceptable–like declaring that you “Fucking love science” and retweeting memes with stick figures holding beakers.  The tenets of the Russian Orthodox Church might be a comfortable fit for someone’s identity; as might orthodox Roman Catholicism; or sex-positive feminist atheism.  We may find contentment not only in holding to certain ideas (dogmatically), but in identifying ourselves, at least in part, with those ideas.

But belief extends farther than self-identification and adherence to ideas; for by adhering to ideas, we determine the course of our actions.

Reasoned Belief and Irrational Desire

At times, our higher intentions come into conflict, either with one another, or with our lower ones.  In the former case, doubt is occasioned, requiring further reasoned inquiry for resolution.  In the latter case, a different kind of struggle arises, one for which the exercise of the will is necessary.  Perhaps the man, despite loving his wife, finds himself compelled by sexual attraction towards another available woman.  Insofar as his belief concerning love commits him to sexual exclusivity, two different forces pull on him.  One is the force of belief, a product of reason and mental conviction; the other is the force of unreasoned desire, conjured not by structured thought, but by chemical reaction.

Sex, the marketing departments tell us, sells.  What the marketing departments do not tell us, because they either do not care or do not know, is that sex has the potential to sell, in every generation and to every demographic, because that chemical reaction is a universal trait.  Appealing to a person’s reason is difficult, but appealing to a person’s unreasoned desires is easy.  We can see this in the way we speak: we “give in” to our desires, give way, surrender, indulge, and so on.

From time to time in human history, the capitulation to desire has found public endorsement, and not simply from ad agencies.  A little healthy catharsis, a bit of release, letting yourself be carried away, swept along: relaxing and going along with the stream, or torrent, of desire.  Images are conjured of women craning their necks back in ecstasy, men gripping buttocks tightly; passionate moans and gasps and oh my.  Sometimes this endorsement remains rather hush-hush.  Other times it earns a wink and grin.  Still other times it becomes almost tired and obligatory: yes, do what pleases you, but just… not so loudly.

Regardless, it becomes a belief that submission to irrational desire has a rightful place in our daily (or, perhaps weekly, monthly) lives.  The idea is that it is reasonable to at times be unreasonable.

The paradoxical allure of the reasonable unreasoning crumbles once examined, however.  It is, indeed, reasonable to satisfy the unreasoning parts of our human nature, but only, it turns out, if done in what is itself a reasoned manner.  The man who gives in to his unreasoned, irrational desire and cheats on his wife, contrary to his beliefs concerning love, does so unreasonably.  The woman who goes off her diet and eats a half-pound burger with bacon and an extra large order of fries, because the occasional pleasure of unhealthsome but tasty food helps her to continue eating healthy, does so reasonably.

In other words, we can have rational desires; while the desire itself does not have reason, it can be reasoned, controlled by reason.  Contrariwise, we can have irrational beliefs: irrational either because they are held without reasoning, or because what is held is contrary to reason itself; or, worst of all, both.

Most beliefs are not subjected to much reasoning, for the simple fact that we do not find occasion to doubt them.  Many of us have become adept at actively avoiding any such occasion.  Further, most of us are quite good at clinging tenaciously to our beliefs when we do stumble across such an occasion.

Suspicion of the irrationality of a belief, with which we identify ourselves and determine our course of actions, frightens us to the bone.  Why is it so terrifying?  Because we have been inculcated with a spirit of intellectual cowardice.

Correcting Flawed Belief

When we are sick with a vice, the cure is to overcorrect intentionally towards the opposite extreme.  Our unconscious behavior will be to move towards the vice; intentional overcorrection will balance out this unthinking tendency.  To attain intellectual courage, then, we need an almost reckless attitude of doubt.  We need to discern and acknowledge the unchallenged, and perhaps dubious, claims we have accepted on authority, and question whether or not they are truly reasonable.

That might seem an overwhelming proposition.  But examining our beliefs does not require a systematic inquiry into every last little issue, so much as the development of a critical disposition.

To take an example: the dogma that the global warming of the earth over the past 150 years has been primarily caused by human beings.  There is undoubtedly scientific consensus concerning this claim.  But two pertinent doubts can be raised.  First, is the consensus itself a legitimate cause for belief–i.e., are the climate scientists in agreement because the evidence demands it, or because it has become the dogmatic opinion?  Having been on the unorthodox end of more than one opinion in academia–and knowing others who have been heterodox dissenters–I am not surprised any time a “conform or shut up” attitude is encountered.  Should we believe climate science alone is immune to the problems of flawed peer review, failed replication, and sloppy design?

Second, does this increase in temperature, even if caused by human beings, necessitate a dramatic and immediate change in human behavior?  In other words, the implication typically drawn from climate change is that we must cease use of fossil fuels posthaste, and live greener, more ecologically-friendly lives.

Asking these two questions does not require that one systematically investigate the issue at hand.  Most of us do not have the resources, including time, to investigate the 12,000 papers on climate change published between 1991 and 2011 that are typically referenced in the consensus reports.  But, at the same time, we can reasonably reserve judgment about the veracity of their claims while acknowledging a fundamental truth behind them.  Living in a way more harmonious with the ecological balance of the planet is a good, intelligent, responsible thing to do.  Everyone ought to be conscientious about the use of resources, particularly non-sustainable ones.  This belief should accordingly shape our actions while we wait for resolution on the theoretical, scientific inquiries.

Sixty years from now, everyone could be clamoring against the use of electric cars, when new research suggests that they cause cancer.  The most important lesson to be learned from the science of global climate change is one of patience: let us not do things because we can, but because we should.

Moreover, and more importantly, doubting the dogmatic proclamations about how we ought to behave (which, notably, seldom come from climate scientists themselves but rather from politicians) is a reasonable posture to adopt–not as, say, part of the Trump-supporting cultural devolution that, in its rabid distrust of experts, distrusts any intellectual endeavor, but as part of a critical disposition.  It is the posture of someone comfortable with doubting a popular opinion.

We face greater difficulty in entertaining doubts about more intimate and personal beliefs.  True passion about climate change is rare.  Passionate concern for transgender acceptance, or gay marriage, or anti-abortion legislation, or Christian belief–these are more common.  Violent repudiation likely meets the suggestion that such beliefs might be, to any degree, mistaken; or the words “agree to disagree” are proffered for the sake of avoidance.  Something like one’s beliefs concerning climate change has relatively little impact on most of our lives: it might impact which car you buy, or what kind of lightbulb you use, but not too much else.  Our political, moral, and theological beliefs, in contrast, determine the greater course of our actions.  As our beliefs in these realms change, so too change the natures of our friendships and romantic relationships, oftentimes our careers and our education, our sense of purpose in the work that we pursue.

“The person who confesses that there is such a thing as truth, which is distinguished from falsehood simply by this, that if acted on it will carry us to the point we aim at and not astray, and then, though convinced of this, dares not know the truth and seeks to avoid it, is in a sorry state of mind indeed.” – C.S. Peirce, “The Fixation of Belief”

Ultimate Belief

This is why the question of “God” matters.  Nothing is more fundamental to the orientation of a human life.  Pursue your questions about the meaning of life, its purpose, and how we ought to live to their final end, and the result is always one of two things.  Either meaning depends entirely upon the will of human beings, purpose is constituted arbitrarily through our determination, and the only restraints on how we should live are the equally arbitrary decisions of other human beings, i.e., the restriction imposed through a social contract.  Alternatively, the meaning of reality is at least partially, and in some sense fundamentally, independent of our wills, purpose exists in a metaphysically-constituted framework outside of our determination, and this framework provides for us the righteous direction of life.  In the former case, we arrive at nihilism; in the latter, theism.

To say that nihilism is a belief in nothing, therefore, means that action has no purpose.  This necessarily follows from the belief that meaning is entirely arbitrary or solely imposed from human volition; such an imposition is ex nihilo, from nothing.  It has no basis.  It has no foundation.  Therefore, it has no orientation, no end.  To claim that it makes oneself happy to do so is self-contradictory: for in that case, happiness is the basis of one’s action.  The meaning of happiness cannot be self-generated, for then there is an infinite regress of its origination.

Thus, the nihilistic attitude, because it is an ultimate belief in nothing, provides no determination or guidance for our actions.  Most professed nihilists do not truly live nihilistically: instead, they live semi-hedonistically or egotistically.  Truly believing in nothing is an abyss into which few fall.  It seems likely that most who do would become suicidal (I think, in this instance, of Jacques and Raissa Maritain).  If there is no purpose to living, if all of life’s pleasures and actions ultimately amount to nothing of lasting significance and everything simply passes away, why not pass away sooner rather than later?  Why allow it to happen in some uncontrolled, possibly painful way, when you can take it into your own hands?

Fortunately, nihilism is epistemically unsound.  A cursory examination of our less-than-ultimate beliefs will show us that they typically strive for acting in accord with the way in which things are independently of ourselves.  The meaning we attribute to these things is not something given wholesale from our volition, but contingent upon our understanding of those things.  Believing that water will quench his thirst, a man drinks.  Believing that reading will help her to understand, a woman picks up a book.  Believing that the blue light from lit screens interferes with the production of melatonin, someone turns off the TV two hours before bed.

Our sound beliefs derive from realities independent of our wills.  What beliefs, then, do we hold about those independent realities themselves?  Where do those realities come from?  Why are they there; why are we here?  Are there no realities other than those we can perceive?  Or are those the only realities that we can know positively–in other words, can we positively exclude the possibility of imperceptible realities?

No one who has really pursued these questions, it seems, can reasonably be an atheist (without, at least, being a nihilist); whether God exists is a question that cannot be definitively answered in the negative.  The atheist position (presuming it as materialist) contradicts itself, for it claims that we can assert as impossible something which we have no means to examine.  It is egotistically anthropocentric in the extreme (or, perhaps, anthroponomic).  Agnosticism is a more reasonable attitude, for it at least acknowledges and abides by the self-imposed limits of a materialistic perspective.

But again, the strict materialist perspective is epistemically unsound.  Arguing for this in the simplest, most direct way: the universe is perfused with relations, relations of which we are aware, and these relations, though dependent upon material realities, are in themselves in no way confined by time or space.  Your father is your father no matter how far away he is or how long it has been since you have seen or spoken with him.  He became your father the instant you were conceived.  Despite their light just now reaching us, many of the stars we see at night no longer exist.  That these statements are true–i.e., that the meaning they signify corresponds to reality–is a relation which cannot be discerned in any material reality.

In short, were we to sincerely confine our knowledge of reality to the materially perceptible, we would effectively eliminate all of our knowledge; for imperceptible relations are necessary constituents of every bit of knowledge we have, have ever had, or ever will have.

This alone does not prove that God exists; but it does seem to suggest, at the very least, that realities other than those we can perceive are possible–that we are even reliant on some such realities in our day-to-day living.  Further argument can be given that we grasp imperceptible realities other than relations, as well; following from this, that we ourselves in some way include an imperceptible aspect in our constitutions; and that all of this leads us towards a being which is neither dependent, contingent, nor magical and ethereal.  In other words, this track of thinking leads towards belief in God.

In the meantime, however, the point to be made is this: everyone has beliefs.  Our beliefs determine the course of our actions, including our further beliefs.  Doubt is the irritating gadfly of our minds, driving us to seek solid, reasoned belief.  Without belief in some supreme principle–some ultimate cause which determines the meaning and purpose of reality–we will inevitably end in nihilism.

In other words… belief matters.



Why Academia Should Be Burnt to the Ground

The students glanced at one another nervously.  The first day of class is always awkward.  Some of them knew one another, some were new.  Pencils restlessly tapped against legs, notebooks.  Eyes darted to the door each time it opened.  Eventually, a man, somewhat tired looking, his beard a trifle unkempt, his clothes fitting, but somewhat outdated, shambled into the room, a stack of books in his arm, a briefcase dangling from one hand, two folders stuffed with papers–syllabi, handouts, a guide for writing good papers.

He huffed his pile of belongings onto the oblong, cherry-wood table, breathed heavily, looking around at the students.  “Good afternoon.  No time to waste: I’m Professor Krampus, and this is the Philosophy of Knowledge.”  His name was scrawled, nearly-legibly, on the chalkboard.  The syllabus circulated.  Handouts, the writing guide.  Students paid attention as the important points were covered.  Questions were asked about the reading–“Should we read any secondary literature on Kant’s Critique, or focus on the primary text?”–and answers given.

The professor sat at the head of the table, and led a thoughtful opening inquiry into the question of knowledge.  He ended with a discourse on the difficulty, and the importance, of the question, looking thoughtfully out the window, his own undergraduate class resonating subtly.  It was a beautiful day; a beautiful campus, cared-for but not manicured.  The buildings were a bit old, but had style.

Professor Krampus returned to his comfortable office, read his emails (an on-going departmental thread on the purpose of aesthetics, two from students in another class, one asking about something in Aristotle, the other, Plato) and got a cup of coffee, opening the docx file containing his manuscript.  He plugged away happily for a few hours.

Later still, he went home to his reasonably sized and furnished apartment, had a decent dinner, answered his emails (including a quip about Salvador Dali he thought was particularly clever), watched a bit of TV, and fell asleep at quarter to 1.

It is a lovely dream–not so much for what it contains, but for the unpleasant realities which it lacks.  There are no administrators or bureaucratic emails.  There is no shared office.  There is time to work on real philosophical thought.  The students are eager, willing, interested, invested.  No unnecessary tasks or distractions.

But it is only a dream, a sweet indulgence tasted between the bitter soul-crushing hours spent filling out Outcome Assessment Forms, attending faculty meetings and HR seminars, compressing, cutting–butchering–lectures to fit within a shortened semester schedule, listlessly writing cover letters and personal teaching statements, hope diminishing with each keystroke that a more satisfying job might be found.