Why Academic Hiring Sucks Fat Ass

There is a terrible amount of caution in the hiring of academics, I think; and I really mean terrible.  Having now spent the past several weeks applying for non-academic jobs, where I hear back quickly with either a request for an interview or a rejection–or, lacking a quick response, can safely assume a rejection–I see with some greater clarity just how stupid the academic application and hiring process is.

I get that you don’t want to give a tenure-track job to someone who is a bad fit.  Okay.  So maybe, instead of having super shitty, contingent, one-year positions and then mildly shitty, bust-your-ass-doing-every-possible-extracurricular-service-you-can-think-of tenure-track appointments, we find a nice middle ground: two- or three-year contracts with the possibility of acceleration into the tenure track based upon internal department reviews.  Relatively low-risk for the university and the department, good experience for young academics, with sufficient reward (financial and CV-building) to do something like move across the country.  Let’s worry about whether or not they will be good, long-term contributors to your program after they get a trial; you’ve got a better chance at medium-term weather forecasting than you do at long-term academic prediction, given that in the latter you have almost nothing to go on.

Also, let’s cut back on the hyper-specificity of our application requirements, shall we?  Cover letters should be about the academic, not about how the academic will fit the mold you have created–since, oftentimes, that mold will in reality fit only one or two people, you can bet your sweet sorry asses that 99% of people applying for the job are lying through their teeth to make it sound like they are in fact a good fit.  In the same vein, can we forget about the asinine and hyper-specific AOS requirements, particularly in philosophy?  The problem with our educational system is not that we lack philosophers who specialize in environmental ethics and also post-colonial feminism; philosophy by its nature is general, and philosophers ought to be generalists, of a sort.  They’re the ones good at educating students to think.

Pretty sure we can throw out requirements for student evaluations, as well.

I know the root of the problem right now is not the departments themselves, but the administrations; and I know the administrative bloat is killing almost every school’s budget for new hires–but the drawn-out, painful, agonizing process of hiring probably is not helping individual departments’ positions within universities.  Maybe I’m just bitter because I’m fairly certain that if a department just gave me a chance, I could and would prove myself worthy.

Anywho… back to grinding away on a book no one will ever read, that probably won’t get me a job anywhere.


The Social Construction Gap

If you are reading this, it’s fair to assume you are familiar with the term “social construct”.  You are, after all, alive post-2016 and on the internet (unless this has been compiled into Krampus’ Greatest Hits: A Book of Wonderful Sagacity and published to worldwide acclaim, which I’m sure will happen any day now).  Though society has evidently been constructing things for as long as society has existed, the term “social construct” did not come into vogue until the last half of the 20th century.  I won’t bother with the history of it; rather, I just want to point out that, as a formal concept, it is rather young.  It should therefore be little surprise that the meaning of “social construct” is pretty poorly understood–especially since the concept is intrinsically incoherent.

At the heart of this poor understanding is a radical divide: that is, a divide between nature and culture, or between biology and society, if you prefer.  The former pairing–nature (a broader term) and biology (a narrower one)–is recognized as important and, in a certain sense, “infallible”, but also considered stupid and clumsy.  That is, biology is what it is, and actively seeking to change it carries a taboo (and for good reason–but more on that some other time).  Meanwhile, society appears sophisticated, “rational”, but often cruel, oppressive, and evil.  Biology is often believed to be an unconscious influence upon, or even a determinant of, the structuring of society–sometimes to the point that it is held responsible for the evil and cruelty of those social constructs.  At other times, society is believed to have “corrected for”–or at least to be capable of correcting–the “mistakes” or “clumsiness” of the biological.

Among the oddities surrounding this divide is that it appeared before anyone successfully made sense of it, and, in point of fact, no one really has made sense of it yet.  Some intellectuals noticed that nature and biology form one kind of thing, and culture and society (or perhaps, even more ambiguously, “psychology”) form another, and the two were pushed apart until we could figure out what to do with them.  We still haven’t figured much out, however, and I’m afraid we’re actually getting farther away from figuring it out, too: mostly because the two should not have been pushed apart in the first place, and now people are carrying on as though this is the normal, proper state of things.

The idea of the “social construct”, as commonly understood today, has followed from this forced separation of nature and culture: that is, social constructs are understood to be concepts or institutions (i.e., patterns of practice and organization) which exist solely on the basis of human mental activity and are formed discursively through social interactions.  Often, these constructs are imbued with moral normativity.  In other words, the construct establishes a standard or acceptable set of behaviors; anything outside of them is considered bad.  This seems quite sensible, given the premises–after all, biology and nature present no moral codes, but we have such codes, and if they don’t come from nature, then they must come from society.

Practices such as marriage and its traditionally attendant attributes–monogamous, heterosexual, oriented toward the bearing and rearing of children–are often considered examples of social constructs, and ones having morally normative weight.  Within that cluster, we can focus on monogamy, a normative concept currently under some scrutiny and facing opposition from certain quarters, which will serve throughout this essay as our particular example.

Disclaimers

First, I’ll be quite honest: I am a man and I am wholeheartedly pro-monogamy.  It is a practice I consider a part of the highest ideal of a loving relationship.  Faced with the recent opposition to monogamy, I do feel threatened–as would anyone, I think, who finds his or her ideal being argued against; it is something in which I believe, and in which I would like others to believe.

Second, the impetus for this article is the work of Carrie Jenkins.  In particular, I have in mind her article “Modal Monogamy” (I know she has written a lengthier book, and I intend to address it more fully, and thus more fairly, at some point).  Carrie is a clear, persuasive writer.  I follow her on Twitter, and from what I’ve seen, I quite like her candor, her thoughts on mental health in academia, and her thoughts on academia in general.  But I think she’s wrong in her work, and I think it a fruitful exercise (if nothing else) for me to explain why.  For starters, she tries to bridge the gap between the biological and the social–but as we’ll see, that gap is an abyss, and no bridge is strong enough to span it.

The arbitrary hypothesis

Typically attending the claim that a traditional or conventional position of moral normativity is nothing other than the product of social construction is the implication that this normative construction is oppressive, and therefore bad, and, since it is moreover arbitrary, legitimate to supplant.  The first claim, that the norm oppresses, requires a view of human nature centered on negative freedom: namely, that radical autonomous independence from external constraints forms the core of what makes someone human, and any activity contrary to this–except in service of the preservation of another’s autonomy–constitutes oppression.  Good actions are actions that promote the ability of individuals to exercise their autonomy; oppressive actions are, by contrast, evil.  Because all but the most basic of norms entail some degree or another of constraint, sooner or later they all come to be seen as oppressive.  I will challenge the validity of the oppression claim in the next section.

In the meanwhile, I want to note an inconsistency between the first claim, that normative constructions are oppressive, and the second, that they are arbitrary.  That is, if a social construct is entirely arbitrary, it seems absurd to claim that it is oppressive.  If whatever is being oppressed has a claim not to be oppressed, then the social construct must be something bad, which is to say that in whatever realm the social construct is operating, it cannot truly be arbitrary, because it is itself in conflict with something normative–unless, of course, the claim not to be oppressed is itself equally arbitrary.

[Image: a Bosch painting.  Caption: “Shit gets real weird.”]

In order for something to be arbitrary, or to be performed arbitrarily, it must have no inherent relation to anything else.  That something could exist in an arbitrary fashion is a very dubious proposition–for everything, insofar as it is, seems also to be in relation to other things.  For an action to be performed in an arbitrary fashion, the act must be severed from its ordinary context; for actions, too, insofar as they exist, exist in relation.  But human beings do have the ability, within extreme limitations, to artificially impose barriers between an act and the things to which it relates: as when a dictator arbitrarily decides to burn down the impoverished parts of his city to expand the beautiful and wealthy ones–this is not entirely arbitrary, but is arbitrary in the sense that it severs its relations to the proper ordination of ruling.

So let us take the supposed social construct with normative weight highlighted earlier, monogamy, and consider what it would mean to say that this is arbitrary: that, against the proper ordination of sexual relationships, some group or force–likely the ubiquitous bogeyman, patriarchy–has imposed on the social order this rule or law.  One could argue that this imposition stems ultimately from a biological male drive to protect its progeny, or from a kind of “selfish instinct”.  Conversely, one could argue–as some have–that females possess a biological drive to polyamory, to increase their chances of reproduction and to always choose the best mate possible.  [Point of fact: anisogamous reproduction has typically fostered, in mammalian species, male promiscuity and female selectivity.]

That the terms “biological drive” and “instinct” are virtual nonsense–to the point that even when clearly defined, which is rare, they possess no actual significance to real forces in nature–seems not to bother many who employ them.  Rather, they serve as “explanatory principles”, said in the pejorative sense, by which I mean unknowns presumed as unquestionable “first principles” invoked to bring an end to a discussion.

At any rate, the supposed arbitrariness of the monogamous norm is based on the premise that the male biological urge is, for no good reason, given formalized preference over the female.  Yet if we turned this around the other way, it would be equally arbitrary: if an arbitrary patriarchal imposition is oppressive to the biological impulse of the female, the converse “matriarchal” imposition would be oppressive to the biological impulse of the male.

Of course, those who are arguing against monogamy are not actually arguing for anything matriarchal–polyamorists are not specifically interested, as near as I can tell, in procreating with a multitude of partners (and it would not be great for the gene pool, in the long run, if they did).  The real interest is in sex with a multitude of partners, and sex which is severed from any reproductive associations–which desire seems not to be related to this supposed biological impulse, which, in evolutionary terms, serves the continuation of the species.

This is a common sleight of hand for social constructionists: claim that what is socially constructed is arbitrary and oppressive, then point to some biological basis as justification for an alternative social construction–one which turns out to be equally arbitrary and oppressive, and whose biological basis has no actual connection to the newly-proposed paradigm.

Foundations and morality

So what is a social construct?  Let’s start with what it isn’t: an arbitrary concept or institution which a group of old white men sat around a table and developed as a tool for oppressing women and non-whites.  Nor is it a habituated thought or idea which human brains have co-incidentally evolved to create for themselves to make sense of otherwise irrational experience–the “survival mechanism” thesis (in fact, I would say very few human concepts are of this sort; it would be extremely inefficient, evolutionarily speaking–nor is efficiency the sole criterion of “good” or “better”, in this case).

Okay: so why is it a social construct?  That is, what exactly is being “built” or “constructed”?  The immediate implication in the term is that the result is something artificial: something that would not, on its own, come to be in the way that it is; intervention is required, and specifically intervention which reorganizes things other than how they would normally be.  Implicit is a notion of violence–not physical damage, necessarily, but a forcing of things against their nature.  A skyscraper does not organically grow; its materials must be forced, re-shaped, welded, altered again and again, and balanced against one another, in order to stand: this process is construction.

Alright: so why is it a social construct?  In order for something to be social, it requires the co-operative interaction of a plurality of individuals.  Even war requires some co-operation (slaughter is not a social activity), albeit a very hostile kind.  The plurality has to be engaged in something common.

Well then: what is the social construct?  An artificial concept or institution co-operatively forced into existence by a plurality of individuals.  We can only accept the existence of social constructs if we already accept that society exists as something separate from biology; that is, social constructs appear valid only if we allow as normal the divide between nature and culture.  Why should this be the case?  Why should the separation of the two be the default position?  As aforementioned, the two produce different results–but here’s the counter-hypothesis: society is natural.

That is, society may deviate from nature, it may oppose nature, or it may cohere with nature, enhance nature; but its root is nature itself.  Human beings are naturally social–it is an inexorable ordination of our biological, physical, corporeal being that we exist in relation to other humans.  That things produced socially would all be constructs and therefore artificial presupposes that there are no natural ways of conceptual or institutional social production.

For this reason, I prefer the term “social constitution”, as “constitution” is a broader term than “construction”–it can mean either natural or artificial.  Moreover, social constitution, as a process of developing concepts or institutions within society, has a continuity with the individual process of ideation whereby we develop our individual conceptual frameworks, the key difference being the making-public of these concepts by species-specifically human linguistic communication; but I think, there, I am wandering a bit outside of the pre-defined boundaries of this essay.

What follows from the separation of biology and society, and the relegation of all norms to social constructs, is that morality falls into the gap–or, as it really appears here, the abyss.  If moral norms are entirely the product of human intervention, then they really are arbitrary, whether they are based on biological facts or not–after all, the biological facts instill no preference in us, except, perhaps, to favor the ones that favor us.  If, on the other hand, moral norms at the very least can revolve around an understanding of the natural, then we can find a non-arbitrary basis for judging our practices right or wrong.  That is, normativity cannot reduce to the strictly biological, but it can develop in coherence with the natural.

Love and monogamy

In her book, What Love Is and What It Could Be, Jenkins defines “romantic love” as “ancient biological machinery embodying a modern social role” (p.82)–and here it is, specifically, that I think she tries to bridge the abysmal gap between the mistakenly-separated nature and culture–suggesting that we have an evolutionarily-instigated impulse to pursue sexual intimacy (as a reward system to promote procreation) which has been transmogrified through societal structures into something else.  It is this “something else” which is at issue, being a supposed social construct.

In her article, “Modal Monogamy”, Jenkins argues against the concept that “dyadic exclusivity” (i.e., two and only two people exclusively involved with one another) is essential to a romantic love relationship (“modal monogamy”–that the only possible love relationship is metaphysically constituted by such a dyadic exclusive romantic love).  Central to her thesis is the concept of “romantic love” as “(at least in part) a socially constructed kind”.

In short: Jenkins’ argument against normative monogamy depends upon her concept of romantic love, and her concept of romantic love depends upon the divide between society/culture and the biological/natural.  Consequently, given my position that there exists no such divide (except by an artificial separation of the kind the scholastics called a distinctio rationis, subsequently and mistakenly presumed to be a distinctio realis), I have to oppose her definition of “romantic love.”

To describe this phenomenon, with which we are all familiar as a feeling, as a social construct fulfilling a primordial biological impulse is, to be frank, silly.  It rests upon some notion of the “natural human”, a pre-societal, pre-cultural being of which we have nothing but the scarcest knowledge (such a being is, by its very definition, pre-historical, and so we have only forensic and no semantic indications of it), as well as upon a notion of culture and society as things artificial and wholly separate from whatever is natural.

Rather than try to deconstruct the feeling of romantic love into separate and mechanistic biological and societal causes, let’s examine the phenomenon itself: that is, what is the feeling itself?  Succinctly described, all “love” has at its core a desire for union with the beloved.  We are pained when that desire is frustrated and pleased when it is satisfied (unless, of course, the expectation was exaggerated beyond the reality, in which case we are disappointed).  This is as true of the love we have for a piece of pecan pie (attended by a scoop of vanilla ice cream, of course) as it is of the love for a pet, or for a human being.

Obviously, however, the kind of union differs for each object.  I do not want to eat my pet, and I do not want to have sex with pecan pie (and anyone who does is not really seeking sex, but masturbation, which is a different thing altogether).  Identifying the kind of union that we want in romantic love will go a long way to helping us understand what romantic love really is–and why and how monogamy is a legitimate moral norm.

[Incidentally, there is a weird inversion which occurs in Jenkins’ “Modal Monogamy” article, in which she points out–correctly–that many people base their modal monogamous belief on (or conflate it with) a moral monogamous belief.  It seems quite clear to me that whatever one’s moral position on monogamy, it ought to stem from its ontological status, not vice versa.  But I’ll spare you all the digression into an Aristotelian/Thomistic investigation of act and potency… sorta…]

Two kinds of union are patently desired in romantic love: sexual and emotional.  To some extent, the emotional often seems more important: sexual union tends to result in relatively short-lived pleasure, whereas the emotional pleasure of an intimate relationship tends to last much longer, and to be more pervasive in one’s life.  But there is also a third kind of union, which we tend to do a poor job of recognizing, which is union in thought.  It is uncomfortable when we discover that our romantic partners–intended or actual–have beliefs which oppose our own, or want things in life which we do not.  While we do not want to be of literally one mind, we do want to be headed in the same direction.  Our beliefs and our desires orient the trajectory of our lives (to at least some extent), and having different trajectories means that, sooner or later, the union will dissolve.  A failure to attain union in thought results quite often in a separation at both the emotional and the physical levels.

None of this, of course, proves that monogamy is a metaphysical necessity to the feelings of romantic love.  It is unquestionable that someone can simultaneously desire unity with a plurality of individuals at all three levels.  What is questionable, however, is whether someone can successfully accomplish such unity with a plurality of individuals.

I think the answer is “no”.  You can certainly have strong emotional attachments, sexual attraction, and even conceptual unity, with a plurality of individuals.  But the conceptual unity has to be broad, and therefore is generic; the emotional and the sexual unity, if with a plurality of individuals, is inescapably fragmentary.  That is, if I am sexually and emotionally involved with more than one person, none of those people are receiving the fullness of my sexual or emotional self.  The unity is only partial, and being incomplete is metaphysically inferior to being complete.

In other words, to take a phrase from Karol Wojtyla, I think that romantic love requires a “total gift of self.”  If someone I love is having or acting on desires for someone other than myself, then a part of her is not being given, and the union is incomplete; and vice versa.

So while it is absolutely true that there is no metaphysical necessity behind monogamy, and that the normativity of monogamy is a socially-constituted institution, it deserves to be a norm because it directs us towards a better fulfillment of our natural desire for complete union.  And before anyone objects, “Ah, but my desires cannot be fulfilled by sexual/emotional/intellectual union with just any one person”, that is absolutely true–but they also cannot be fulfilled by a thousand sexual, emotional, or intellectual partners.  Desire is by its nature indeterminate, general, and open to always more and other.  You can always want something more or something other than what you have–especially if your desires are not subordinated to the belief that you can have a more perfect union through monogamy.

Shitting Bull

I said above that social constructionists commonly claim that what is socially constructed is arbitrary and oppressive, and subsequently point to some biological basis as justification for an alternative social construction–a basis that has no actual connection to the newly-proposed paradigm.  A problem with this is that the social constructionist thereby provides a false but seemingly legitimating claim, which is then adopted into the conceptual frameworks of individuals and social groups.  Because it is presumed as normal that culture is one thing and biology another, and that there are no real or essential connections between the two–connections of the kind that would let us establish non-arbitrary norms–we start to believe some real bullshit.

And if we’re going to stop believing in the bullshit, that means we need to understand that culture comes from nature, and can either cohere with it or contradict it–and if we want our culture to cohere with our nature (presuming that being coherent is superior to being incoherent), we need to understand what human nature really is.  Which ultimately entails understanding how we understand–not an easy task, and something really difficult to communicate.  But at any rate, until this monumental undertaking can be accomplished, we can probably help ourselves in the meanwhile by closing the mental gap between the biological and the social, rather than trying, with earnest but foolhardy intentions, to bridge it; for all our social constructions ultimately crumble into the abyss.

Divorcing Academia: Freedom or Despair?

I have spent two years on the academic job market and had only a very few first-round interviews, and one campus visit.  I did not bother to count the rejections; let us just call it a large number, nearing if not past 100.

Additionally, I have applied for at least 90 non-academic jobs in the same time-frame, probably more.  There, the closest I came was the final three for a job I had to convince myself I wanted, and, in the aftermath, am (still) very glad I did not take.  Admittedly, I have turned down many interviews, because I applied for the job without first investigating the company, and, investigating after receiving the interview request, said, “No thanks, charlatans/hucksters.”

But my heart has never been in applying for any non-academic job.  I am not being an arrogant son-of-a-bitch–at least, not too much of one–when I say that I am good at the meat of academia: I have three published articles, two (soon to be) published books, two book reviews, and three articles currently under review; another article almost ready to submit, and a third book for which I have about 50,000 words written, all of this in less than a year from defending my dissertation.  Granted, my articles are not published in “high-impact” journals.  Being a mixed medievalist-phenomenologist-semiotician means I am on the edge of everyone’s acceptable content-range.  Nevertheless, I think it’s hard to say that’s a bad publishing record; my books, at least, are both with highly reputable academic publishers.

I am also, from most of the feedback I’ve gotten (students and peers alike), a good teacher.

And yet, still, no one wants to hire me.


It could be that my doctoral program isn’t extremely well-known; but it is known somewhat, and respected–enough that I should receive attention from at least someone, somewhere.

It could just be that the market is particularly bad for my AOS/AOC this year–though the tendency away from metaphysical and towards “practical” topics of philosophy seems unlikely to ebb any time soon–and that, in a few months as the 2018-19 cycle starts, the market will be flush with jobs fit for me.

But I cannot wait for that.  It seems I may have to break up with academia… and I mean really and truly.  Not just “taking a break” and “exploring my options”.  I think I need to acknowledge that she is a faithless bitch and cut my ties.  Will this be freeing?  Will it cause me to despair?  Can I ever really be happy outside academia?  My thoughts, my research, my writing, my teaching; in some sense these seem like children, to me.  If I divorce the academy, will I still be able to be a father to them?  Or is it that I can only see them on the weekends, perhaps the occasional weeknight?

I do not think I will really feel free, to be honest, because the commitment is not externally imposed, but the consequence of an internal desire.  I would have to change myself; and I do not see that happening.

I interviewed for a last-minute, poorly-paying, one-year instructor position earlier this week.  It is a marginal step up from being an adjunct.  I’ll know their decision in a few days.  I have a couple other, long-shot applications out on the market, for which I have no expectations.

An Incoherent Imperative

Philosophy is not an empirical discipline and it cannot be conducted as such.  Though it may make use of empirical observations, and although many philosophers insist upon an empirical origin of our knowledge, the process of philosophical reasoning itself is not contained within the boundaries of either an empirical or an empiriometric methodology.  In point of fact, empiricism without the context of philosophy is an incoherent approach to understanding anything.

Among the most important of philosophy’s lessons is that the more we know, the more we realize that we do not know.  True comprehension is an elusive goal; and, yet, today, an arrogant presumption about the extent and profundity of human understanding is taken up in every quarter.  It cannot be doubted that the amount of information available far exceeds what was held by previous generations, but to confuse this information with knowledge (a habit), let alone understanding (an act), is to confuse the severed parts with the whole.  A pile of information no more makes understanding than a clump of atoms makes a human being.

The notion that empirical science will provide knowledge of any kind, therefore, seems rather misguided; for the empiriometric method provides us the means to gathering information, but not to incorporating it into the narratives whereby that information is made meaningful.  It is ironic that Western civilization purchased the means to improved information access and collection at the cost of the means to understand it.  This is especially true in questions of morality: what is revealed by empiriometric inquiries can be manipulated to fit whatever narrative, promoting or denigrating whichever agenda, and therefore gives us no foundation for saying what is right or wrong.

So how are we to reconcile the discoveries of science with our moral sentiments?  Of foremost importance is that we insist upon an approach to morality which makes of it something other than mere sentiment.  Empathy, like all feelings, follows the determinations of cultural contextualization.  That someone might feel empathy for another human being, or at least act with sympathy, in an idealized transcendence of distinctions of gender, race, sexual orientation, religious belief, and so on, is not a given.  The hows and whys of our positive and negative affections are indeterminate and open to nigh infinite degrees of alteration.

That said, I think it prudent to respond to three claims recently made by Brian Boutwell on Quillette.com:

1) We do not need specific versions of empirical realities to exist in order to realize that we’re all capable of suffering, that the properties of our central nervous system equip us to feel sorrow and anguish. We already have every reason we ever needed to advocate fairness and decency.

I do not mean to belittle a fellow academic, but this first sentence–and I promise you, it is not improved by context–is downright silly.  While Boutwell is doubtlessly trying to say that we do not need an empirical proof of equality to demonstrate our universal capacity for suffering, that universal capacity is itself a kind of equality shown by empirical evidence.  In other words, the claim is that we do not need an empirical account because we have an empirical account.

The second sentence is a more sophisticated piece of sophistry.  Everyone likes pleasure and no one likes pain (except those fun individuals who derive pleasure from pain; and even then, the pain is liked only because the derived pleasure outweighs the pain).  But this preferentiality gives us no reason, as such.  Why should the universal capacity for suffering be sufficient reason to advocate fairness and decency?  The statement resounds with the philosophies of the Enlightenment–Mill, in particular, comes to mind–but the classical liberal theory of natural human rights rests always either on a belief in empirically-discerned nature (discerned, at least, in outline), or on a belief in the divine granting of human rights; or, in the counter-Enlightenment, solely upon social convention.

2) My hero Charles Darwin, by all accounts a good and decent man, stole from us the idea that we are products of special creation, made in the image of a loving god. We were clumsily assembled piecemeal by an emotionless process of selection. And yet, our worth—the worth of all human beings—is not diminished one iota because of it.

[Image: Botticelli’s Aquinas.  Caption: “I don’t think that word means what you think it means.”]

Anyone who thinks that the Darwinian theory of evolution undermines belief in a divine creation does not understand the idea of creation as presented in any of its more intelligent articulations.  You might as well say that the Big Bang disproves that there could be a God (Fr. Lemaître would disagree with you on that, I’m sure).  I am quite certain, at any rate, that Thomas Aquinas–who admitted that the eternal motion of the celestial spheres was merely an account which saved the appearances, and not a certitude at all–would tell you that the how of a divine generation of human beings, in whom the image and likeness of the divine consists essentially in species-specifically human intelligence, is of little importance.


At any rate, the idea that our worth is not diminished by discarding the belief that human beings are of a special divine provenance sounds like boastful posturing.  As aforementioned, the classical liberal belief of the Enlightenment in human rights requires either that they be divinely granted or grounded in nature.  Otherwise, we must turn to a contractarian position, in which social agreement alone grants us this “worth”.  The dangers of such a turn should be quite evident; society is fickle, and the culture by which any society is united, malleable, open to perversion every hour of every day.

3) We should be united in the idea that nothing in science will overturn the imperative to treat all individuals as sentient creatures capable of feeling great happiness, and also great suffering. We do not need the natural world to exist in a certain way in order to ensure the moral worth of all creatures who inhabit it.

At this point, I must say–as a graduate professor of mine often would–that I’m confused.  I am pretty certain that either the contradiction has come full circle or a really unjustified and extraordinarily dangerous claim has been made.  In the first case, Boutwell is saying that because the natural world exists a certain way we do not need the natural world to exist a certain way; because we exist with a capacity for “feeling great happiness” and “great suffering”, by nature, we do not need–what?  An equal capacity for such feelings?  (Then, I ask, why is it that people should be treated equally if their capacities are unequal?)  Or perhaps I am missing something?

Or perhaps it is the second case: that Boutwell is saying we designate human beings equally worthy and thus deserving of fair and decent treatment out of a pure volition.  That we should be nice to one another because we want to; or because we fear what will happen to us if we do not.  If the only reason to advocate for fairness and equality is the fear that failure to do so might result in our being victimized in turn, then Glaucon and Adeimantus would like to sell you Thrasymachus’ house.  If, contrariwise, the only reason to advocate for fairness and equality is that we want to, that seems no stronger a reason than to advocate for survival of the fittest, for right of the strongest.

If we divide nature from culture (as do counter-Enlightenment figures, and which divide grows organically out of the philosophical presuppositions of Enlightenment figures, as well), and place the value of human life–and all the consequent considerations of a moral kind–squarely in the realm of culture, we cannot be surprised when there develop certain cultural perspectives which attack our own.  For culture stands on no firm, timeless, or cognition-independent grounds, but begins anew in the cognitive life of every human individual.

Perhaps what our culture needs is less sentiment, such as permeates Boutwell’s piece, and more reason; a reason granted not by empiriometric science, but by philosophical inquiry.  Perhaps, if we want to understand what makes human beings worthy of being treated with dignity, we should try looking for a way to make the empirical meaningful.

In Defense of Postmodernism

Words, as they are used, often do not make sense.  This does not, however, make those words’ uses nonsense.  If I say, for instance, that “we are in deep water, so we had better get a shovel,” this does not make sense, but it is not nonsense.  The mixing of metaphors confuses a meaning, but it does not deprive the sentence of meaning altogether.

In light of this very simple premise, I am going to make two points about postmodernism:

  1. Most of what is called postmodernism, including a great deal of gender studies, does not really make sense, but neither is it nonsense.
  2. Most uses of the word “postmodern” are themselves nonsensical.

Our first story begins with a wide chasm: on one side stand the “ordinary” people, including the empirically-minded and scientifically-grounded.  On the other side, we find so-called postmodern academics, residing predominantly in the humanities, especially literature, and the social sciences.  Each of the latter has studied increasingly complex and semantically-sophisticated texts for at least a decade, and every new conceptually-laden term, of which there is an ever-growing dictionary, widens the chasm.  This canyon was first carved quite a few hundred years ago, by men of the Enlightenment such as Thomas Hobbes, David Hume, and Jean-Jacques Rousseau–even though these men would likely find the present-day beliefs of the so-called postmoderns unintelligible–for they held as a premise that nature (or fact) and culture (or value) were essentially different and unrelated spheres of human activity, knowledge, and existence.

But empiricist Enlightenment writing, even at its most abstruse, seldom strayed far from the vernacular.  It held to a belief in human nature, and in the existence and importance of nature generally, even if it thought this nature was not essentially connected with culture.  Their concern in this separation was a practical one: which of the two separate spheres would control the other.  Contrariwise, the postmodernists, though still concerned with control, have wandered quite far from common language and nature both.  The linguistic obscurity can quite probably be laid at the feet of Martin Heidegger, who should be considered one of the fathers of a genuine postmodernism.  The denial of nature is a more complex thread, but one which develops out of Darwinian evolutionary theories, positivism, and the radical events of the twentieth century, through which culture rapidly complexified, distancing itself ever farther from the natural in ever more-oppositional ways.

I won’t bother with nature, here–too complex.  But language…

Heidegger was a controversial figure from the first day he came into the public eye (or perhaps even before).  His first major published work, Being and Time (1927), quickly became famous and just as quickly confused a lot of people.  It speaks, as you might expect, of being and of time, but with a radically new presentation which sparked no small amount of debate, including frequent accusations that his work was nonsense.  After all, the man wrote sentences like this: “World-time is ‘more Objective’ than any possible Object because, with the disclosedness of the world, it already becomes ‘Objectified’ in an ecstatico-horizonal manner as the condition for the possibility of entities within-the-world” (Sein und Zeit 419/471).  To most people, I imagine, that sounds like gibberish.  To someone who has made an extensive study of Heidegger’s other writing–especially his courses and lectures from 1925-1930–it makes more sense.  But even with such study, that sentence poses interpretative challenges.

Nevertheless, Heidegger captured the attention of countless thinkers in the twentieth century.  His thought was new and exciting.  He had an aura of mystery that was not squelched even by the postwar suppression of his ability to teach, on account of his affiliation with the Nazi party.  Thinkers still flocked to his work and often strove to meet him when they had the opportunity; even Frenchmen who had suffered at the hands of the Germans, such as Jean-Paul Sartre, who considered himself Heidegger’s ally in thought.  But many others, even those who rejected his ideas, adopted his manner: especially the school of Critical Theory, founded by Max Horkheimer–a contemporary of Heidegger’s who attended some of his lectures in the 1920s.  Though Horkheimer fundamentally disagreed with Heidegger, he and his school–comprising such members as Theodor Adorno, Jürgen Habermas, and Herbert Marcuse–nevertheless took lessons from Heidegger’s treatment of language.  For much of Heidegger’s philosophy revolved around his re-interpretation of common words, his (oftentimes disputable) readings of their etymologies, and the ease with which the German language permits the construction of new, multi-dimensional compound words.  His was a search for words that would make sense of difficult and abstract concepts, concepts that did indeed deal with multiple dimensions.

But those who followed his style, if not his thought (and many who followed aberrant interpretations of his thought), did so often from the perspective that culture is entirely a social construct, bearing no relation to what is natural; from the presupposition that human persons are primarily what exists in the cultural realm, and that what they are as biological ought to be forced to adapt to the cultural.  Add in a heavy dash of Marxism–keeping in mind that the school of Critical Theory grew up in Nazi Germany, against which one of the opposing cultural forces was Marxist Socialism–as well as the linguistic semiology of Ferdinand de Saussure, and the postmodernist “Theorists”, capital T, are born; or, rather, socially-constructed out of intellectual artifice.

The resulting jargon-laden books and articles are immensely difficult to understand, and when understood, often lack clarity or precision.  They frequently do not make sense because the objects of their reference have no genuine grounds, like a continual mixing of highly abstract metaphors.  But they are not nonsense.  The word “performative” sounds silly, because it is, particularly when applied outside the context of artistic performance.  But it is not nonsense.  Nor is talking about “performative masculinity”; all this means is acting in a way which corresponds to a concept of what it means to be masculine.  That actually makes sense.  What does not is the presupposition that masculinity is a purely social, artificial construct, and that therefore masculinity is itself constituted through such performance.  But even that is not nonsense.  It is the result of a theory which, while wrong, is not unintelligible.  Presupposing the truth of its foundations, it is quite coherent, and would make sense to someone well-studied in its concepts and terminology.

Consequently, when Peter Boghossian and James Lindsay denominate their attempt at a Sokal-style hoax, “The Conceptual Penis as a Social Construct”, as nonsense (a word they use 17 times in their post-hoc report), they engage in counterproductive hyperbole.  The Sokal hoax and its follow-up in Fashionable Nonsense likewise failed to aim at the correct target: for while Sokal more poignantly needled the terminological ambiguities of Theory, and particularly exhibited its readiness to accept whatever means promoted its agenda, especially support from science, the problem with Theory is not the conclusions that it reaches (and there are plenty of serious Theorists who do not promote the absurdities you will see from New Peer Review), but the foundations on which it rests.

And so this brings us to our second story: for the foundations on which so-called postmodern Theory rests are not really postmodern at all, in any meaningful sense.  To be post-modern would mean to be after what is modern; and from an intellectual, philosophical, academic point of view, modernity is defined by its first and crucial presupposition: namely, that what we know are our own ideas.  In other words, the direct object of my knowledge is the idea that I alone possess.  Knowledge is therefore essentially private, subjective, and only incidentally and occasionally capable of being shared with others.  This premise is equally true of Cartesian rationalists and Lockean empiricists, of Humean sceptics and Berkeleyan idealists, and it is true of the vast majority of so-called postmodernism, as well.  Mathematical representations or organizations of empirical observations alone consistently avoid the subjectivization of experiential knowledge which characterizes modern and pseudo-postmodern philosophy alike.

Thus I said that Heidegger could be considered a father of a genuine postmodernism because he patently denies this idealism (even though many, in following him, unconsciously seem to adopt it).  His philosophy of Being-in-the-World, of Dasein, denies the initial presupposition of subjective-objective opposition upon which modernity was founded.  This includes the denial that the human being’s context-driven development of personhood is independent of (or should be independent of) the biological and the natural, which idea he criticizes implicitly in his lecture on Aristotle’s Physics, his lecture on the “Age of Ideology”, and at length in his Letter on Humanism.

[Image: C.S. Peirce (1839-1914)]

Ironically, the more important father of postmodernity died before Heidegger had written a single word of those texts: namely, Charles Sanders Peirce.  The full story of how his semiotics transcends the subject-object divide with more clarity and success than Heidegger achieved is a lengthy one.  In short, his theory of signs shows the possibility of an essential continuity of the universe, from the most fundamental particles to the most abstract concepts.  As a result, he repudiates idealism: specifically, its nominalistic denial that the mind is capable of knowing extramental relations.

In contrast, what is most often today called postmodernism–authors such as Barthes, Deleuze, Lacan, Derrida, Foucault, Rorty, and so on–is in fact nothing other than ultramodernism.  It takes the Enlightenment division of nature and culture, and in consequence of the Darwinian destruction of the concept of fixed natures, runs unimpeded into nominalistic idealism.

Yes: it is absurd.  It is far removed from common experience.  But so is advanced mathematics, and quantum physics, and neuroscience.  The truth is, there are cracks in the foundations of the physical sciences as well.  The kind of scientism advocated for by the “New Sceptics” stands on the ground, as opposed to the cloud of abstraction in which we find Theory; were this scientism to look down, though, instead of screaming at the clouds, it would see that its ground is a melting ice floe.

Motes, planks, and eyeballs: you get the idea.

Tone Deaf and the Impotent Sadness

The End of Semester, End of Year Post that Indicates the Continued and Likely-to-Continue Awfulness of Everything

or

I Spilled My Thoughts on This Page and This Horror is the Result

By now, an awful lot has been written about the election and why it is that Trump won.  Much of it is horribly reactionary (like Trump himself), much of it inane, and much of it written in the very spirit that persuaded enough of the country (most of it, by landmass) to vote against the various ideals which Hillary represented.  Other pieces have been quite good and insightful, critiquing that very hyperbolic tendency which has been the chief failing of the regressive left.

None of this writing, of course, changes the fact that the United States of America elected a man who is a notorious liar, a braggart, someone who readily flip-flops on any position to his advantage, and who, in the course of this unprincipled behavior, may have shackled himself irrevocably to white ethnonationalists who share deeply disturbing commonalities with the German Nazi party.  I would not be surprised if Richard B. Spencer saw himself as cut from Adolf Hitler’s mold, hoping to follow his path to a position of leadership.

But this essay is not about the election, nor about white supremacists, or nationalism, or the deplorable state of our political system, its parties, or any such temporal and (it is to be hoped) changeable phenomena.  Rather, this essay is about habit.

Something about Perspective

Most of us, upon hearing the word “habit”, think of some mundane daily routine or action.  I have a habit of going to bed late.  I have a habit of scratching at my beard while I am thinking.  I habitually drink two cups of coffee in the morning.  Habits may range from the most trivial–cracking one’s knuckles, say–to the most important routines of a life: prayer, saying “I love you” to a spouse, charitable giving, or how one goes about working.

Like many of our English words, habit comes from Latin: proximately habitus, meaning a thing’s condition, and more remotely, habere, the verb meaning “to have, to hold”.  The Latin habitus was often used to translate the Greek ethos–connected to ēthos (character) and ethikē (ethics)–meaning something much more like our English habit, although narrower in focus.  Many of our habits begin and persist without conscious decision.  For the Greeks, especially Aristotle, habit was principally and primarily the result of consistent and deliberately-chosen action, to the point where it became automatic.  But much like the modern appropriation of the word ethos, habit also means having a kind of worldview: for the kind of person you are, the individually developed persona of the common human nature, determines the way in which you view the world.

So when I say that ours is a society of really bad habits, I do not mean that we chew our nails or smoke cigarettes.  Rather I mean that we have formed ourselves poorly in regards to how we both view and hold ourselves in relation to the world.

These poor habits seem to emerge from what I can only describe as an insipid attitude towards pleasure and pain: namely, that the former is the end goal of all our actions and the latter an evil to be avoided at nearly any cost.  But press the average person in conversation about what he or she means by the term “pleasure”, and you are likely to hear little more than some vague description of a “good feeling”; conversely, “pain” receives little more verbiage than “a bad feeling”.  Despite this amorphous understanding, we spend a great deal of time pursuing pleasure and striving to avoid pain, or its lesser cousin, discomfort.

I may be in the minority in this, but I think directing our lives by principles we do not really understand is a bad way to live.

Pleasure and Pain

What is pleasure?  Some may say a good feeling; more advanced sophists may say it is the result of dopamine’s activity in the brain.  The latter explanation actually tells us less of importance than does the former.  “Feeling” is a deeper and richer concept than a neurochemical reaction.  The feeling typically called pleasure undoubtedly incorporates such a reaction–but it is a part, and to take it for the whole is simply asinine.

Pleasure is, rather, the immediate subjective occurrence consequent to something experienced as good.

What is pain?  Some will likely say a bad feeling–or perhaps, the result of a “chemical imbalance”, the body’s notification of a wound, or the feeling of loss.  This lattermost is the best, as it comprises all the rest: for pain is the immediate subjective occurrence consequent to something experienced as bad.

Academic Pain

At a certain point in the hunt for an academic job, it becomes difficult for many (like myself) to maintain the belief that we do not suck at what we do.  Since late 2014, I have been turned down or ignored for over 30 professor or instructor positions and fellowships.  In the past year, I looked outside academia, applying to roughly 35 companies for positions roughly suitable to my skillset and experience.  One of them offered me a job–sort of–but wanted me to start at an impossible date.  I recently put in 2 non-academic applications (they take so much less time) and have 5 academic openings on my “to-do” list.

Between both academic and non-academic, I have had fewer than 10 interviews.  Those 5 openings will sit on my list till the last minute, partially because they’re tedious, but also, I must admit, because I’m a bit afraid of another 5 rejections.  After all, maybe I suck.

At the same time, I am a published author, with 1 and likely 2 books soon under contract with reputable academic publishers, as well as a sprinkling of articles and recently a book review.  I wrote the majority of my highly regarded, well-praised, 187,000 word dissertation in less than a year.  Maybe I’m actually amazing.

The truth is likely somewhere in-between.  More importantly, even if I am amazing, I am also unwanted.  My dissertation was on a medieval thinker’s metaphysics and epistemology.  I’m currently working in phenomenology and semiotics.  I don’t play with the hot topics: feminism, ecology, social and political theory, bioethics, non-Western thought, or any other such.  They are not interesting topics.  They are movements.  I am fixated on the fixed, looking at the currents only insofar as they help us to see the regular patterns.

But still I wonder–am I maybe not that good at what I do?  So bad, in fact, that I cannot even be trusted to do competently things vaguely related to what I do?

The Slippery Slope to Self-Importance

For those of you unaware, academia currently produces roughly 400-500% more job-seeking PhDs than there are full-time jobs open every year.  We are the architects of our own demise.

“Don’t tell me the U.S.A. went down the drain because of Leftism, Knotheadism, apostasy, pornography, polarization, etcetera etcetera.  All those things may have happened, but what finally tore it was that things stopped working and nobody wanted to be a repairman.” – Walker Percy, Love in the Ruins.

Every academic I know hates grading.  Grading season is a popular time for academics to get on Twitter (or write a blog post) and spend hours complaining about how much they hate it, especially how long it takes.  A professor with whom I used to live would often remark that students have always known less than they ought, but that nowadays they know nothing.  They consistently provide ample proof of this, especially in their essays.

This semester, as usual, I tried to leave some of the best final essays for last.  It is nice when your grading can end on a good note.  To my disappointment, the final two fell below expectations.  This was actually true of nearly every essay I graded in the past week.  One exceeded what I anticipated, marginally, but the expectations were pretty low to begin with, so little cause for celebration there.

In fact, looking back at the whole semester, the disappointment was consistent.  It is not my students’ fault–they have been poorly educated in the kind of thinking necessary to doing well in the humanities, likely, their whole lives–but it is certainly their problem.  And yet, if I grade them justly, I will undoubtedly suffer blowback, either directly from the department or indirectly from student evaluations and online reviews.

It is frustrating to me.  Here, I see the problems so clearly.  For instance, it is evident that if my students were ever taught grammar, it was long ago, and the lessons were not taught in a way that stuck.  I do not expect explicit knowledge of grammatical terms–say, a dangling modifier, or to be able to understand me if I say, “the appositive construction”–but it would be nice if they had, at least, the habit of not writing with ubiquitous and poorly-used comma splices.  But what can I do?

“Seeing problems clearly” may be the universal curse of the academic.  We all do really know things, and more often than not, this leads us to believe that we know a great deal more than we do.  I say to myself, quietly, in my head, all the time: “Ah, but you are a philosopher.  You are not like those other ‘experts’ out there, overly-specialized, attempting to reduce all things to the narrow paradigms of their own specialty.  You study wisdom!” and so on.  All true, really; but just because I maintain the openness innate to true philosophical inquiry and I study wisdom (whatever that means) does not mean that I have attained it, or that the solutions I envision to our complicated problems will actually produce results truly good.  I like to think they would.  But we will never know until someone puts me in charge.  So we will never know.

So even now, I sulk, like Achilles in his tent (Read a book!), frowning over the injustice of lesser academics having been awarded higher places.

Bad Professor Club

While in grad school (oh hey, I can say that now…), my fellow students and I referred to ourselves as the “Bad Grad Students Club”–largely because of an overpreening peer who insisted that unless you were reading a bare minimum of three scholarly articles every day, you were not really a good student.  As most of us transitioned to being ABD adjuncts, we became a Bad Professors Club–mostly because we would more or less wing our classes, rather than prep, so that we could go drink at the bar up the street.  What can I say?  They had a terrific happy hour.

Nowadays, I find myself a bad professor of another sort.  For one thing, having relocated, I no longer have the familiar cadre of fellow delinquency and booze enthusiasts.  This has much improved the lot of my liver, I imagine.  For another, I formerly taught at a school that, though moving in the STEM direction, was still predominantly a liberal arts institution.  Though most of the students in my classes were not majoring in the humanities, a strong-enough core curriculum meant that they were suffused with enough of the education and atmosphere of the humanities to attain (at least on occasion) a deeper-than-superficial grasp of the issues discussed.

Meanwhile, I spend my non-teaching-duty days consumed in the writing of a book likely to end up on a handful of shelves and be cracked open only half as many times: a book dealing with the sophisticated topics of semiotics and phenomenology, slipping through Latin and German technical terminology, exploring the esoteric writings of a never-fully-repentant Nazi and a manic depressive (both of them with notorious adulterous affairs) as an antidote to both the materialistic, neuroreductionistic and the angelistic, deus-ex-machina explanations of human knowledge.

I am a bad professor.  Not because I do not know my material, nor even because I am uninterested in it; rather, I am a bad professor for these students.  They need a disciplinarian.  They need someone to break their bad habits, habits grown in an insular and truthless (not “post-truth”) world.  Instead, I ignore their bad habits and continue to pass them along (albeit begrudgingly, and with a low grade), because I am convinced that someday I can become the “successful” academic I have long expected myself to be.

A bad habit.

The Intoxication of Mild Pleasure

I am no stranger to indulgence.  I have gone too far with alcohol, tobacco, even some of the gentler illegal substances–and more, besides.  But these pleasures are typically laden with a context that gives them purpose.  For a time, I drank not to feel; but mostly, I drank (and smoked, though that takes on a life of its own) to better enjoy the company of others.  Even sex can be–and I think ought to be–a means for mutual betterment.

Where I mostly lapse into the pursuit of pleasure for pleasure’s sake is in those milder pleasures.  I may swallow my fire, but I take my poison slow.  By this I mean: the feeling of a girl’s thigh under my hand, a marathon of mindless TV, an engrossing video game, hot showers, the incessant music by which I block out the noise of the world, and always the ability to control the objects to which I am exposed.

This, I think, is common.  I am no exception to the rule.  These mild pleasures–really, they are comforts–do us, in themselves, no grave ill in moderation.  But we are drunk on them, all the time, and find the sobriety of discomfort horrifying.

We find a prominent example of this discomfort and its attendant horror in the vocal Left’s reaction to the Trump presidency.  No pleasure has actually been taken away yet, but the mere thought that one might be is enough to discomfit to near the level of garment-rending and hair-tearing.  Infantile wailing and gnashing of teeth remains a perfected and oft-practiced art.

That so much can be made of a presidential election shows, I think, a deep problem: a nation of 300+ million cannot reasonably agree on the wide bevy of issues we have placed in the hands of Washington, and a return to governmental decentralization is necessary.  But that is another issue altogether; indeed, interruptions to our comfort seldom take so dramatic a form.  Typically, they may be assuaged, smoothed over, and our comfort restored in a day, an hour, a minute, a few seconds.  Woke up on the wrong side of the bed?  Just get a good night’s sleep, and tomorrow usually improves.  Internet service out?  Use your phone’s 4G until it is restored.  Noisy neighbors?  Use your headphones (okay, sometimes that doesn’t work).  Need a new coat?  You can get it for 50% less than anywhere else, from Amazord dot netweb, delivered to your door in two days–or less!

“But why not?” you interject, borderline indignant at the perceived slight.  “Why, Mr. Dr. Prof. Krampus, should we not enjoy these things?  Should we not be comfortable?  Aren’t you writing this from your Macrohard Expanse III, while pumping Amazord Muzak into your ears to block out noisy neighbors and city business and such?”

Indeed I am–as stated, I am no exception.  For I am soft and fragile like the rest of you, wrapped in my cocoon of comforts; if I am in any way an exception, it is that I have taken to hanging mirrors on its interior walls.  Self-absorption does not mean becoming lost in introspection; rather, it means becoming ultraselective about the boundaries put upon one’s own “world”.  I am self-absorbed, and the boundaries–the walls–of my world are mortared by bad habit.

Locating the Source

Where do these bad habits come from?  Are we victims of circumstance, of nature?  Genes, family?  Upbringing, destiny?

There are certain unverified stories said to be so good that, if they aren’t true, they ought to be.  One such story concerns the Catholic writer and apologist G.K. Chesterton, an Englishman who wrote in the early part of the 20th century.  When the Times of London asked a number of authors, Chesterton among them, to write an essay on the topic of “What’s wrong with the world?”, his response was:

Dear Sirs,

I am.

Respectfully yours,
G.K. Chesterton.

Put less civilly, we could also quote an inexplicably insightful madman in Walker Percy’s Love in the Ruins: “You want to know your trouble?  You don’t love God, you love pussy.”

It is a philosophical claim the defense of which stands outside this article’s rambling ken, but the self and the source of one’s love are about as closely identifiable as any two elements in a human person can be.  What is my problem?  I don’t love God.  I love pussy.

God or Pussy?

First of all, “pussy” isn’t always sex.  It may be a video game.  It may be art.  It may be the cause of racial equality.  It may be the warmth of comfortable surroundings.  “Pussy” is that which distracts us, diverts us from a fuller, better way of being human.  “Pussy” is that out of which we build our cocoons.  In contrast, “God” isn’t always God, though one might argue that it ought to be.  God is, to paraphrase Christ, the Light, the Truth, and the Way.

How are we to walk that Way?  How can we see by that Light?  How might we grasp that Truth?  The first step, I think, is to muster the courage to poke a hole, ourselves, in the fragile walls of our cocoons and make ourselves look out.

And keep looking.

Why Everyone Needs to Calm Down

First, I’m going to call shame.  I saw innumerable tweets last night calling over 59 million Americans racist, homophobic, xenophobic, sexist, horrible deplorables, and so on.  I’m sure there are many Trump voters who are those things.  But I am also certain that there are many more who are not.  Some of my own family voted for Trump, and they are certainly not any of those things.

If anything, those harsh accusations help explain why Trump won.  Middle-class jobs have been disappearing for years, and that trend seems likely to continue.  For people without a college (or higher) education, this means a path to nowhere.  Low-income jobs typically do not lead to career advancement or significant raises.  As college becomes more expensive (and grad school more so still, not only financially but chronologically), this turns into a vicious cycle from generation to generation.  It is a problem which has historically affected minorities disproportionately–but it has affected whites as well, and whites (non-Latino/Hispanic) make up the majority of the population.  80% of 13.2% (the African American share of the population) is still less than 20% of 62.6% (the white share).
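To make the arithmetic explicit: 0.80 × 13.2% comes to about 10.6% of the total population, while 0.20 × 62.6% comes to about 12.5%.  Even if a problem afflicted four-fifths of the smaller group and only one-fifth of the larger, the larger group would still contain more afflicted people.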

Add to this that many white people who do succeed are told that they succeed primarily because they are white, and it’s not hard to see why they might become resentful.  They’re told that they need to support and accept people who are different, less privileged, who come from cultures they don’t understand, of whom they have some reason to be suspicious–not a good one, but still a reason–and that if they don’t put their own needs after the needs of foreigners, they’re racist and xenophobic bigots.

I’m not saying they’re right.  I’m not saying that they have the proper perspective.  But, my God, for all those who preach tolerance and universal acceptance, you might want to start in your own backyard.  Are they ignorant of other cultures?  Definitely.  Are you ignorant of theirs?  I think this election shows that you quite probably are.  Flyover country–and that includes rural Pennsylvania, Ohio, Michigan, and Wisconsin–has long felt ignored, neglected, and abused.  To be honest, I don’t really blame them.

Second, I’m going to exhort you to take a very deep breath.  Donald Trump may soon be the most powerful man in the world; but he’s still just one man.  It is regrettable that the power of the presidency has been so dramatically expanded in the past 16 years–for which we can blame both Bush and Obama–but it is still the power of one man.

Additionally, that man is a notorious liar.  Bush promised smaller government for everyone; it mushroomed into a revolting behemoth during his 8 years.  Obama promised affordable healthcare for everyone and a withdrawal from military interventions, yet many people’s rates went up 100% or more, and he ordered 10 times as many drone strikes as Bush.

Trump made outlandish, vague, poorly-defined campaign promises.  Do we really think he’s going to keep any of them–when he has switched political parties five times since the late 1980s?  He was for abortion before he was against abortion.  He donated to Hillary Clinton’s Senate campaign.

So read his acceptance speech and tell me if you think he’s going to stay hard-line on his promises, or if he, like nearly every other successful politician in the history of the world, made his promises to win votes.  The cynic in me–which is all of me–thinks that we’re really just in for more political words coming up empty.  And today, that’s a good thing.

I could be wrong.  Let’s hope I’m not.