More on the “West:” The “Alt-West” and Islam

I have written before about the vagueness and unhelpfulness of the concept of “the west,” or “western culture.” Today I read a comment on some social media post by someone claiming to be part of the “alt-west,” which, according to this commenter, is a project about “saving Western culture.” You may rightly ask: saving “the west” from what? Well, the “threats” of Islam and Marxism and, of course, that ever present danger of “multiculturalism.” The next question one may ask is what these “threats” will do to the “west?” The commenter explains: “You can’t have anything close to a Free Market if Islam is pervasive in a culture. They shut down and destroy almost as much as Communism.”

This is an extreme claim, which many who share the ideology of this commenter would deny is in any way bigoted or a gross misrepresentation of Islam. However, this claim is bigoted and a severe misrepresentation. First of all, no religion, not Islam, not Judaism, not Christianity, not Hinduism, not Buddhism, NO RELIGION shuts down and destroys things, especially not a culture. The fear that Islam will become a “pervasive” religion and “destroy” “western culture,” or “a Free Market,” is utterly ridiculous, completely intolerant, and deeply, deeply, deeply incorrect. No religion, no ideology has causal force. People, who may or may not adhere to a particular religion and/or ideology, shut down and destroy things. Those people may be communists, or Muslims, or socialists, or Christians, or part of the “alt-west,” but it is not the ideology that is destroying things, it is the people; possibly motivated by their ideology(ies), but possibly not. To worry that some mystical force inherent to Islam, or communism, or multiculturalism will destroy “Western culture” is incorrect to the point of ignorance.

This arrogant “Western” savior’s display of ignorance doesn’t stop at claiming that Islam will destroy “Western culture;” he claims that Islam already has! Here is what he claims: “Islam caused the Dark Ages in Europe because it destroyed so many trade routes through piracy and attack that it shut down massive amounts of commerce across Europe.” That’s right! It wasn’t Visigoths, or ineffective rule, or civil war, or religious controversy surrounding Christianity, that precipitated the collapse of the Roman Empire, but Islam destroying trade routes! Someone better tell the classicists and historians that they’ve been wrong all along, because the “alt-west” has set the record straight. If by “set the record straight” one means completely misunderstand and misrepresent history to fit a misinformed, ahistorical, and reductionist worldview. It may occur to readers that I am not being kind and charitable either in my reading of this person’s commentary or in my refutation of his ideas. Certainly, I am not.

It is true that charity, understanding, and kindness in reading and refutation of ideas, and in all aspects of life, are virtues. However, there are times when it is impossible to read charitably or refute kindly. Charitable reading and kind refutation require the material to have some depth, some level of informed argumentation; even if this depth is odious or the information informing the argumentation is despicable, it must be present. But a comment like this contains no real depth or information at all; of course, social media comments cannot be expected to contain any such depth or information, though it might be nice if they weren’t completely misinformed. In any case, this commenter doesn’t seem to care much about charitable readings or kind refutation; he responded to a challenger by calling him a “cultural suicide advocate,” whatever that means.

There are no “threats” to “western culture,” because there is no “western culture.” People change, cultures change; as Heraclitus would say, “everything changes.” If Islam, communism, or multiculturalism becomes predominant in the “West,” it will not “destroy” the “West,” it will change the “West.” Change is not destruction. Surely, one may dislike these changes, may even push back against them; however, to push back against these changes one must understand them, must understand the factors contributing to them, must understand that pushing back is not the “salvation” of some threatened, stagnant culture. If these people in the “alt-west” believe they are engaged in “saving Western culture,” they are deluded. Let us all remember some of the best advice Spinoza ever gave: “Do not weep. Do not wax indignant. Understand.”

If you wish to claim that there are issues within the ideological system of Islam, communism, or multiculturalism, immerse yourself in the literature of these fields, become an expert, and push back; from afar it is easy to see superficial problems and make superficial (often ignorant) claims, but from within one can see real problems and make real (charitable and understanding) claims. Though someone arrogant and ignorant enough to claim that Islam caused the fall of Rome seems content with superficial problems and superficial claims about ideologies of which they have no actual understanding.


A Brief Critique of Laissez-Faire Capitalism

Introduction

For a long time I have been a supporter of an economic system that is usually called “laissez-faire capitalism,” or the “free market,” or “freed markets,” or really any number of terms. During much of this time I was unabashedly supportive of this system and saw criticisms of the market as wrong-headed and, usually, simply ploys by lovers of big government. However, I have recently come to realize that criticism does not imply total rejection, that one can critique a system while still believing in it generally; furthermore, I have come to realize that there are some legitimate criticisms of the free market system that can and should be addressed. I believe that the issues raised in this all too brief critique of the free market system can and must be solved in peaceful, non-governmental ways; though many believe that the only alternative to free markets is planned markets, and, thus, that any criticism of the free market must be in support of planned markets, this is not the case. The solution to the problems of the market is not a rejection of the market, but a betterment of the market. This brief critique will cover only two things: (a) the problems with the idea of “self-ownership,” and (b) the moral effects of the market. It must be stated at the outset that this critique is of my own personal beliefs as they stood for some time; it is not a critique of any particular thinker or school of thought. If there is relevant reading that you think I ought to be aware of, please leave a comment.

The Problems with “Self-Ownership”

One of the fundamental politico-philosophical assumptions of most of American politics, especially on the libertarian-right, is the assumption that individuals “own themselves.” At first this seems to be an eminently reasonable position positing complete control and autonomy over the self; however, though the principle of personal autonomy is a good and salvageable principle, the language of self-ownership is deeply flawed. The language of self-ownership leads to dangerous and unpleasant conclusions if manipulated correctly; the best way to avoid this manipulation is not to continue using and refining the language of self-ownership, but to use a better and clearer language when talking about personal autonomy.

The fundamental flaw with the language of self-ownership is that it speaks of “ownership” in and of “selves.” Ownership in all cases implies a right of property; one does not speak of ownership in non-propertied entities. In all cases, ownership entails property; even in the metaphors of “owning one’s mistakes” and “owning up,” the metaphorical referent is a property-like control over one’s mistakes. Thus, “self-ownership” implies property in and of the self; the holding of the self as property. However, selves are not property and must not be thought of as property. If there is property in the self, then it is easy to get to a position in support of a kind of slavery.

If one owns oneself as one might own land, it is clear that one can trade oneself to someone else, as one trades land. Thus, the very idea so often put forward in opposition to slavery, i.e. since you own yourself, no one else can own you, actually leads, at least in language, to an endorsement of slavery, albeit “voluntary” slavery. However, and this may be one of the hardest things for an “anarcho-capitalist” or libertarian to accept, the mere fact that something is voluntary does not make it morally correct. It might, in an ideal legal system, make something legal, but what is legal and what is moral are not the same thing. It may be legal to voluntarily end a person’s life in certain circumstances; however, it is not moral to do so. Similarly, voluntary slavery, though it may be legal in the idealized legal system of the “anarcho-capitalist,” will never be moral. Personal autonomy exists outside of ownership of the self. One is autonomous even if one is unowned, and since persons are unownable, autonomy and responsibility are all that is left for the individual. We must put the language of “self-ownership” to rest and in its place speak of self-autonomy and self-responsibility.

The Moral Effects of the Market

I hinted in the last section at the fact that the pure free market system, as many conceive it, rests on some questionable moral precepts, such as the idea that the voluntary nature of an action speaks to the morality of that action. Clearly, it is true that coercion and force necessitate immorality; however, the mere fact that voluntary actions are not forced does not make them any more moral. The morality of an action is not determined by the forced or unforced nature of the action. Slavery is immoral, period, not “immoral only when it is forcible slavery.” Similarly, just because something is produced and sold on the market does not make that thing good or moral or something people ought to want. Indeed, this is one of the biggest problems with free market ideologies: in assuming the goodness of the market there is a failure to question what ends people ought to be achieving and what means they ought to be using. Obviously, there is no clear cut answer to this question; everyone must, more or less, choose their moral system, and this moral system should instruct them on the ought. However, the amoral nature of the market seems often forgotten by its defenders.

Often defenders of the free market seem to fall into a pattern of thinking that holds that if consumers want something and the market produces it, then it must be a good thing; since people’s actions on the market are nominally rational actions that indicate wants and desires to producers, and since it is out of place to question what the consumers want and what the market produces, one concludes that the morally good will out in the market. However, this is clearly not the case; look at the United States, which, though not a perfect free market, has a flourishing system of markets in which consumers want things and producers make things that cannot be considered morally good. Furthermore, the assumption that consumers will indicate to producers what they want for the price they want on the market system does not seem perfectly sound. How many times has a person thought to themselves, “this price is outrageous,” but still bought the product anyway, thus indicating to the producer that the price is right? And how many times have producers made and marketed products that no one had thought of wanting only to later be convinced that they must have it? People are imperfect and, thus, any human system is also imperfect, including the market.

To make the free market a better system we must engage with moral questions concerning the market. We cannot assume that the market will produce morality without input from other institutions. The market is amoral and can go in any direction; it can lead to a building up of moral fiber or to decay, and the choice is up to us. We must have moral systems in place to keep the innovations of the market in check; we must have moral systems and institutions that help people (not through force but through persuasion) figure out what they ought to want and ought to do. Only with proper checks can the market lead to a flourishing society; without them it will become progress towards the abyss.

Conclusion

This critique has been brief and there is much more to say in support and in criticism of the free market. I want to reiterate that critique is not rejection; that the solution to these criticisms is not a wholesale abandonment of the market, but simple, non-governmental solutions. We must change how we talk of selves: we must not talk of ownership, and thus property, in and of selves, but of autonomy and responsibility. We must not assume that the market will produce or produces morality; an amoral system can progress in either direction. We must have systems and institutions in place that aid in the creation of morality and help individuals decide what they ought to do, buy, and make. There are two ways to progress, both on the market and in society at large: progress towards the good or progress towards the abyss. The choice is up to us; though it is this latter course I fear that we are already on, maybe irretrievably.

Please leave questions or comments below; especially if you have suggested reading for me! Thank you and have a blessed day!

“The Right Side of History”

In modern American political discourse there is much that doesn’t make sense when analyzed. It all sounds convincing, but if seriously considered it lacks any real depth or breadth. The phrase “on the wrong side of history,” and its opposite “on the right side of history,” are perfect examples of this phenomenon. It sounds highly convincing to cast one’s social-political views and policies as the “right side of history,” and those with the opposite opinions as on the “wrong side of history.” Yet, what does this mean? Usually it is used as a moral statement, i.e. said policy is right morally, but as a moral statement there is no need to reference history; the more sensible statement would be “on the right side of morality,” which carries much more depth and breadth, if backed by principled moral convictions. However, “on the right side of history,” explicitly mentions “history,” leading one into dubious and unsafe territory.

If “on the right side of history” is a moral statement, then a claim about history is being made, viz. that what is morally right will win out historically. This idea is dubious to say the least. What reason do we have to believe that what is morally correct will win out; that what we believe right will be on the winning side of history? I would posit that we have no reason to see the world this way; in fact, I would be more comfortable (but only a little) accepting the converse, viz. that moral rightness will lose. However, I am comfortable with neither the positive nor the negative view, for it seems more sensible to hold that history has no system, no agency, and no bend.

History is a story of ideas and the people that had and used those ideas; it is not a real thing that itself acts or tends to anything. To understand history as a narrative is to reject all interpretations of history that hold historical change as the chief principle, e.g. Hegelianism (“Historical change, seen abstractly, has long been understood generally as involving a process toward the better, the more perfect,” 1953, p. 68; “Historical development, therefore, is not the harmless and unopposed simple growth of organic life but hard, unwilling labor against itself,” 1953, p. 69) or the Whig interpretation of history (Butterfield, 1931). History as a narrative is to be understood, as with any narrative, not as a thing with its own agency but as the neutral story of ideas and actions. One would not say that a protagonist is “on the right side of the novel,” because such a statement is ludicrous: it attributes the correctness of action not to the protagonist or the author but to the neutral explanation of actions. To give agency to history or to a novel is to sap agency from the lived actors, whether real or fictional. Thus, when one says that they are “on the right side of history,” they are removing agency from themselves. Their morality is not real morality but only morality preordained by historical change. They have no agency for their actions but are a purely mechanistic instrument of the unchallengeable process of history.

In my opinion, if something is moral it is moral regardless of institutional or societal outcomes (later to be called history); had the Axis powers won WWII their actions would not have become morally right simply because they were “on the right side” of their historical narrative. Therein lies the rub: history as a narrative leads to the unsettling consideration of who writes the narrative. No two people see the same thing the same way; thus, each narrative will be different. Evidence and facts, the two champions of thinking in the scientific age, are not, in fact, either stagnant or universally the same. Two people may well look at the same evidence and, influenced by their own opinions, draw two different conclusions. Facts are no sure method of arriving at the “truth,” a thing which doubtless exists but which, due to humans’ remarkably limited sight and ability, will never be fully obtained in this world. Facts and evidence are belief-dependent; thus the narratives built upon them are as well. There is no right or wrong side of history, only different narratives. History then cannot be justification for morality or policy.

What is Western Culture?

There is a lot of talk in some quarters, particularly on the “right,” of reclaiming or preserving “western” culture. I am unsure what they mean by this. What is “western culture”? What are these predominantly white and American individuals interested in “reclaiming” or “preserving”? It is tempting to write them off simply as white supremacists, as assuredly some of them are, thus avoiding the whole question of their meaning. However, it is far more interesting to ask, “what do they mean, what do we mean, by western culture?” What is western culture? Do some groups count as “western” while others count as “non-western?” What about “eastern culture?” What is it to be “western,” or “eastern,” or “non-western?” Are they actual and helpful cultural categories, or do they simply mislead and confuse?

If by western they mean anything geographical, they are left with the task of telling us what (geographical) western culture is. What do we all share here in the western hemisphere? It isn’t language; the majority of this hemisphere speaks Spanish, while others speak English, and still others speak a variety of different languages. Is it cultural practice? This can’t be correct either, since American cultural practice is not similar in most significant ways to, say, Brazilian cultural practice. This is true superficially, in food and social customs, as well as on a deeper level, of, for example, time perception and attitudes about sexuality and relationships. Of course, there is serious doubt that they mean anything geographical, since many of those involved in “saving western culture” are opposed to widespread immigration into the United States by individuals from Latin America, which is definitely in the west (geographically understood).

Perhaps they mean a western cultural heritage, stemming from Europe, or more specifically, Western Europe. Yet this too is vague and unhelpful as a definition of “western culture.” Many of these same individuals despise things like French postmodernism; yet France is assuredly Western in the sense of Western European. Moreover, immigration comes back into this: how do individuals from Latin America not share in the western (European) cultural heritage; is Spain not Western European? Are things like Spanish imperialism, Spanish scholasticism, or Spanish surrealism not part of the “western” cultural heritage? Here we cut to a deeper problem with meaning Western European heritage when saying “western culture”: what is and isn’t a part of this cultural heritage? Empiricism, rationalism, realism, irrationalism, surrealism, analytic philosophy, imperialism, fascism, socialism, communism, capitalism, romanticism, and existentialism, to list but a few things, are all a part of the “Western (European)” tradition. The tradition is not unitary; indeed many of its elements contradict one another. What do they want to preserve? It cannot be all of the Western European tradition, because there is no single “Western European tradition.” Perhaps “western” has some religious meaning.

Yet this hardly makes any more sense. If by “western” one means “Christian,” as opposed to Islamic or Jewish, one is faced with irresolvable issues. How can Christianity, which shares a center of origin with the other Abrahamic religions, be “western,” while the other two are “non-western?” Furthermore, which Christianity is “western:” Catholicism, Protestantism, modern American evangelicalism, Eastern Orthodoxy, all of them? If only one type of Christianity is “western,” where was the break? For that matter, if Christianity is classed as “western,” while Judaism is “non-western,” where is the break there? Jesus was Jewish, by both birth and religion; as was St. Paul. Thus the “western/non-western” break can be in neither of them. Was it in the great schism, the Roman Catholics becoming “western” and the Orthodox becoming or remaining “non-western?” But if this is the case, how does the “western” become the “non-western,” and vice-versa? Pulling back from the narrowly Christian, how would Judaism or Islam become “non-western,” given that they inhabit similar, indeed in many cases the same, locations as Christianity, both geographically and philosophically? Indeed, it was faithful Muslims that became some of the most important neo-Platonic and Aristotelian philosophers of not only their own era but of any era. Jewish scientists in the west have contributed some of the most important advancements of the modern age; the paradigmatic scientific genius, Albert Einstein, was of Jewish origin. Are Al-Ghazali and Einstein “western,” or, because of their religious backgrounds, are they “non-western?” Perhaps the atheists have an easier time here; but no, for if all Christians are also considered “non-western,” then there is no European “western” tradition; even the great secularizers of European society come from the possibly “non-western” Christian milieu. Thus western certainly cannot be religiously defined; what is left?

Is it ethnicity? Do those interested in “preserving western culture” really mean Anglo-Saxon, Protestant culture? If so, they are being both ridiculously vague and downright foolish. As there is no unitary European or Christian culture, there is no unitary Anglo-Saxon, Protestant culture. Even if there were, and this is what they want to preserve, they should say that and not “western;” England and the English colonists that founded the United States are not the “west!” Indeed, many of those that believe in preserving “western culture” confirm that it is not Anglo-Saxon, Protestant “culture” they are interested in preserving, for they decry the state of modern France, which is certainly not Anglo-Saxon or Protestant. Thus it is not ethnicity, religion, heritage, or geography that is meant by “western;” what else is there?

There is race. Do they mean “white” when they say “western?” It certainly seems that way, since they exclude both people from Latin America and Muslims from being “western,” when there is a strong case to be made that both are, in fact, “western.” Many will chafe at this accusation of racism, but there really are only three possibilities: (a) those interested in “preserving western culture” haven’t given much thought to what “western culture” means; (b) they have given it serious thought and have decided that the logical problems with each definition (geographical, heritage, religion, or ethnicity) are either false or solvable; or (c) they are deliberately vague about what “western culture” is, because they don’t want to come out and say that they mean “being white.” By the way, this racial definition doesn’t really work either, as there is no single “white” culture. This is the real lesson here: no matter how one tries to define “western culture,” one is stuck with bad definitions that try to create unified cultures where there are none. There is no single Western culture (there also isn’t a single “Eastern culture”); there are many fluid, constantly changing cultures in a region called the “west,” but their region is not their definition. Let’s end with a little thought experiment:

There are four neighbors living in an average town in the United States; they are described as follows:

  1. A naturalized American citizen originally from China; she is an atheist, holds an M.D. from Johns Hopkins, and despises all forms of ‘medicine’ that are not backed by hard science.
  2. An American-born white woman; she practices “eastern medicine,” is a strict vegan, does yoga, and is a feminist blogger and video maker.
  3. An African-American couple consisting of a Baptist preacher and a jazz musician; they have two children, one at the police academy and the other in college to become a lawyer.
  4. A recent immigrant from Mexico, a devout Catholic who goes to church every Sunday and volunteers in the church’s youth ministry; he also has an American flag proudly flying above his house.

Please answer this question: which person is the most “western?”

On Loving the Enemy

In the wake of tragedy, in the wake of hateful actions, it is easy to turn to anger and thoughts of violent retaliation. It is easy to say that ‘love won’t win this battle,’ as many have said. It is easy to fall prey to that human, all too human desire to enact justice through strength. It is easy to think that now, uniquely, is the time to use force against hatred – easier still to draw simple comparisons between the current day and a past era when force seemed to work to bring about justice. Finally, it is easy to hate those that committed the crimes, the injustices, and the hate. But what is easy is not always the correct course of action. The heat of the moment quickens the emotions but misleads them; we must stop and feel; we must stop and think.

In the wake of recent events, there are calls for an aggressive reaction and the aggressive is always the hateful; one does not ‘aggressively’ fight cancer out of love for the disease but out of hate for it. It is striking to find comments “reminding” people that “love did not defeat Hitler;” thus, we are told, “love” won’t defeat this fresh threat. Yet, here is the trap of the easy; it is easy to think that loving the enemy is inaction, easy to think that love is passivity, easy to think that only hatred and aggression are active; easy but false. To misunderstand loving the enemy as passive acceptance is to misunderstand the purpose and method of love.

To love the person is not to love their misdeeds; in fact, loving makes the hatred of misdeeds all the stronger. To love the person is not to overlook their actions but to examine their actions, understand their motives, and empathize with their emotions; all the while despising their hateful actions. This may seem a bit paradoxical, for how can one empathize while simultaneously despising? In the same way that one loves the person while simultaneously hating their actions. To give a concrete example, one must love the murderer, understand their motives, and empathize with those factors (moral, psychological, environmental, and social) that contributed to their choice of action, but maintain a hatred of the action. To love the person is to hold the person in the holistic view humanity demands; this holism of love is one of the reasons that it is so hard and why it is much easier to hate the person and hate their misdeeds.

In holding everyone in the holistic view, in seeing them as whole people, as complex products of ever more complex situations, there is an uncomfortable necessity: a necessity to examine those complex factors that contributed to the hateful action; to consider the moral and social environments giving rise to such thoughts and deeds; to examine the psychological underpinnings that may have played a part; in short, to look at the variety of causes that resulted in the hateful action. Rather as the chemist may examine those chemicals that played a part in a violent reaction, we must examine those factors that lead to hate and hateful actions. This is deeply uncomfortable, for in examination of these factors one might find unsettling conditions that one has been complicit in maintaining, supporting, or writing off as just part of the system. This is not to blame any individual or group of individuals; indeed, perhaps most unsettlingly, everyone is to blame, because everyone is, in at least some small way, complicit in some hateful action or another. For hate and transgression are spiraling things; one action of hate leads to another, abyssus abyssum invocat [1].

For this reason alone, one is compelled to love one’s enemy; perhaps, in loving one’s enemy, one can quell the spiral of hatred; in dwelling in the light of love one might be able to drive out the darkness of hatred. Yet there is not only this reason to love one’s enemy. For loving one’s enemy is not only, not even primarily, about this fleshy experience termed human life. It is not a tactic to win battles; it is not a banner for the new revolution; loving one’s enemy is, in fact, about souls. Perhaps this is why the idea is so sneered at today, in an increasingly secular world that rejects the “silly” notion of the soul (a different discussion for a different time). Maybe there is something here; maybe talking of souls is too grand; call it the heart or the mind or whatever else one wishes; the principle remains the same. Hatred degrades and ultimately destroys the soul. This is why hatred spirals: the degraded soul seeks to degrade, and in degrading is yet more degraded. It might be said that the task of loving one’s enemy is as much about oneself as about one’s enemy. Indeed, hateful actions are designed to generate hate; thus, in responding with love, the hateful act is sapped of some of its power. However, there is a mistaken interpretation of this that must not be made.

Loving one’s enemy is not excusing one’s enemy. Loving one’s enemy is not always pacifistic appeasement. At times, a violent response to violence is justified, perhaps necessary (though this is a thorny claim); actions have logical consequences. However, rather like the good parent, who in punishing their rebellious children does not cease loving them, one must not, in violently responding to violent action, stop loving one’s enemy. We must love but we must condemn; we must understand but we must never excuse. We must not stop loving and understanding our fellow humans, yet we must, in no uncertain terms, denounce injustice and hatred. To fail at either is to do precisely what we decry: the former failure is to hate the criminal; the latter is to hate the victim. We must do neither. Ἒνθεν μὲν Σκύλλη ἑτέρωθι δὲ δῖα Χάρυβδις [2]. Here, again, is a reason that the task of love is so difficult. Compounding this is that love must constrain our actions; we must, in accordance with love, only ever use defensive violence, and never aggressive violence; for, again, aggression necessitates hatred of those aggressed against. To beat this path is hard, at times painfully hard, for it is natural to want to enact harsh punishments against the unjust, but it is necessary to beat this path; we can do no other.

In the wake of injustice there are easy choices and there are good choices. The choice to hate those that enact injustice, ultimately, only leads to more hatred. Degradation leads to degradation. To love the person is not to love their actions, but to hate their actions. To comprehend the origins of the hatred is not to excuse the hatred; to love our enemy is not to spite their victims; indeed, loving our enemy is the same as loving their victims. To love is to oppose hatred and, in opposing, to turn the tides. To love is to understand the whole human and, in understanding, never to excuse transgression but evermore to despise it. To love is never passive, but always active. To love is never to refuse to punish but to limit our harshness, to avoid aggression. In all this, we mustn’t fall prey to the false self-righteousness of thinking that in loving we are better than those who hate. We are all human, yoked together whether we like it or not [3]. To be self-righteous is to fail to see that we are all damaged; this is another reason that hatred comes so easily in response to injustice. In hating we are allowed to feel that those enactors of injustice are somehow separate from us; but, disturbingly, evil actions remind us that within humanity there is a capability to do both good and evil; within this thing called human life there are options to hate or to love. To hate and divide is easy; to love and unite is hard. Indeed, as Plato wrote: “χαλεπὰ τὰ καλά.” [4]

Endnotes:

[1]: One misstep leads to another; or literally: hell calls to hell

[2]: On the one side, Scylla, on the other divine Charybdis (Homer); ‘between a rock and a hard place’

[3]: For those hardcore individualists tempted to deny that all people are inexorably linked to each other, may I say that such a position ignores human reality. It is one thing to advocate methodological individualism for analysis (indeed this is a useful method) and/or to advocate for individual agency and autonomy over and against collective authority; but it is an entirely different thing to deny that everyone is bound up together as fellow human beings, that the actions and words of one person affect another, and that this effect has a chain reaction. If one is tempted to quote that famous line of Genesis 4:9, “Am I my brother’s keeper?”, one may well wish to recall that this is said by a man that has freshly murdered his brother; thus, clearly, God’s answer (if one is so inclined to belief) is that yes, you are your brother’s keeper.

[4] “The fine (or good) things are difficult.” [Republic; Hippias Major]

The Problem of Opinions

Stating one’s opinion on any subject, from the most mundane to the most profound issues, is a risky business; whether in speaking or writing, any method of putting forth one’s thoughts into the world involves taking deep and grave risks. Beyond the obvious danger of finding oneself in deep disagreement with one’s fellows, be they colleagues, friends, relatives, lovers, or mere fellow interlocutors, there is a graver risk. For while disagreement can sometimes lead to unpleasantness, if handled correctly it can also lead to mutual learning, understanding, and interesting discussion; whereas the graver risk of stating opinions has no real potential for benefit, at least at first brush.

The danger of stating opinions is that once they are stated one has two options: either to become rooted in this position, or one day to admit one was incorrect and state one’s new opinion. The first option leads one into a deep and unsettling intellectual position of either refusing to accept new information and arguments that go against one’s previously stated opinion (stagnation of opinion, obstinateness) or performing twists in thinking to make new arguments fit old positions (mental gymnastics). In short, this is intellectual dishonesty and a refusal of growth. Let me be clear: it is perfectly acceptable and intellectually honest to have strong opinions that one defends in the face of all new arguments; however, this is only commendable to the point that, while principles stand firm, there is still change. Having firm principles is honest and commendable; being an obstinate dogmatist that refuses to engage with other, different arguments is neither commendable nor decent intellectual behavior. Again, to be clear, I want to point out that there will be times that even the best will fail and slip into heatedness, unnecessary fervor, and/or inflexible dogmatism; however, I cannot stress enough that it is out of these failures that we must arise, do better, and be better; though we will fail time and time again, each failure must serve as a reminder to do and be better.

The other path in this option is no less odious and no less common. It is often called “mental gymnastics,” a term which, though tending to be used negatively, gives a fairly accurate idea of what goes on. A new argument presents itself, one that would seem to require a change in opinion, but instead one just works around it, in a dishonest way. Often this takes the form of accepting premises but reforming conclusions by sneaking in new premises. This is dishonest; the honest answer to new arguments is either to find a reasonable challenge and critique of them, or to let the new arguments shape one’s opinion. I want to be clear: one should not change their opinion based off the last argument they have heard, as this is as dishonest as dogmatism; however, one must open their beliefs to criticism and robust counterarguments, not necessarily accepting or rejecting criticisms, but countering with reformed, better-honed, more robust arguments. In creating more robust arguments one’s opinions necessarily change, if only slightly, for it is impossible to robustly respond to counterarguments from a place of mere dogmatism and poorly thought-out principles. This is why every ‘school of thought,’ in any field, is a place to start, never a place to end.

In all this the second option for action after stating an opinion has shown through, namely to admit one was incorrect and state one’s new opinion. This is difficult and rare, for it is much easier, much more comfortable, to remain stagnant, to stop at the point of first thinking and never push forward. At least in that case one runs no risk of people finding old statements of opinion and taking them as current statements of opinion. This is a real danger, especially of stating opinions in the public forum; however, this should not prevent one from either stating one’s opinions publicly or changing one’s opinions publicly. For as Cicero wrote: “if we are not ashamed to think it, we should not be ashamed to say it.” Furthermore, any honest person will admit that continued thinking about any subject will often lead to some changes in opinion, slight to major; there is one simple illustration of this: since all thinking on subjects is essentially a conversation (cf. Richard Rorty; this is why, for example, the Platonic dialogues are dialogues), it is understandable that as one hears more voices in the conversation, one’s opinions will change; it is also understandable that one has not, at any one time, heard all the voices that have spoken, or are speaking, on a subject. For example, if opinions were formed purely from reading, it would be nearly impossible never to be encountering new voices with new arguments, given that millions of books have been published since the advent of printing. Thus, though it is dangerous to share one’s opinions at any one time, it is worthwhile; it is also worthwhile, in fact, perhaps noble in some cases, to publicly change one’s opinions based on new arguments, so long as one is not always changing one’s opinions (that is empty-mindedness, not thinking). It is difficult to place oneself in this uncomfortable position, but as Spinoza says at the end of the Ethics: “Everything excellent is as difficult as it is rare.”

What I Write When I Should Be Writing

Writing and reading are so seemingly essential to modern life, especially modern academic life. This might seem odd to people that believed the advance of technology would destroy the written word. However, the internet has not devalued the written word but made it more powerful and omnipresent. Now, the majority of the American populace is literally surrounded at all times by more written information than ever before in human history. Furthermore, though places like Amazon may have “killed” (a ghastly metaphor) the brick-and-mortar bookshop (though not in all cases), they have not “killed” the print book. Indeed, this seems odd given the advance of devices and apps such as the Kindle, Google Books, and iBooks; for some, ebooks truly rule the day, but for many people physical books are still very much desired and used. Positing why this is the case can only be pure conjecture, utterly tainted by personal preference. However, I believe that there is something in the physicality of tangible books that makes them appealing to many people; there is also the aspect of visibility. It is impossible in most cases to tell from a passing glance what someone is doing on their smartphone, tablet, or laptop. Maybe they are reading an ebook, or maybe they are scanning Facebook; it’s a mystery. Whereas with a physical book there is no mystery; it is easy to tell at a glance when someone is reading a physical book. Moreover, I would hazard a guess that a third aspect contributing to the continued presence of print books is a desire to differentiate activities.

There is a serious danger when one is reading on a screen to flit to something else, a game, a social media app, etc. This can seriously damage one’s reading; indeed, only the dedicated reader will not yield to temptation when a boring section of a book occurs. Clearly, there is a danger of flitting between tasks with a physical book as well. It is all too easy to put a book down during that dull section. However, I would posit that there is a fundamental difference in these two types of task switching. Flitting between apps on a smartphone is as simple as putting down a book, but far less physical and, thus, (I would guess) far less memorable. Whereas after putting down a physical book it is still there, in what can sometimes be oppressive physical omnipresence, reminding one of the task they’ve abandoned; switching apps on a smartphone is easily forgotten; it is easy from minute to minute to go from iBooks to Facebook to Twitter and on and on, without returning to iBooks. Closing out the app removes the presence of the book, as does shutting down one’s Kindle or other e-reader. The book is in some sense gone, vanished; not, as with physical books, oppressively omnipresent. Giving up reading Gravity’s Rainbow on a reading app is much simpler than abandoning it physically. The file takes up no physical space; it does not stare one in the face every time one passes the bookshelf; in short, electronic books are more easily forgotten than physical books. Furthermore, I believe that many people probably want to differentiate their tasks between “screen-time” and “non-screen-time,” especially with the growing body of evidence that screens are changing our brains [1]. I think it is safe to say that print books are going nowhere anytime soon. However, this does not get us any closer to why reading and writing are so fundamental.

It seems odd that something so artificial has shaped the modern world in more ways than it is really possible to fully comprehend. It is hard for any literate person to imagine a world without writing; I do not mean that it is hard to imagine what it is like to be illiterate. This is something I think many can easily imagine, though I doubt many can understand the deeply unsettling emotionality of adult illiteracy. However, I think that to imagine being illiterate, or actually to be illiterate, presumes literacy. The lack presupposes the presence. To imagine being blind, or to be blind in reality, necessitates that sight exists. Similarly, illiteracy necessitates that reading and writing exist. This is why imagining a world totally without reading and writing is so difficult. However, stepping back from our phenomenal present existence and considering writing from a distance, we can see that it is strange and artificial.

Speaking and listening are deeply natural for humans. We are linguistic creatures. This is evidenced by the fact that all humans in all places speak some language or another from infancy onward. Indeed, even deaf individuals develop and use language that, though not spoken, is deeply and structurally linguistic, every bit the equal of spoken language. Language is what people do. Many in a modern literate society, such as the United States, would unreflectively assume that writing is just as natural, evidenced, no doubt, by its omnipresence in society. However, upon reflection it becomes clear that writing isn’t natural. It is artificial. Consider indigenous societies that even today do not write. They are non-literate societies – NB they are not illiterate societies; they are non-literate, meaning that they live without writing, not without the knowledge of writing. Of course, many of these indigenous populations are now illiterate societies, having been brought into contact with the written word. However, it should be clear that writing is not something natural in the same way that speaking (or signing) is natural.

Writing was invented a few times in a few places; the big ones are Mesopotamia, Egypt, China, and Mesoamerica, but there are other places where writing was invented, for example Crete and the Indus valley. For a long time a theory called monogenesis ruled the day. The theory held that writing was invented once, in Mesopotamia, and spread from there to Egypt; however, more recent finds have shown that the Egyptians invented writing on their own. Obviously, China and Mesoamerica (the Maya and Aztecs) were not in contact with the Mesopotamians, and thus could not have stolen writing from them; though this was, for obvious reasons, never seriously postulated. (Now, if certain people are correct, monogenesis may yet be saved, as obviously the aliens invented writing and gave it to everyone; but until there’s actual concrete evidence for these “aliens” it’s safe to say that the Egyptians and the Mesoamericans invented writing independently.) From these inventions of writing, the idea spread and morphed along the way. Soon Phoenician traders had developed what would become the alphabet, though they didn’t have vowels in their version. These traders spread their invention around the Mediterranean. The Greeks took hold of it and changed some letters from “barbarian” sounds to vowels, thus creating the alphabet the majority of the world uses today (the “Latin” alphabet is an Italian variant of the Greek and the Cyrillic is a Slavic variant) [2]. All of this should show that writing is far from natural, in the sense of innate. It may have become naturalized because of its singular ubiquity, but it is, nonetheless, created, artificial. Writing is an art and a gift that those of us in literate societies too often take for granted.

We’ve lost touch with the art of writing in two senses. The first is the rather forgivable loss of appreciation for the beauty of the written form, as handwriting and calligraphy slip more and more to the periphery; however, this loss is not so great, given that writing began as something merely functional in many ways; after all, most early examples of writing are basically accounting records. The second loss is much graver: we have, I fear, lost touch with the art of writing in a broad sense. By this I mean that we have lost the sense that there is an art to writing; that writing is something special, that it is something to be grateful for, and something to appreciate and cherish. I fear that writing has become something so merely functional, so basic, and so base that it has lost all meaning to most literate people. It may be unclear what this loss means, but I fear that its implications are deep and wide. If we forget that writing is something special we run the risk of relegating it to merely another technological tool, something to be used without much thought; something, moreover, to be abandoned if something better comes along. We forget that though writing is an artificial gift, it is a gift nonetheless, and has deeply changed our world. Forgetting the art of writing is forgetting the power of writing.

It cannot be denied that writing is the most powerful thing people have ever invented. This claim is bold but true. Certainly, inventions like the wheel, the utilization of fire, and guns have shaped the world and qualify as important inventions. However, the knowledge of these things can only be transferred in two ways: speech or writing. Indeed, the oral tradition is the older option, useful in many cases, but severely limited in scope. In the oral tradition things are passed down generation to generation in a direct line; this means that if any one part of the chain is broken the knowledge is lost. Since the oral tradition is also limited to the size of a human community, a break in the chain is more likely than with writing. In writing, knowledge can skip a generation, or more, as long as the text isn’t lost. For example, it is possible for anyone to become a Scholastic scholar even if no one else in the family ever read Aquinas. Furthermore, unlike the oral tradition, writing is not limited to the size of any one human community; anyone can learn the language of a text and read it, regardless of their membership in a particular community. An additional benefit of writing is that written information is less prone to change in meaning than oral information. One need only think of the children’s game where something is whispered along a chain of people and the message is changed, often extremely, by the end of the chain. Writing allows not only for the widespread and, generally, accurate transmission of technical knowledge for building wheels and weapons; it also, more importantly, allows for the spread of the most powerful thing in human history: ideas.

I believe that it is ideas that rule society and history. Indeed, to quote Ludwig von Mises [3]: “The history of mankind is the history of ideas. For it is ideas, theories, and doctrines that guide human action, determine the ultimate ends men aim at and the choice of the means employed for the attainment of these ends.” There is no better way to disseminate ideas than writing, itself an idea of sorts. Writing allows for the spread of ideas regardless of context, place, time, culture, or any number of factors that limit other means for spreading ideas. With the advent of the printing press the spread of ideas in writing grew faster, freer, and wider. With the ideals of universal education and literacy that blossomed in the mid-twentieth century, though never fully achieved, the advance of writing became all but universal. It is a tragedy that in all this the medium of advance has been largely forgotten and ignored; writing hardly receives a second’s thought as it is used to spread ideas around the globe. It is hardly considered that it is writing that heralded scientific revolutions, writing that convinced men to send armies marching across the globe to advance written ideas, writing that championed the ideals of world peace and an end to war, writing that simultaneously sent people to their deaths and promised that there would be death no more. Writing is a neutral tool, as all tools are; the hammer can be used to destroy as well as to build, and no less can writing. Nevertheless, it is the unique province of writing to be that omnipresent tool that is used for every imaginable end. War and peace are penned in the same medium, racism and antiracism proclaimed with the same tool, theology and atheism championed in the same form. Writing and the advance of ideas, thus the advance of history, are inexorably linked. Writing as the basic representation of spoken language in symbols is, unfortunately, often overlooked and forgotten, its unique place in history and society ignored; however, in a different sense writing is hardly ever overlooked or forgotten.

Writing as prose or poetry; writing as essay; writing as literature; in short, writing as the thing taught in English classes, is rarely overlooked. Compositional writing is likely the first thing that jumps to people’s minds when they think about writing. Organized writing rules the day, in books, newspapers, magazines, and even television and movies (which use written scripts). There is no doubt that organized writing is important and world-changing; however, it is thoughts of composition, of order, that remove one from the wonder of the medium itself. As important as good composition may be, it is impossible without the presence of the medium of writing itself. A well written piece is something to be admired and praised, but without the letters it is impossible. It is important that in composition we forget about the letters in favor of the words, or even forget the words in favor of the structure of a piece; however, though this leads to good composition it also leads to a loss of wonder and appreciation for writing qua writing. It is important that we take time to learn what makes a good composition a good composition, but it is equally important that we take time to reflect on the medium of composition itself: the art of this artificial thing that was invented a few times and in a few places; the beauty and power of the invention that changed the world; the might of this tool that we call writing. Writing is the most human of inventions and tells the most human of stories.

 

[1] See here: https://www.psychologytoday.com/blog/cravings/201609/how-internet-use-is-shaping-our-brains

[2] For a more in depth history of writing, watch Thoth’s Pill by Nativlang here:  https://youtu.be/PdO3IP0Pro8.

[3] von Mises, L. (1977). Planned Chaos, p.62. Foundation for Economic Education: Irvington-on-Hudson, New York.

Should The Word ‘Very’ Really be Avoided?

There is a great deal of writing advice on the internet warning people against the use of the word ‘very.’ The reasons given for why everyone should avoid using ‘very’ in their writing range from the claim that ‘very’ has become so weakened that it no longer serves any intensifying purpose to the claim that using ‘very’ is simply lazy writing. It ought to be noted that in none of this writing advice do people give a legitimate stylistic or grammatical reason to use their suggested alternatives in place of an adjective modified with ‘very.’ Claiming that ‘very’ is a weak or lazy word is not really a stylistic justification for avoiding it; in fact, there may well be stylistic reasons not to avoid using ‘very,’ as the aggressive use of large words can make one’s writing seem awkward, or as though the writer has just discovered how to use a thesaurus. To be clear, large and complex words have a clear and important place in writing; however, they should never be used simply to avoid the word ‘very.’

Indeed, if one of the supposed reasons to avoid ‘very’ is that it has become so weakened as to lose all meaning, one ought to avoid the intentional overuse of words to replace ‘very.’ In fact, the intentional use of replacement words to avoid ‘very’ does more to damage good written style and language use than the “overuse” of ‘very.’ To quote C. S. Lewis: “Don’t use words too big for the subject. Don’t say ‘infinitely’ when you mean ‘very’; otherwise you’ll have no word left when you want to talk about something really infinite.” Furthermore, many of the alternatives suggested to replace the use of ‘very’ actually lead to a difference in meaning between the original adjective modified with ‘very’ and the alternative. Let’s take some examples (from the infographic found here):

“very afraid: fearful;” the problem with this one is that fearful and afraid mean exactly the same thing; fearful does not imply a greater intensity of fear than afraid, therefore if one wishes to express that someone has intense fear they could not use fearful in place of very afraid without failing to convey their actual meaning. [1]

“very boring: dull;” dull does not mean extremely tedious or uninteresting, i.e. very boring. In fact, dull has more senses than boring, and to replace “very boring” with “dull” could VERY easily (notice that I didn’t use “effortlessly”) completely alter the meaning of a sentence. Example: “that professor is very boring” [meaning: the professor is extremely tedious] changed to “that professor is dull” [possible meanings: (a) the professor lacks excitement; (b) the professor is stupid].

Oh, I love this next one:

“very dull: tedious;” that’s right, folks: when you want to intensify an adjective, don’t! Instead, use one of the possible definitions of the un-intensified version of the original. Dull means tedious! Therefore, it is impossible for tedious to mean very dull, since if it did, dull would also already mean very dull!

“very colorful: vibrant;” this doesn’t work, since vibrant refers to a color’s brightness, whereas colorful refers to the number of colors or the brightness of something; so to replace very colorful with vibrant is to lose not one but two meanings, as one loses both the reference to the number of colors and the reference to intensity, since vibrant does not mean “intensely colorful.”

“very perfect: flawless;” the problem with this is that instead of replacing the phrase “very perfect” with a synonym of perfect, one should just cut the word “very” from the phrase, as it is simply redundant.

There are some on the list that work well, like “very stupid: idiotic,” as idiotic does mean very stupid. However, the biggest flaw of this list is that many of the replacement words are synonyms of the original adjective without adding any intensity. Indeed, to get the same sense out of “tedious” as out of “very dull,” one would have to say “very tedious.” I fully agree that having a larger vocabulary is a positive thing for which everyone should strive; however, the way to get there is not to dispense with the word very and replace it with “better” alternatives, since that is not the sign of a larger vocabulary but the sign of a thesaurus user. Meaning is nuanced and complex, and different words mean different things to different people; part of having a large vocabulary is wielding it well, not shoehorning words into places they don’t really fit. Perhaps for some, “tedious” does in fact mean “very dull,” but that still doesn’t change the fact that in everyday speech and writing there is a place for “very dull.” If it is the most efficient way to get one’s meaning across, and one doesn’t have some other commitment in writing (class style guides, for example), use the words and phrases most fit for the writing.

[1] I am using the definitions from Oxford Dictionaries online.

One further point: despite widespread belief, using larger words in writing doesn’t actually make one sound more intelligent and may in fact have the opposite effect if overused. Thus, one should use the words one thinks best fit the situation. See the study by Oppenheimer, D. (2006). Consequences of erudite vernacular utilization irrespective of necessity: problems with using long words needlessly, Applied Cognitive Psychology 20, 139-156. DOI: 10.1002/acp.1178.

Random Thoughts on Politics

This is another collection of short pieces on my random thoughts; this batch happens to be political in nature. I understand if you dislike politics and don’t want to read this; I swear that I will be returning to languages soon and then some philosophy stuff. Anyways, here are some random thoughts on politics.

Immigration and Refugees

This is a big topic in light of recent events, and yet I can only bring myself to barely care. That probably makes me a terrible person; oh well. The reason I barely care is that in my ideal world neither of these things would be an issue, but this isn’t my ideal world, so they are. I’m for free and open immigration and for letting refugees enter the United States; however, I’m against all forms of government involvement in either of these matters. There are ways of privately helping refugees, and those are what we should be engaged in. No immigrant should receive government money, not because they’re immigrants but because governments, and by extension government money, shouldn’t exist.

I know I’ll be criticized, if only silently, by many. One criticism is “what about national borders, national sovereignty, or national culture?” Granted, the last one usually doesn’t have the word national tacked on the front, but it is certainly implied. Well, I don’t care about national anything because I don’t believe nations should exist; not in some weird one-world-government sense of the phrase, but in a localist sense: everything should be run locally. Simply put, I don’t think governments, certainly not national governments, should exist, and therefore I don’t believe in the nation-state as a justification for anything. Oh, and on the culture thing: people seem worried about “losing their culture.” I’m really not sure what that means. Culture is not a stagnant thing; it is a constantly changing process, a negotiation made each and every day by each person. Another criticism is that “we don’t want violent people coming into this country.” On some level I agree; I mean, no one should be violent, period. That actually leads me to disagree with this sentiment on a much deeper level. First, there are already violent people here; I doubt it will be that much worse if some more people come into the country. Second, as I stated above, I don’t believe in the whole nation-state thing, so there is no “our country;” there is a piece of geographical territory ruled by an entity founded on force, which for some reason everyone insists on asserting is “one and unified.”

Maybe you can see why I am basically apathetic on this issue. People seem to want everyone to adopt either a pro-immigrant/refugee policy stance or an anti-immigrant/refugee policy stance, but either way they want you to have a (governmental) policy stance. That makes it tough for me, because both sides are wrong: both of them believe that government must be involved, whereas I don’t.

Hate Speech and Free Speech

I have declared before that I am a free speech absolutist, and I am. In my opinion all speech should be free, including what is commonly labelled “hate speech.” This is not a generally accepted or even tolerated position, which to my mind shows the lack of nuance in people’s thinking. Let me explain. To begin with, let me state plainly: I despise bigotry and despise bigoted and hateful speech. However, that does not mean I think it should be banned or in any way silenced. This is where people seem to lack nuance; many seem to believe that if something is terrible, evil, or vile, it ought to be banned or silenced. However, this creates more problems than it solves.

When speech is open and free there is more responsibility. Silencing speech removes responsibility for that speech: the speaker of hateful words goes into hiding, makes all their comments anonymously, and never takes responsibility for what they say. Furthermore, silencing removes the possibility of openly combating the ideologies that lead to hateful speech. Moreover, silencing does not kill hatred; it grows it.

Let me be clear: I understand the psycho-emotional damage of hateful and derogatory speech. Hateful speech is despicable; however, that doesn’t mean it should be silenced. No, it should be made openly and combated openly and decisively. Moreover, it must be combated with respect, if for no other reason than to be unlike the bigot. As Marcus Aurelius wrote: “The best revenge is to be unlike him who performed the injury.” Hate speech is disgusting, but it is free speech and thus must be allowed. By the same token, it must also be decried and freely combated at every juncture, without silencing or disrespect. If allowed to be openly expressed, it is unlikely to last long in the marketplace of ideas. As Louis Brandeis said: “Sunlight is said to be the best of disinfectants.”

Note: Hateful action is not speech. Violence must never be allowed. It must always be denounced.

Respect and Listening to Each Other

It seems as though no one can listen to anyone else anymore, if they ever could. In all matters of disagreement people seem only to shout at each other and never engage in meaningful and free discourse. Moreover, not only can people not listen to other viewpoints, they feel compelled to constantly insult and belittle everyone who disagrees with them. There never seems to be respectful discussion; instead, there is merely the trading of insults. All sides engage in this shameful practice, and it works because it captures people’s emotions. However, at the end of the day respectful discussion and debate have better results than emotional appeals and insults. Not that this is anything new. Insulting opponents seems to have always been a tactic, but it has never been a tactic that should be accepted. Hopefully, we can all try to be better at listening and being respectful, even of those we most disagree with (i.e. don’t call people names!).

Taxation and Federal Programs

I recently saw a post on a social media platform that asserted the minimal cost of certain “threatened” federal programs, including schools, museums, and arts funding. The post asserted that the cost to fund these various programs is only around 22 dollars per year for each taxpayer. It also asserted that the posters are happy to give up this money to keep these programs in operation. Interestingly, the post uses the correct terminology for taxation, saying “please take my $xx.xx;” take is the correct verb, since taxation is theft. However, that is not the part of the post that made me want to write about it. I want to write about this post because it shows an odd twist in logic. Imagine if someone said that these programs should be privatized, run not by government but by private means. There would be massive uproar, likely from the same people saying that they are happy to have their money taken to fund these federal programs. They would claim that if these things were privatized, no one would give them money. However, they have asserted that they are happy to have their money taken to fund them, so by simple logic they should be happy to fund them privately. If you are happy to have your money taken by the government to fund something, then you should be able to see that it would also be funded privately! If you are happy to have your money taken to fund a federal arts program, then you should be equally happy to fund a private arts program. Unless, of course, you just say that you’re happy to fund all these things because you think it makes you sound decent, civilized, or cultured, when in actual fact you don’t give a damn about whatever it is you believe the government should be funding.

The Federal Department of Education

Upon the confirmation of the Trumpian Secretary of Education there has been an outpouring of discontent. Justifiable or not, people dislike the new Secretary of Education for various reasons: her policy proposals, her lack of experience, or just the fact that she’s a Trumpian. Nonetheless, I don’t take issue with people who disapprove of her, nor, for that matter, with people who approve of her. Either way, they’re wrong. I don’t care who’s in charge of the department, because the department shouldn’t exist. I’ve now uttered, actually written, the fatal words. How dare I claim that the Federal Department of Education should not exist! Think of the children! Apparently, the Federal Department of Education is the only thing keeping children in the Bible Belt from being openly taught creationism in science class, ya know, because parents and teachers are too stupid to make decisions. That’s the point: I’m at most a localist, I believe things should be run locally; in fact, I would say we should run things at an even smaller level, but that’s a different comment for a different time. Schools should be run locally, and by and large they already are. The Department of Education has rules and regulations, sure, but if you really have so little faith in the states (especially southern states) to educate without them, do you really believe they aren’t already ignoring as many rules as they can get away with? There are going to be bad schools with or without the Department of Education, and the benefit of not having it is that you wouldn’t have to worry about a Trumpian being in control of it.

Everything’s a Metaphor

The word metaphor generally refers to the figurative use of one word or phrase to describe another, unrelated thing. It is non-literal. For example, “time is money” is a metaphor, because time is not literally money. However, as the title of this post suggests, I am using the word ‘metaphor’ with an expanded (you might say metaphorical) meaning. Let me explain why everything’s a metaphor.

When someone says, “I like this coffee,” what do they mean? This is fairly cut and dried, no metaphor in sight; they mean just what they have said, i.e. they enjoy the coffee they are drinking, just finished, or saw the bag of, etc., with the precise meaning depending on context. That’s all very well and good, but what do they really mean? That may strike the reader as a strange question, but here’s my point. The metaphoric is omnipresent in everyday use of language. When one says, “I like this coffee,” what one means to convey is the nebulous and impossible-to-precisely-define conception of enjoyment of “this” coffee. Ask them why they like this coffee and you may be treated to a soliloquy on its aroma, taste, mouthfeel, or some memory-laden emotional connection. Yet ask them why they like that taste, that aroma, etc., and most will be at a loss to explain. A great deal of human experience is hidden from conceptualization by both outsiders and ourselves.

Thus, everyday language, and even more precise academic language, cannot capture everything one means. At best, language can hint at the outside world and the internal mental world. I think this hinting is best described as “metaphor.” Furthermore, much of people’s everyday speech is not as direct and simple as “I like this coffee.” A great deal of the time, interpersonal communication, especially among friends and family, involves shared secrets, inside jokes, and communicative shortcuts. Here there is yet another level of abstraction, and thus another level of metaphor.

One criticism of this view may be that it is a little thin on explanatory power. For example, of what is the phrase “I love you” a metaphor? Well, it is a metaphor for the speaker’s experienced state of feeling love for the loved person. Great, says the detractor of my view, but what exactly does this actually explain? Here is the problem: it doesn’t really explain anything, because it cannot. Language is metaphorical; therefore, trying to explain one metaphor leads to another metaphor, and on and on. This is why even the very statement that everything’s a metaphor is a metaphor. Where does this leave us?

This position changes nothing about one’s everyday life. Language is still the same; ideas still remain as they were. Might this position affect one’s worldview? Perhaps, but it needn’t. Just because language is ultimately a collection of imperfect metaphors about the world doesn’t mean that knowledge is unattainable, or that this or that thing doesn’t exist, or that language isn’t one of the best tools (if not the best) we have in life. As Haruki Murakami wrote: “A certain type of perfection can only be realized through a limitless accumulation of the imperfect.”