Philosophizing Now: Graham Harman Interviewed
by Open Ideas
Intro: We welcome Graham Harman of the American University in Cairo, the best-known proponent of ‘Object-Oriented Philosophy’ – a system of thought which takes ‘things’ to be central to existence, and which classes humans as just one of those things. Harman cut his teeth reading Heidegger in his teens, and it was his new approach to this often misunderstood philosopher that gained him recognition with his books Tool-Being and Guerrilla Metaphysics, as he carved out a philosophy in which the interaction of objects with each other is seen to be as important as the interaction between humans and objects. His latest book – ‘Prince of Networks: Bruno Latour and Metaphysics’ – focuses on his mentor and friend, the philosopher Bruno Latour, before going on to explore the possible future impact of an object-focused form of thinking. Harman blogs prolifically at ‘Object-Oriented Philosophy’, where his ability to mix mundane and extraordinary events on and off campus in Cairo with philosophical musings and insights has gained him many followers and, inevitably, a few detractors! This year will see him release ‘Circus Philosophicus’ with ZerO Books, a collection of essays which promises to push the stylistic boundaries of philosophical writing.

Mike Watson thanks Graham Harman for taking time out of his busy schedule to answer six questions.

Q1: Philosophers are, by nature, ‘pedants’. One envisages them as a stuffy and fixated group of individuals obsessed with petty abstract details and withdrawn from the real world. How does this square with your image of philosophy?

GH: I’m becoming more of a historicist on this question. Today in philosophy we don’t have so many pedants: what we have instead are trolls. It recently occurred to me that each era of philosophy has generated its own type of anti-philosopher. In ancient times this was the sophist, and so much of Plato’s thought is devoted to assessing and combating the problem of sophistry.
And I’m with Plato on this point: the sophists are bad. For some decades there has been a fashionable revisionist view that there is much to be said for the sophists. But in this case I agree with Socrates and Plato: the sophists are anti-philosophers, and must be defeated.

As philosophy became more institutionalized in the Church and more dogmatic in character, with Aristotle the ultimate underpinning in that period, the figure of the pedant emerged. The pedant is famous for vast but empty learning and a love of hair-splitting verbal disputes. Just as Plato combated the sophists, it was Giordano Bruno (another of the small number of literary geniuses in the history of philosophy) who waged war against the pedant. His dialogues are filled with mocking portrayals of pompous, Latin-spouting scholars who never get close to a genuine philosophical issue.

These pedants never harmed Bruno, but he was ultimately destroyed by another great figure of anti-philosophy: the Inquisitor. With this figure, dogma is no longer just the pursuit of worthless erudition, but rather the physical destruction of those who deny the content of truth upheld by the Inquisitor. Bruno was tortured in dungeons for nearly a decade, and finally burned at the stake in Rome. This is not likely to happen to any 21st-century philosopher in the West, though it could conceivably happen in a few contemporary societies.

In our time the word “sophist” is a frequently heard insult, but there really aren’t as many sophists around as one believes. Pedants today are mostly harmless and comical, not serious power brokers in the Academy. And as mentioned, Inquisitors don’t really exist in the West anymore, though many people still like to fantasize that they are being persecuted by mighty Inquisitors. By and large, we in the West are now fairly free to say and do as we please in comparison with the past.

And that brings us to the truly dominant anti-philosopher of our time: the troll.
As far as I know, the word gained its current meaning in internet culture. The internet gives the troll the great advantage of anonymity, but it is not essential that one be anonymous; it merely helps the cause. The modus operandi of the troll is what Mark Fisher aptly called “the sneer from nowhere.” The troll never puts forth a positive thesis or has anything at stake, but merely critiques and cuts down. The troll usually doesn’t believe the things he says (I say “he” rather than “he/she,” because the troll is an almost entirely male personality type, and by the laws of primate society it is usually a young and unaccomplished male trying to gain group status by knocking down more visible figures). The troll doesn’t doubt things because they are truly dubious, but because one might conceivably doubt them. The troll seeks only to take away and lessen, not to produce. It is “devil’s advocate” turned into a full-time lifestyle, with more than a bit of malice thrown in.

My theory is that the troll is simply the predictable excrescence or repellent underside of an era of philosophy that values critique far too highly. Even university administrators praise philosophy mostly because it teaches “critical thinking” skills. In short, it is believed that philosophy teaches us to be less gullible, to believe in quantitatively fewer things, to stand at a transcendent distance from any particular personal commitment. The mission of philosophy is to debunk and tear down and to say: “no, I don’t believe it.” Against this attitude, I agree with Latour’s maxim that the point of thinking is to make things more real, not less.

One of the two or three greatest philosophers of the past century was Alfred North Whitehead, and in the opening pages of Process and Reality he takes a marvelous approach to this problem. According to Whitehead, logical blunders are a surprisingly rare problem in philosophy, and even when detected they are not usually too difficult to fix.
Moreover, philosophy is not like geometry. In geometry, one shaky inference causes the whole theorem to crumble. That’s not the case in philosophy. There is a certain component of deductive reasoning in philosophy, but it is not at all paramount. If Heidegger or Descartes adopts the wrong starting point, it doesn’t really matter, because much of what follows may be of value anyway. There is something autonomous about each stage or layer within a philosophy, in a manner that is not true of geometry or the other mathematical disciplines.

I’m afraid this is also the problem with analytic philosophy as a discipline. Analytic philosophy is devoted to the view that a philosophy is a “bundle of arguments,” just as British Empiricism held that an apple was just a bundle of qualities. This attitude does give analytic philosophers a refreshing freedom from the past: after all, it is not hard to find “bad arguments” in Plato, and my 16-year-old freshmen do it semester after semester with no assistance from me. But Whitehead was right to observe that the problems with philosophies do not come from bad arguments, but from over-abstraction, incoherence, and lack of imagination. You do not become an important philosopher by generating the most “knockdown arguments” to annihilate other people in aggressive oral combat. You do it by showing that the center of gravity of the world is somewhere different from where people suspected it was. You do it by outflanking the trench wars that waste the hours of angry intellectual disputants, and coming up with new alternatives that no one ever expected. But instead of Whitehead’s vision, we still have too much locker-room machismo in philosophy today, and increasingly even on the continental side of the fence. I find it repugnant, and deeply unphilosophical.

The troll poses as a radical critic of all, and as a skeptical and independent spirit. But he is in fact the perfect instrument of the status quo.
If every new idea that springs up is subjected to instant critique and devil’s advocate assaults, what survives will be a “lowest common denominator” philosophy based on the reigning platitudes of the moment. The reason I don’t think the troll is very dangerous is that I don’t think the era of philosophy-as-critique has much longer to go. It is deeply rooted in Kant’s Copernican Revolution (though Kant was no troll himself, but one of the greatest philosophers), and I think the Copernican Philosophy has finally reached a dead end.

Q2: One assumes there are few limits to your ambition. Do you ever reflect and think ‘yes, I seem to have gotten somewhere here,’ or is it rather a case of moving ever onwards? That is to say, when is a philosopher’s work done? (One could apply this question equally to artists, writers or film makers.)

GH: At present I am 41 years old (still young by philosophy standards) and there is plenty of work to be done. I’ve still not written anything that makes me fully happy, and I want to be much further along ten years from now than I am today. That said, if I were hit by a bus tonight and killed, then I do feel that I have already written some things of value that could be built on by others. Steven Shaviro and a few other prominent readers have said that Prince of Networks is my best book so far, and they may be right. It’s certainly my most advanced. Aristotle claims that age 51 is the intellectual peak of life, and if he’s right, then the jury is still out on me for another decade. After that, I will gradually morph into another familiar establishment figure whose virtues and limitations are largely clear, with a new generation rising by then that will have read all my books and become somewhat sick of them. That’s how philosophy makes progress: by younger people becoming sick of the insights of their elders.
This is no flippantly rebellious process, but a necessary one: our elders can’t ever quite address our intellectual needs fully, because the reality they had to address in their work is so different from the one that we ourselves face.

As for your question about moving forward, there is always a tension in any philosophy between forward movement and repetition. On the one hand, I agree with Žižek that philosophers are insane people who say the same thing over and over again. There is always a center to your thinking that you have to orbit as closely as possible, and so you end up repeatedly reformulating a handful of basic principles. We are not linguists or ornithologists, constantly collecting additional new data. Rather, philosophers are specialists in simplicity.

On the other hand, forward motion is necessary. And this comes most often through surprise and trauma. As for surprise, I find it crucial in all intellectual work to focus on the surprises you encountered: what didn’t quite fit your preconceptions, and how you had to change your attitude to account for what you found. One often runs into people who like to say “I’m not surprised by anything,” but that is a pseudo-intellectual stance, not a serious one. If you are paying close attention to any topic, there are going to be important surprises that test your agility and openness as a thinker. This love of surprise is also why I like to travel so much. No place ever turns out to be anything like the way it was described in books. Even photographs are quite poor at portraying the feel of a place. In 2007 I made my first trip to the Far East, and was shocked by both Tokyo and Seoul, for instance, since neither had ever been accurately described in anything I had read. (Tokyo isn’t the dense megacity I had imagined throughout my life, but more like a series of 8 or 10 medium-sized cities connected by rail. And the big shock about Seoul was the great natural beauty interwoven with it.
Despite being the jam-packed megacity that Tokyo is not, Seoul has a big mountain not at the edge of the city, but right in the middle of it! And a large and forested national park begins just blocks from the city center! No travel book had conveyed these things to me adequately.)

As for traumas, those usually come from the opposing views of other neighboring thinkers. No two people will ever have the same philosophy; knowledge is too personal an experience for that. And for me at least, a bit of time is necessary to digest the differing views of others. I am distrustful of those who too quickly have words for every situation. Words should be a bit of a struggle, if you’re honest. For instance, the encounter with Quentin Meillassoux was one of these fruitful traumas for me. We hit it off well personally, and enjoy reading one another’s work very much. We even graduated from high school in the same year! And yet we really don’t have very much in common, philosophically speaking. Meillassoux does make the classic critique of correlationism, but he still thinks it’s the unavoidable horizon of rigorous philosophy: to think a tree outside thinking is to think it, and therefore we cannot think a tree outside thinking, but are trapped in the correlationist circle. We can only hope to radicalize it from the inside rather than escape it. For Meillassoux this is the hallmark of intellectual rigor; for me, it is the supreme horror and the abdication of philosophy. But it has already taken me three years to figure out exactly why I can’t accept Meillassoux’s philosophy, wonderful though it is, and the main reason I agreed to the request of Edinburgh University Press to write a book on Meillassoux was to come to terms with the trauma by trying to put our exact differences into words.

More generally, you ask how philosophers know when their work is done. I was greatly influenced by Ortega y Gasset’s generational conception of history.
He sees generations as made up of 15-year blocks, but one needn’t accept that detail to find value in his theory. Namely, there is a big difference between those ideas that grow out of our true inner needs, and those inherited ready-made ideas that we merely sling around in verbal form without having earned or even needed them. Ortega’s terminology here is “authentic” vs. “inauthentic,” which he used before Heidegger did. He believes that up until age 30, philosophers are primarily just receptive, reading books and hearing lectures by others, though he thinks that at around age 26 the first shadows of original personal ideas start to appear. He then sees roughly ages 30-45 as the period when one is an emerging maverick outsider, fighting the ruling generation that is roughly 45-60 years old. This is not an empty power struggle, but a reflection of the fact that things that were once fresh, new, and promising become stale established discourses for those that follow: Derrida must have been liberating to some people once, but by the early 1990s when I entered graduate school, he was just another oppressive established authority, protected by armies of loyal enforcers.

If anything needs to be fixed about Ortega’s schema, it’s that everything has probably shifted a bit older on the spectrum. I’d almost say now that the established generation of philosophers is from ages 60-75, with 45-60 being the up-and-coming mavericks, and 30-45 being more a stage of extended adolescence when people are still getting their act together. Or maybe there are three generations in play now rather than the mere two of Ortega’s model, due to increased lifespans. But I’m sure that up to age 30 is still primarily a period of study and absorption, and for this reason I never judge people on what they do before 30. That’s mostly dress rehearsal; it’s what comes later that counts.

And now, you might ask, did I experience my own first personal ideas at age 26, as Ortega suggests will happen?
The answer is yes. I was writing a paper on Levinas at that age, following his theory from De l’existence à l’existant that reality itself is one and that it is broken into pieces only by human consciousness, and suddenly I had the vague realization that this was nonsense. There are good reasons to say that the world itself is already a plurality, and that the parts of this plurality interact with each other no less than with the human mind. It took two or three more years before I consciously realized: “I do metaphysics, I am anti-Kantian, I am a realist.” But age 26 was the start of it, just as Ortega claims. And those will always be lonely years, because it always takes time for words to catch up with insights. Clear verbal formulation of thoughts is a goal, not a starting point (another truth that analytic philosophy fails to grasp).

Q3: Philosophy has been written for millennia. What would you define as the unique characteristic of philosophy today?

GH: The easiest way to answer this question is to ask who is the most distinctive great thinker in recent times. And for me, as you know, that is Heidegger. Philosophy in the 21st century will live or die based on how adequate our response is to Heidegger. And this response requires that we look both at his strengths and at his weaknesses.

What are the great weaknesses of Heidegger as a philosopher? Not his Nazism, and not his tendency toward obfuscating mystical language. Those are bad enough, but they are not the heart of the issue. No, I would say that the two major problems with Heidegger lie elsewhere:

1. He remains too obedient to Kant’s Copernican Revolution. The human-world pairing still remains at the center for Heidegger, who offers no resources at all (unlike Whitehead) for bringing object-object relations back into philosophy that do not involve humans as one of the terms.
The usual tendency is to assume that the natural sciences are doing just fine with object-object relations, and hence that philosophy deals uniquely with human experience, which science supposedly cannot account for. But then cognitive science comes along and claims that even the human sphere can be treated scientifically, and everyone starts screaming “Yes!” and “No!” very excitedly at this program. But most of its shock value results merely from the initial stupid decision to treat human experience as the unique topic of philosophy. If, like me, you don’t even think the natural sciences give a good philosophical account of inanimate relations, then a scientific treatment of cognition is hardly going to feel any more threatening than a scientific treatment of fire. It will be no more adequate to one case than to the other.

2. He remains too dismissive in his treatment of specific beings. Heidegger merely wants to tell us that all distinct entities are undermined by a prior ontological realm. As a result, all individual monkeys, trains, coconut trees, and planets are submerged in a single monotonous sea of Being. And that’s why Latour is such a refreshing supplement to Heidegger, because Latour can literally discuss any kind of specific entity.

But more importantly, what is the greatest strength of Heidegger as a philosopher? As I have argued from Tool-Being onward, Heidegger’s great distinction between ready-to-hand and present-at-hand lies not between “practice” and “theory” (this would be both trite and unsustainable). Instead, it is a distinction between reality and relation; and since reality is plural, we can say that the distinction is between objects and relations.

No model of a thing will ever grasp it. No knowledge of a thing will ever drain it to the dregs. No causal relation between two entities will ever allow them to make perfect contact; each entity always has depths unsounded by the other.
To be is not the same as to have qualities discursively knowable by someone, not even if that someone is God. To know all the qualities of a tree (if that were even possible) would not turn us into the tree itself.

This is the great breakthrough of Heidegger’s tool-analysis. But in some ways it is merely a return to the Socratic insistence on philosophy as a love of wisdom, rather than as a supposed wisdom obtained from language, mathemes, natural science, or analyses of “power.” Nor is Socrates giving us an epistemological theory on the limits of our species: “Oh, we wretched humans, who will never know anything, unlike the gods!” No, the point is that no relation to a thing can ever replace that thing, even if humans are left out of the question.

Another way of putting it would be this: we need a realism without a correspondence theory of truth. If you think that feeble human knowledge can adequately exhaust the inner life of things by doing nothing more than listing a few hundred measurable qualities, then your sense of reality is insufficiently robust. You are not interested in realism, but in celebrating the hard sciences and using them to beat up the softy poets who supposedly ruled continental philosophy in the past. Reality is inherently something deeper than any relation that might be had to it. Aggressive neurology is no substitute for metaphysics.

Q4: Your philosophy is ‘object-oriented’. That is to say, you consider the object to be as important as the (human) subject. Many people think this is dangerous, as activity in the blogosphere testifies. I think the issue here is that if the subject is considered a mere object amongst others, people feel that humans might find it easier to justify abuse of other humans. What do you say to that?

GH: Freud always claimed that psychoanalysis was the third affront to human dignity in modern times. Copernicus moved the earth out of the center of the universe.
Darwin made us no more special than animals, plants, and fungi. And Freud made conscious thought derivative of less palatable underground currents in the psyche. As a fourth supposed affront to the dignity of humans, let’s add the notion to which you just referred: that the human is not metaphysically special either, so that my perception of fire is no different in kind from the relation of cotton and fire among themselves. Cognitive and causal relations all end up on the same footing. And it does seem to be a bit of a traumatic claim for people, judging by how upset they have become about it (I wasn’t expecting this to happen).

And just to be clear, I cannot take credit for that fourth trauma. Whitehead is the one who did it most clearly on the post-Kantian landscape, where it was such an unwelcome gesture. My own contribution is simply to replace Whitehead’s relational model of things with Heidegger’s model of things withdrawing from all presence, while retaining Whitehead’s cosmic scope and dumping Heidegger’s human-centrism.

I’ve heard the critiques to which you refer in your question, and just can’t make much sense of them. Some people have even claimed that object-oriented philosophy says that “humans are worthless.” I don’t get it. Did Copernicus say: “The earth is not the center of the universe, and therefore it is worthless”? Did Darwin say: “We are related to apes, and therefore we are worthless”? Why this all-or-nothing model, in which humans must be everything or nothing? There’s a lot of open ground between 0 and 100. To flip wildly between saying that humans are the best or the worst, just like political factions who flip between saying that the United States is either the greatest or most evil country on the planet, reminds me of one of the best-known symptoms of Borderline Personality Disorder. The truth is neither 0 nor 100. The truth is that humans are not metaphysically special just because we happen to be humans.
Nonetheless, humans are probably still among the most fascinating entities in the universe, and even if we weren’t very interesting in the absolute sense, we would of course still be interesting to ourselves. Iowa surely does not have the most interesting history of any American state: it fascinates me because that state happens to be my ancestral home. But it would be idiotic of me to insist that Iowa is either the best or worst state in the Union, or to flip between the two claims at different times.

Q5: What would you say to an aspiring postgrad student about to leave the safe confines of study for the real world? Supposing this person is shortly to face the human jungle that is academic work (lecturing, publishing in journals, addressing conferences, etc.), what are, in your mind, the skills needed in order to thrive as a humanities academic?

GH: As you know, I’ve given a great deal of advice to graduate students on my blog. The main reason is that I had a terrible time as a graduate student, not enjoying it at all. Those were great years of intellectual discovery for me, but I felt as alienated as one possibly could from both my professors and the institutions I attended. And look at me now: I’m an administrator! It would have been unthinkable even 10 or 12 years ago.

The first thing these students need to know is that the beginning is the hardest part. Most of the people who fail in academia go down in flames through self-destruction. Namely, they never finish their dissertations. The human mind is clever at inventing alibis, and even at producing real excuses. Avoid graduate school psychodramas. There are a handful of tyrannical professors in most graduate programs, but your fellow students already know who they are, and you will be warned in advance. Listen carefully to the warnings. Don’t think you will be the special exception, because you won’t be: tyrants are tyrants.

The next variant of self-destruction is perfectionism.
One of the wisest clichés I know is “don’t let the perfect be the enemy of the good.” Your goal should be excellence, not perfection. No perfect philosophical book has ever been written; indeed, even my freshmen in Egypt, often just 16 years old, make mincemeat out of all our great authors within weeks of their very first philosophy lesson. Perfectionism is the subtlest and most corrosive alibi for not getting any work done. It grants you a false sense of moral superiority, and if you wish, it even allows you to troll those who are getting much more done than you are. If I’m proud of anything about the darkest periods of my student years, it’s that I never ceased to admire those who were getting more done than I was. Never did I resent them. Instead, I wanted to be like them when I grew up, and now I finally am like them.

I have no time for academic careerists: people who choose their advisors and institutions and topics based on professional connections and likely job payoffs. The life of the mind must be primarily something other than a career. But there is also nothing wrong with success, and success should not be a badge of shame. I’m certainly much happier today than I was at age 25, and that’s mostly because I’ve found a niche in the world where I can do the work I love while also earning my meals. Once you reach a position like that, the sky’s the limit. It’s up to you to capitalize on your situation.

The most inspiring advice I ever heard given to students was from Alphonso Lingis, the colorful American philosopher who was my M.A. advisor. I wasn’t personally present when he said this, but there’s no doubt he did: “Go outside on a clear and starry night,” he said, “and try to get some sense of the vastness of the universe. And consider that your fingerprint is enough to make you unique in all the universe. And then consider how much more complicated your brain is than your fingerprint… Your brain is wired to do something that nothing else can do.
And if you don’t do it, it’s not going to get done.” Take that advice to heart, and do what you are wired to do that no one else can do.

The Academy has its tyrants, oppressors, and banal functionaries, yes. But it leaves plenty of operating room for independent thinkers as well. It gives you unequaled schedule flexibility, ample vacation time, stimulating contact with students, and often generous travel and grant support as well. The Academy tends to make ambitious and sensitive young people feel roughed up and abused (I certainly felt that way), and many of them run away or never recover. But I’ve found that it only gets easier with time; your twenties are an artificially difficult decade in the university system. Why pay the bill at a restaurant and then leave before eating what you paid for? That’s what it’s like if you suffer through graduate school and then quit before things start to get interesting and comfortable.

Q6: Your blog is key to the promotion of your work. In what way do you think the internet has changed philosophy? How might this impact it in the future?

GH: So far, I think it’s mainly changed the rate of dissemination of philosophical trends. Speculative Realism has been the best example so far: the rise to prominence of this movement, of which I was a co-founder, could never have taken place so quickly without the internet. And since internet use is higher among the young, the blogosphere has certainly mobilized and legitimated the young like never before in history. Young students like Reid Kotlas and Nick Srnicek have book deals already, something I couldn’t remotely have dreamed of at their age, since the era of my youth was one of slow-moving brick-and-mortar publishers acting as stern gatekeepers, just as it had been for several centuries before me.

But I still don’t think the blog is a mature medium for the expression of philosophical work itself. Levi Bryant comes the closest to having a true philosophy website.
If you read his Larval Subjects philosophy blog (and I know you do), you’ll see what I mean. Levi writes full-fledged mini-treatises, and they are both stimulating and readable. I think his blog is the model for the best of what might be possible. But I just can’t do it. Instead, I prefer to use my blog as a mixture of daily observations and philosophical snacks. When I’m in the mood to do my best work, I turn to traditional word-processor genres: the article, the book, the lecture.

Maybe that won’t be the case forever. Maybe we will soon reach the point where Levi’s model is the primary way in which philosophy is done. What would it take? Probably just a practical technical innovation: I don’t always feel like reading a technical philosophy blog post on a computer screen, but if there were some easy way to download such posts to pleasant portable devices and read them there, then I might do it. I realize there may already be ways to do that to some extent, but that’s not where the group culture is at the moment. Instead, the group culture is still divided between blogging on computer screens and writing traditional academic works for paper.

And obviously, the technical innovation of the blogosphere creates an automatic generational watershed in philosophy. I find it hard to believe that the aging lions of SPEP are spending much time browsing the blogosphere, and that means they are aging ever faster the more quickly things evolve in a medium of which they are largely unaware.