It seems apropos to introduce a small point of order: New APPS is a group blog, which means that there are many authors here and we all speak for ourselves--and only ourselves.
A case in point would be my strong disagreement with Jon Cogburn's post below. I find it to trade in a series of unfortunate false dichotomies: 1) between valuing or appreciating ability and seeking to avoid speaking in a way that may be hurtful or offensive to people with disabilities, or which marginalizes them; 2) between recognizing that illnesses (mental or physical), injuries, or other afflictions are real sources of suffering and seeking to avoid speaking about people suffering from such conditions in a way that marginalizes, delegitimates, hurts, or excludes them; and more generally, 3) between being able to express oneself adequately or take joy in life and seeking to avoid harming others carelessly or thoughtlessly, especially where they may also be subject to various systems of marginalization, delegitimation, or exclusion.
I also disagree with Jon's suggestion that some of our former bloggers were wrong to push as hard as possible for the development, in the profession and among those who engaged with us here, of a much greater degree of sensitivity and care with respect to how we speak about folks who have historically been marginalized, delegitimated, and excluded by the profession and by the history of 'Western' philosophy.
Increasingly, when I see someone accused of "ableism" because of some inartful (or perfectly fine) turn of expression, I become angry. It just strikes me as Forrest Gumpism. Everything is really peachy, as long as we confine our discourse to positive platitudes (and attacking those who don't so confine themselves).*
But all else being equal, it is better to be able. Speaking in ways that presuppose this is not bad, at least not bad merely in virtue of the presupposition (see also the Johnny Knoxville/Eddie Barbanell video below).
The place where my son gets occupational therapy (to deal with a bunch of sensory processing disabilities he inherited from me)** is called "Abilities." Good for them! I don't want my child to suffer as much as I do. The thought that I should feel guilty for that, or feel guilty for expressing something that presupposes it, just strikes me as insane. And I don't feel guilty for saying it strikes me as insane. To not be able to use "insane" as a derogation when it is appropriate would be to lose sight of the fact that it is horrible to be insane, which would in fact be extraordinarily cruel to the insane.
My friend Justin Isom dealt with his blindness and cancer with incredible dignity. He played a very bad hand extraordinarily well. But any pretense that it was not a bad hand would have been insulting and condescending (just as he would have taken, on the other side, excessive pity to be condescending). Justin thought it was hilarious when I first squirmed about saying "see you later" to him. When you have a blind friend you realize just how much language is seeded with visual metaphors. For the anti-ableist, we are supposed to police our speech in ways that would pretend otherwise. (And please read Neil Tennant's obituary for Justin below,*** which speaks to Justin's astonishingly rich ability (not just astonishingly rich for a blind guy, but all the more interesting and impressive since it's a blind guy talking) to describe experiences, such as a public street in Indonesia, in visual terms.)
But for the anti-ableist speech policer, we can't say that a good idea is "visionary" because that might have hurt Justin's feelings. No. I reject that. You don't speak for Justin and you have no right to present him as emotionally infantile enough to care about such things.
Following a suggestion from a friend that some of what’s come to light about the roles of the administration and the board in the Salaita affair might not be consistent with accrediting principles regarding shared governance, I decided to check out the specific rules that UIUC is supposed to be operating under.
The upshot of my survey, which I'll explain in detail below, is that UIUC is at least generally bound to respect principles of academic freedom and shared governance by its accreditation regime, and more specifically, that 1) the Board of Trustees is bound to remain free of undue influence by donors and other external parties where this is contrary to the interests of the university, and 2) that the Board and the Administration are bound to let the faculty oversee academic matters. These last two considerations seem to create a real problem given what we now know about the role of external donor pressure on the board and about the way in which the Trustees and the Chancellor seem to have avoided any consultation with the faculty in making the decision to 'dehire' Salaita. (For those who need an update, your best bet is to read Corey Robin's blog, especially this post.)
In constructing the analogy I noted Professor F, like Salaita, had a distinguished academic record, that she worked in a field which often featured polemically charged debates, many of which for her, because of her personal standing and situation–Professor F has very likely experienced considerable sexism in her time–were likely to be charged emotionally, and that a few hyperbolic, intemperate responses, made in a medium not eminently suited to reasonable discourse, and featuring many crucial limitations in its affordance of sustained intellectual engagement, should not disqualify her from an academic appointment made on the basis of her well-established scholarship and pedagogy.
I could very easily have constructed another analogy, using an accomplished professor of African American studies, Professor B, who stepping into the Ferguson debate, after engaging, dispiritingly, time and again, in his personal and academic life, with not just the bare facts of racism in American life and the depressing facts pertaining to informal, day-to-day segregation but also with a daily dose of bad news pertaining to the fate of young black men in America, might finally experience the proverbial last straw on the camel’s back, and respond with a few tweets as follows:
There's not much new that one can currently do to add to the variety of possible time travel metaphysics explored in literature. Try-to-improve-stuff-make-other-stuff-much-worse is a pretty reliable trope by this point. Stephen King's 11/22/63 (in addition to being a great novel) is still philosophically interesting though.
What 11/22/63 makes distressingly clear is that in King's universe Leibniz is correct that this is the best of all possible worlds. When the time-traveller goes back in time the universe supernaturally conspires (through standard horror movie tropes such as inanimate objects behaving as if they are agentive, people acting possessed, etc.) to prevent him from altering anything that would change the course of history or to put history back on track once he's altered it. When the time traveller finally beats the universe at this game, the results are catastrophic, and he then becomes part of the universe horrifically setting itself aright.
For the poor time-traveller it is impossible* to improve history. The thought that one might improve history is only the result of ignorance.
Again, this is not new stuff. It's one response to the problem of evil. What's new in King's book is that he shows Leibniz's conclusion to be itself horrific. It would be madness to worship whatever supernatural forces ensure that this is the best possible world. By realistically portraying the moral psychology of humans buffeted by these forces (made visible to humans by time travel) King is able to demolish the Leibnizian intuition more effectively than Voltaire.
Reformation Theology can be summed up by three Gs: Guilt, Grace, Gratitude. For King, if this really were the best possible world, that's all the more reason to be ungrateful. We're not faced with a good God who can't do better than this (is that process theology?), but rather with horrific processes that we can't understand making it impossible that things could be better.
[*To be fair, horror impossibility is not the kind of impossibility philosophers normally think about. In horror the impossible is sometimes actual. Noel Carroll comes closest to explaining how this works. Neal Hebert and I tried to expand on Carroll's analysis, but I think our account was overly epistemic. I'm going to teach Harman's book on Lovecraft next year and hopefully be able to rethink this.]
After more than 3.5 years of weekly Brazilian music posts, I decided to discontinue the weekly regularity. For a number of reasons, it had become increasingly difficult for me to keep up with the rhythm (and NOT due to a paucity of good Brazilian music!). I may still occasionally post when I come across or am reminded of something particularly inspiring, but it will be a ‘go with the flow’ thing rather than a weekly column.
For this last (regular) BMoF, I’m posting Caetano Veloso’s ‘If you hold a stone’, which I had intended to post at the very beginning of BMoF (at the time, BMoT) back in February 2011. After writing the whole text, I came to realize that the song was not available on YouTube (beginner’s mistake…); for whatever reason, now it is. So if you are interested, go back and read what I wrote back then, and listen to the best song about being homesick that I know of -- appropriate for a goodbye, I suppose.
Many thanks to readers who regularly (or irregularly!) followed BMoF over the last 3.5 years; it's been great fun for me, and hopefully a bit of fun for you too. Most likely, there will still be some Brazilian music here at NewAPPS, only less frequently.
As someone who has spent the better part of her career researching, analyzing and teaching not only about the structure and nature of oppressive power regimes, but also better and worse ways to resist or transform such regimes, I've nevertheless been unable to settle in my own mind, to my own satisfaction, my position with regard to the moral or political value of revolutionary violence. I can say that my core moral intuitions (for whatever those are worth) definitely incline me toward favoring nonviolence as a principled ethical commitment... though, over the years, I have found those intuitive inclinations fading in both intensity and persuasiveness. As a philosopher, a citizen and a moral agent, I continue to be deeply unsettled by my own ambivalence on this matter.
First, a preliminary autobiographical anecdote: I spent a year between undergraduate and graduate school in the nonprofit sector, as the Director of the M. K. Gandhi Institute for the Study of Nonviolence. (That was back in 2000, when the Gandhi Institute was still housed at Christian Brothers University in Memphis, which is now my academic home, evidencing the kind of bizarro turn-of-fate that can only be credited to some particularly clever-- or ironically humorous-- supernatural bureaucrat.) I went to the Gandhi Institute initially because nonviolence was an all-but-unquestioned moral virtue for me at the time. But, after a few years in graduate school and consistently since, the many and varied until-then-unposed questions about the moral or political legitimacy of violence pressed their way to the fore of my mind. In roughly chronological order, I'd say that the combination of (1) my first real engagement with Frantz Fanon's argument in "Concerning Violence" (from his Wretched of the Earth), the arguments by Marx (and Marxists) in various texts advocating more or less violent revolution, and Noam Chomsky's considerations of the same, (2) my extensive research into human rights violations, transitional justice and transitional democracies, postcolonial theory, feminist theory and critical race theory, which collectively constituted the subject of my dissertation, (3) the radically dramatic shift in what counts as properly-speaking "political" and/or "revolutionary" violence in the post-9/11 world and (4) my own experiences, from near and far, with the increasing number of (threatened, proto-, aborted, defeated and/or more-or-less successful) revolutions taking place in my adult lifetime (e.g., OWS, the Arab Spring and, much closer to home and far less violent, the current and ongoing academic revolution surrounding the Salaita case), all worked together to contribute to my rethinking the merits and demerits of violence as a way of resisting/combatting/correcting oppressive, exclusionary or otherwise unjust power regimes.
Here is a story of a professor whose tweets got her into trouble.
The professor in question is a feminist, Professor F, sometimes termed ‘radical’ by her friends, colleagues, and academic foes for her uncompromisingly feminist scholarship and her vigorous, no-nonsense rhetorical style, which is well-versed in the demolition of putative rebuttals to feminist theory and keenly honed by vigorous participation in political polemical debate. She has a distinguished record in scholarship, excellent teaching evaluations–from male and female students alike (though some unkind ones did call her “a typical nut-cracking feminazi”)–and has found, by dint of these accomplishments, a cohort of scholars and students who admire her work.
She goes to her office, grabs her cup of coffee, and checks into Twitter. Her timeline is abuzz–her friends are talking about the latest case, posting links from journalists and bloggers, all weighing in with their considered opinions. Some even post links from clueless male politicians, offering their usual insensitive, sexist responses to the latest fiasco.
I expect many readers to be following the ongoing debate, prompted by a poll run by Leiter last week, on the (presumed) effects that blogs have had for professional philosophy, both at the level of content and at the level of ‘issues in the profession’. (Roberta Millstein weighs in at NewAPPS, and I agree with pretty much everything she says; another summary at Feminist Philosophers.) Now, there is a sense in which I am personally not in a good position to have an opinion on this, simply because I haven’t been around long enough to know what it was like before, and thus to be able to draw an informed conclusion. But I can say that my very process of becoming a professional philosopher (not so much content-wise perhaps, but in terms of deciding on the kind of professional I wanted to be) was considerably influenced by reading in particular the Feminist Philosophers blog. Also, I won’t deny that my career as a whole has tremendously benefited from my blogging activity at NewAPPS and at M-Phi, both in terms of the opportunity to discuss my ideas with a larger number of people than would otherwise have been possible, and (more pragmatically) simply in terms of increased visibility and reputation.
But obviously, my individual experience (or that of other bloggers) is not what is under discussion presently; rather, the question is whether blogs have been good for the profession as a whole. This, however, is obviously a multi-faceted question; it may for example be read as pertaining to the quality of the scholarship produced, to be measured by some suitably ‘objective’ criterion. (As a matter of fact, I do believe that blogs have been ‘good’ for philosophy in this sense, for reasons outlined here for example.) But it may also pertain to the overall wellbeing of members of the profession, in which case the putative effect blogs will have had could be conceived of in terms of a simple formula: for each member of the profession (I is the set of all professional philosophers), estimate the net gain (or loss) in professional wellbeing by comparing their situation before (wb) and after (wa) the advent of blogs; add it all up, and divide the total by the number of members considered.
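The averaging procedure just described can be stated as an explicit formula, using the notation already introduced (I the set of all professional philosophers, and, for each member i, wb and wa that member's professional wellbeing before and after the advent of blogs):

```latex
\Delta \bar{w} = \frac{1}{|I|} \sum_{i \in I} \left( w_a^{i} - w_b^{i} \right)
```

On this reading, blogs will have been ‘good’ for the profession in the aggregate just in case this average is positive.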
Now I was all set today to work up some speculations on why philosophy is so different from the other humanities and social sciences in this regard (a favorite hypothesis: a disciplinary addiction to the cult of genius plus a high degree of implicit bias in anointing geniuses). Then I went to the Survey of Earned Doctorates to look up some of the raw data. There, I found that the overwhelming whiteness of philosophy is not so unusual among the humanities, if one digs down into the subfield data.
Mark Graber (Law and Government, Maryland) has an interesting post up on the Salaita case and academic culture over at Balkinization. Here's the paragraph that jumped out at me, as I haven't seen this particular point made before:
Each year, more and more pressure seems to be put on faculty to spend less time on traditional forms of publishing and rely more on social media in which significant incentives often exist for vulgar, juvenile, and insulting speech (I’ve never been told I should be especially careful to avoid such temptations). Take a look at the website of many law schools and other academic institutions. Many strongly suggest that the way to gain fame and respect at the institution is through the social media or other outlets where eight second soundbites are norm and footnotes forbidden. More and more of my friends who do traditional, lots of footnotes, scholarship complain that they have fewer and fewer friends (if any) in the administration and they are becoming the first to be asked about buyouts. In short, Salaita strikes me as doing exactly what a great many professors are now doing to get ahead in our professions. Having pressured us to get on the social media, the administrators at our universities can hardly complain if we adopt the conventions of the social media rather than what I think are the better norms of academic discourse.
In other words, and in a perverse sense, Salaita is being unhired for doing precisely what new academic norms and academic institutional imperatives of "relevance" encourage.
Here are some reasons I have found philosophy blogs to be beneficial; here I include New APPS (so my bias is obvious), but my comments here are not limited to New APPS, by any means.
First, while I had heard stories here and there of sexism and exclusion of other underrepresented minorities, they seemed like isolated incidents. One could brush them off as the occasional jerk, and think that we didn't need to worry about them as a profession. Because of philosophy blogs, it has now become abundantly clear that we do need to worry about the treatment and the exclusion of women, people of color, people with disabilities, people who are LGBT, etc. And we have started to see changes in the profession to address these longstanding problems.
Second, philosophy blogs have provided a way to share and discuss issues in the profession that aren't directly related to exclusion, but are important to discuss anyway. My recent post on methods for anonymizing papers is a small, if unexciting, example of that.
Third, philosophy blogs have provided a way to disseminate news and information (e.g., statistics).
Last but not least, philosophy blogs aren't just about the profession; they are also about philosophy. They have provided a format for wider dispersal and discussion of philosophical issues. This couldn't have come at a better time; philosophy had been becoming increasingly siloed. I think it's still pretty siloed, but blogging has made connections between philosophers that wouldn't have occurred otherwise and let philosophers know about work that they wouldn't have known about otherwise.
Of course, we can all think of postings on philosophy blogs that we found objectionable/harmful or comments that we found objectionable/harmful. On balance, I still think they have done more good than harm.
When I watch this I realize how stupid I was to take to heart Cobain's* claim that Pearl Jam was the 90's version of Boston. In this performance Vedder's not doing that Jim Morrisony thing that he did on Pearl Jam's first album and that the lead singer of Stone Temple Pilots did too.
It's pretty powerful stuff, and wonderful to watch Roger Waters digging Vedder's performance so much. I'm going to go dig through Pearl Jam's back catalog now.
[*DNA testing reveals that Cobain and I share Gospatric, Earl of Northumbria** as an ancestor. All of the extant Cogburns and Cobains descended from him. This isn't actually that weird though; if you go back far enough any two people share a common ancestor. . . It would be nice if this fact led us to treat one another more nicely.
**One of the many lords in Saxon England whom the Normans displaced. The wikipedia article on the "harrying of the north" is pretty good. William the Conqueror was a real bastard. No less an authority than Orderic Vitalis wrote:
The King stopped at nothing to hunt his enemies. He cut down many people and destroyed homes and land. Nowhere else had he shown such cruelty. This made a real change.
To his shame, William made no effort to control his fury, punishing the innocent with the guilty. He ordered that crops and herds, tools and food be burned to ashes. More than 100,000 people perished of starvation.
I have often praised William in this book, but I can say nothing good about this brutal slaughter. God will punish him.
I remain obsessed with the Saxons: the very week they finally defeated the Vikings, they were invaded by the Normans. Then a somewhat crap 80's hair metal band was named after them. Can it get much worse?]
[Update 2: The report on which this discussion has been based is now being called into question. UIUC English Professor Ted Underwood tweets as follows: "@Ted_Underwood: Regret to say that last night's report from students appears premature. Faculty have since met with Wise, & report no change in position."]
[Update: Thanks to John Protevi for providing the correspondence address for the UIUC Board of Trustees in the comments below.]
Yesterday evening, reports began to emerge that University of Illinois at Urbana-Champaign Chancellor Phyllis Wise has forwarded Steven Salaita's appointment to the University's Board of Trustees, who will vote on it at their meeting ten days from now, on September 11th.
Obviously, this is a hopeful sign, given that the Chancellor's position until now has been to refuse to submit the appointment to the board—as Corey Robin puts the point, what amounts to a 'pocket veto.' That said, it's difficult to feel too much confidence that the process now underway is intended or should be expected to terminate in the restoration of Professor Salaita's position. Robin has spent some time parsing a couple of scenarios here; but the key thing to recognize, as John Protevi also noticed very quickly last night, is that this could easily be a move that the University is legally required to make or that it would be in its best interest to make if it wants to avoid being sued for denying Salaita due process.
Nevertheless, as Robin points out in his post, these developments also mean that those supporting the causes of academic freedom and faculty governance* in this case now have an important opportunity: ten days in which to bring maximum pressure on the Trustees to vote in favor of Salaita's appointment. In other words, the game is still on, and it must continue to be.
As I write this, at least 543 philosophers have signed our disciplinary pledge to boycott UIUC until this matter is resolved in Salaita's favor—see this post by Eric Schwitzgebel, where he explains his rationale for honoring the boycott.** Please consider adding your name if you have not yet done so. Additionally, please consider writing to the trustees directly expressing your support for Salaita's appointment, as well as your sense of the cost to the University's reputation should it fail to respect the principles of academic freedom and faculty governance in this case.
An excellent article about Mary Beard, the famous classicist, is in this week's New Yorker. It is informative to have a prominent academic give an account of her life experiences like this. I want to encourage others to read the original article, but will pull out one salient and topical point. Beard is not only a very capable scholar, she is also "an avid user of social media," including regular postings at a blog. Despite the sexist reactions to her online presence, Beard has reacted with surprising generosity and patience: "In another highly publicized incident, Beard retweeted a message that she had received from a twenty-year-old university student: 'You filthy old slut. I bet your vagina is disgusting.'...The university student, after apologizing online, came to Cambridge and took Beard out to lunch; she has remained in touch with him, and is even writing letters of reference for him. 'He is going to find it hard to get a job, because as soon as you Google his name that is what comes up,' she said. 'And although he was a very silly, injudicious, and at that moment not very pleasant young guy, I don’t actually think one tweet should ruin your job prospects.'" Beard is an admirable and remarkable person, and learning about this new side of her makes her all the more so, in my mind. Check it out!
Twenty or so years ago some friends and I voted over and over again to try to get Wodehouse listed in the infamous Modern Library reader's choice of top 100 novels of the twentieth century. Due to our labors, for a week or so "Bertie Wooster Sees it Through" was in the top ten.
Those were good times. In the end though we just couldn't compete with the objectivists, the scientologists, the Heinlein weirdos. . . For what it's worth, we still can't, though I do hope to be able to play all of these songs at some point. That's a little bit of consolation at least.
Exactly 15 years ago today, I arrived in the Netherlands with a suitcase full of dreams (ok, maybe two), ready to start a new phase of my life, but having no idea I'd end up staying for so long. I still do and always will feel a strong bond with my home country Brazil (as BMoF readers of course know!), but looking back on these years, I realize I feel entirely at home here now. Perhaps the main turning point in my relationship with this country was the birth of my children, who were both born here, and who, for all intents and purposes (sadly, including rooting at the World Cup…), are basically Dutch. After they were born, I started feeling a visceral connection with this place, which I didn’t experience before.
However, it is not only because they happened to be born here and have lived here almost all their lives (except for 20 months living in NYC for my older one) that I feel this connection. More importantly, I simply see them happy and thriving, being given all the conditions they need to develop healthily and joyfully, and I am extremely thankful for that. And it’s not only my kids: the Netherlands is consistently ranked number one in studies comparing the well-being of children in a number of developed countries.
Everyone’s next question is then: what’s the secret? How does one raise the happiest kids in the world? Obviously, the Netherlands is a prosperous country, with levels of social equality comparable only to those in the Scandinavian countries, and that goes a long way of course. To start with, virtually every child here has access to health care, education, nutrition, etc. (Which is not to say that everything is perfect! But even what is not so good is still probably better than in most other places.) However, there are more factors involved, and on the basis of my experience as a parent I would like to outline two of them.
Since I'm not a naturalist, I'm sort of on Monk/Wittgenstein's side, but I find some of the dichotomies to be a little bit tendentious. Monk opposes "non-theoretical understanding" to the kind of understanding proper to science, and argues that naturalizing programs in philosophy all fail because they don't realize that the domains proper to the two forms of understanding are pairwise disjoint.
Maybe something in the neighborhood of this is true but Monk doesn't mark the distinction in the Sellarsian way one would expect now in terms of the kind of normative presuppositions required by the relevant kind of understanding. Instead, we get this:
One of the crucial differences between the method of science and the non-theoretical understanding that is exemplified in music, art, philosophy and ordinary life, is that science aims at a level of generality* which necessarily eludes these other forms of understanding. This is why the understanding of people can never be a science.
I'm just not sure this is true. It's not at all clear to me that morphologists in biology aim at a greater level of generality than music theorists do.
The slow emergence of the novel as a major literary genre is an ethical event. The novel as a form of literary writing goes back to Greek antiquity, and one novel from antiquity is still widely read, The Metamorphoses of Apuleius (or The Golden Ass). One of the great writers on the form of the novel, Mikhail Bakhtin, even claimed it went back to the Menippean satire of antiquity. This is probably not one of his more widely shared ideas. In any case, the idea of a unique moment of origin is not a good basis for understanding the genre. There is a series of beginnings, which include antique epics, behind the novel as it developed from the sixteenth to the nineteenth centuries, when it did become accepted as a literary genre on a level with epic, drama, and lyric poetry.
The modern origin is again ambiguous. Rabelais provides a strong candidate, with major attention coming from Erich Auerbach as well as Bakhtin, but Don Quixote is the more widespread object of discussion. Nietzsche refers to it (Genealogy, II.6) with regard to a change in ideas of humour, that is, explicitly ethical ideas about where we can find humour. The original readers of Cervantes could laugh without restraint at the suffering of Quixote, and at the suffering caused by the ‘ingenious hidalgo’, but Nietzsche suggests that by his time, readers feel unease and even pain themselves at the suffering and humiliation.
I rarely post on hot political topics (unless quantitative analysis of philosophers' lack of diversity counts), but one hot political topic has been very much in my mind this week: the boycott of University of Illinois at Urbana-Champaign. I've been forced to consider the issue especially carefully because I was scheduled to give a talk to the Philosophy Department there in December, and UIUC was starting to invite speakers for a proposed mini-conference on experimental philosophy the next day, where I would give the keynote address.
It is no news to anyone that the concept of consistency is a hotly debated topic in philosophy of logic and epistemology (as well as elsewhere). Indeed, a number of philosophers throughout history have defended the view that consistency, in particular in the form of the principle of non-contradiction (PNC), is the most fundamental principle governing human rationality – so much so that rational debate about PNC itself wouldn’t even be possible, as famously stated by David Lewis. It is also the presumed privileged status of consistency that seems to motivate the philosophical obsession with paradoxes across time; to be caught entertaining inconsistent beliefs/concepts is really bad, so blocking the emergence of paradoxes is top-priority. Moreover, in classical as well as other logical systems, inconsistency entails triviality, and that of course amounts to complete disaster.
Since the advent of dialetheism, and in particular under the powerful assaults of karateka Graham Priest, PNC has been under pressure. Priest is right to point out that there are very few arguments in favor of the principle of non-contradiction in the history of philosophy, and many of them are in fact rather unconvincing. According to him, this holds in particular of Aristotle’s elenctic argument in Metaphysics gamma. (I agree with him that the argument there does not go through, but we disagree on its exact structure. At any rate, it is worth noticing that, unlike David Lewis, Aristotle did think it was possible to debate with the opponent of PNC about PNC itself.) But despite the best efforts of dialetheists, the principle of non-contradiction and consistency are still widely viewed as cornerstones of the very concept of rationality.
However, in the spirit of my genealogical approach to philosophical issues, I believe that an important question to be asked is: What’s the big deal with consistency in the first place? What does it do for us? Why do we want consistency so badly to start with? When and why did we start thinking that consistency was a good norm to be had for rational discourse? And this of course takes me back to the Greeks, and in particular the Greeks before Aristotle.
The guitarist is playing with an "EBow"! Does anybody remember those? They peaked in the 1980s.
I love this song (and this is a credible, if truncated, cover), but I'm kind of glad we're back to using the fingers of our left (right for Cobain, Hendrix, et al.) hand to get the strings vibrating. I'd bet a decent sum of money that Scott Thurston employed old-fashioned volume swells* in the original.
*Cf. Eddie Van Halen's "Cathedral" for a canonical example. Isak Dinesen/Karen Blixen once wrote something to the effect that you've never truly lived until you've played this solo note-for-note in front of four dozen or so intoxicated rednecks in a dilapidated rural Alabama VFW hall.
So, the response must be multi-faceted. It isn’t enough to feel outrage, but do nothing. Or to feel fear, but do nothing. Or to feel utter, bone-crushing grief, but do nothing. We must institute policies that limit access to guns. Weapons of war have no place in our homes, communities, or law enforcement. But more than that, we as Church must confront the social sin of racism head-on. We must get outside our church buildings, beyond our comfort zones, and say loud and clear, “this is my brother and I will not accept that his life is less valuable than mine. The violence has to stop.” We must be willing to challenge the culture that tells African American boys that their lives are worth less than the lives of White boys. We live in a culture that attempts to justify itself by claiming “self-defense” when we really mean fear and bigotry, or pride, or individualism. But all of this is sin. Our faith reminds us that God is all sovereign and that “God calls us to love our neighbors, not protect ourselves against our neighbors.”
Read the whole thing, which focuses on both gun violence and racism (and concludes with a prayer) here.
Only a couple of weeks after the Ferguson shooting, and only about three miles away, St. Louis police shot and killed another black man, Kajieme Powell, after he apparently shoplifted from a convenience store. The details of what happened in Ferguson are in dispute, which has allowed the law-and-order crowd to defend putting six bullets into unarmed Mike Brown – two into his head – as a proportional act of self-defense.
No such ambiguity exists in the Powell case. The police released cellphone video yesterday, and it is absolutely chilling. Powell emerges from the convenience store with a pair of canned drinks. He seems a little confused – puts them down, paces around, and so on. Then the police show up in a white SUV, and jump out, guns drawn (already! They decide to escalate before even arriving at the scene). Powell backs away, says “just shoot me” a couple of times, climbs up on a retaining wall, takes a couple of steps in the direction of the police… and then they shoot him dead. Total time between the police arrival and his death? About 15 seconds.
The video, of course, completely contradicts the police department’s story about a drawn knife and aggression on Powell’s part. When confronted with the contradiction, the police chief replied that “in a lethal situation, they used lethal force.” The only thing harder to understand from that video clip than why killing Powell was justified by the situation is how anyone can continue to deny that the problem is structural. I am not accusing the officers or the police chief of lying. It’s much, much worse than that: I’m fairly sure they really did think their lives were in immediate danger.
Provocative essay here by Charlie Huenemann on how academic philosophy broke bad and what might be done to correct it. Most people who make these kinds of criticisms assume that it would be easy to fix the problems so that all of us could get back to doing old-style philosophy like Plato, Kant, and Hegel did. What's most interesting to me about Huenemann's essay is that he explicitly rejects this assumption.
Huenemann first argues that the modern cult of management in academia brought about a situation where there is:
(1) more attention devoted to narrow problem-solving activity rather than efforts to deepen philosophical wonder; (2) increasingly narrow specialization and less general knowledge of the discipline itself and its history; (3) less engagement with anyone outside the professional guild; and (4) development of various cants and shibboleths to patrol membership in the guild.
There is a lot of wisdom here. However, as noted above, whenever I read this kind of whingeing (and I routinely write it in this forum), I'm almost always struck by the whinger's optimism that there could be any alternative, i.e., that if we were all just less narrow we'd be able to do the same kind of stuff that Kant or Schopenhauer did. But is this not exactly like telling a music theory professor that he should compose late-period Beethoven quartets and stop with all of the articles on Schenker Analysis? It's a transparently silly demand.