Monday 20 October
Lessons from the New President

Filed under NF

That’s the new president of my university: Fr. Paul Fitzgerald. He spoke yesterday with some of the Arts & Sciences faculty. Someone asked him about his teaching philosophy (Fr. Fitzgerald demanded he be given an appointment in the Theology and Religious Studies faculty). This isn’t a direct quote, but here’s what he said:

Number one, you have to love the student. Because if you give love, the student can learn and grow. Otherwise we just end up trying to impress or dazzle them with what we know.

I’m trying to learn these days about love, and this was good advice. It immediately brought to mind some work of contemporary essayists, and also a lot of millennial aphorisms like BE AMAZING and Dear Sugar’s now proverbial WRITE LIKE A MOTHERFUCKER.

Let’s mince words: a mother fucker is a disgusting person. Why would anybody want to write like one?

No mother fucker fucks out of love.

 ::  Discuss  ::  2014-10-20  ::  dave

Tuesday 7 October
The New Hammy

Filed under TV

Before irony and winking became the chief way advertisers tricked you into wanting to buy their products, they relied—ages and ages ago—on rhetoric and persuasion. Like in this Subway ad:

Then came cable TV and MTV and commercials had to compete entertainment-wise with programs and so ads got performative and hammy. Like in this Subway ad:

I think the hammy quotient—and what I mean by this is the exaggerated or overly theatrical quality of the acting—is legible to us post-millennial viewers. There’s something immediately artificial about them. Like, no sandwich artist has ever presented, or will ever present, a sub to us this way:

[screenshot from the ad]

And this guy is such a risible failure of a punker:

[screenshot from the ad]

In this way, TV commercials have always been things we laugh at, which was lovely because (as I’m going to keep developing and thinking about in some blog posts to come) what you can laugh at has less power over you.

Lately, though (and yes this isn’t a new phenomenon just newer), TV commercials have become things we’re meant to laugh with. They’re aiming to be funny the way vids people forward from Funny Or Die are. Like this Subway ad from 2014 (which runs twice in the clip, no need to stay for the whole 30 seconds):

The elementary way of talking about what’s happening here is that it’s self-aware hamminess. Meta-ham. I think something more complex is going on. This ad isn’t so much about the hamminess of commercial actors; it deliberately opts for hamminess as a way to keep us entertained—us who have grown immune to hammy actors in commercials.

Here’s the thing, though: these bacon lovers are just as risible and inauthentic as the punker with a burger in his hand nosing that chain-link fence. The way they archly express their love for bacon so wildly misses the mark of how performative, post-YouTube, rise-of-foodie-culture folks archly express their love for bacon.

I can’t pinpoint yet where the difference lies, and Subway is one example I’m picking on. This is, for certain products aimed at a demographic I’m slowly growing out of (or—horrors!—is this sensibility growing up alongside me?), the norm for selling ads now: a winking irony that claims to vault above advertising’s base cloyingness, but in its failure to spring from anything real ends up reading just as inauthentically as every ad in TV history.

All right, work to do. And don’t worry, I, too, hope I don’t turn into Jon Rosenblatt, 27, a Harvard University English graduate student specializing in modern and postmodern critical theory.

3 comments  ::  Discuss  ::  2014-10-07  ::  dave

Thursday 25 September
What Montaigne Taught Us

Filed under NF

Taught Montaigne for the first time last night, and one thing we talked about was Montaigne’s comfortable doubt, which I got at somewhat in my last post. One student, interestingly, compared him with Tom Wolfe. We read The Right Stuff last year, and she wanted to compare his style to Montaigne’s. No doubt there, she pointed out, and it’s true: Wolfe is so certain about his subjects, so emphatic. It’s like night and day.

We thought about other stylists (particularly Gay Talese in the Sinatra profile and Zadie Smith in her “Man vs. Corpse” essay), and there was a consensus that we accept certainty when people are writing about others or the world, but expect doubt when writing about the self. And that the opposite wouldn’t work. Isn’t it weird? You’d think that writing about the self—i.e., writing about the one thing you can claim to have absolute authority over compared to anyone else on the planet—would allow for certainty, but we seemed to think it was poor form.

It is poor form, but why? Was it that Talese has so much detail to deliver through observation, so much of the world he’s been inside to recreate on the page, that we treat his scene-building observation work as certainty and confidence? And that the world Montaigne has to build is that of his mind, which is a murkier and less stable setting? Some students posited that it was a matter of interiority: when you have to re-create the inner thoughts of other people (à la Wolfe) it’s poor form to be unsure and doubtful about what they think and feel, whereas when you delve into yourself you have access to your own unsureness.

I wondered whether it was a result of bifurcating yourself. If you’re only a narrator, you’re a stable entity in your book, and that stability can lead to a steady, certain voice. But as soon as you appear also as a character in a scene, you become two simultaneous people for the reader, often separated by time, and in this splitting of the self that stability is out the window—and your certainty had better follow it.

Pet theories to keep developing. Who’s a good uncertain reporter, and who’s a good certain memoirist? Is it possible?

 ::  Discuss  ::  2014-09-25  ::  dave

Thursday 18 September
Very Good Paragraphs – Teaching Edition

Filed under Very Good Paragraphs

I’m teaching some Montaigne essays next week. Reread this passage today, from his “On the education of children”:

Be that as it may; I mean that whatever these futilities of mine may be, I have no intention of hiding them, any more than I would a bald and grizzled portrait of myself just because the artist has painted not a perfect face but my own. Anyway these are my humours, my opinions: I give them as things which I believe, not as things to be believed. My aim is to reveal my own self, which may well be different tomorrow if I am initiated into some new business which changes me. I have not, nor do I desire, enough authority to be believed. I feel too badly taught to teach others.

From Screech’s superior translation. The maddening fact of becoming a creative writing teacher after getting a creative writing degree is that too few of us in graduate CW programs are taught how to teach—creative writing specifically, or even just students in general. You have to do a lot of extracurricular work amid your harrowing first job, lest you end up treading the same water you saw your otherwise occupied professors tread.

Also this: I never learned in grad school how to do the work of writing things and more importantly how to enjoy it. How to enjoy the perseverance needed. So again: how do you learn to teach what you yourself weren’t taught? Montaigne: I’m an honest, open model, not a teacher.

 ::  Discuss  ::  2014-09-18  ::  dave

Thursday 18 September
Paradise vs. Parking Lots

Filed under music

You’ve heard the chorus to Joni Mitchell’s “Big Yellow Taxi” before:

Don't it always seem to go
that you don't know what you got till it's gone.
They paved paradise
and put up a parking lot.

It’s always seemed to me a childish, naive complaint. Not because of progress in the free-market enterprise sense, but because of art and culture in the progressivist, futurist sense. I always heard Mitchell’s song in the shadow of a better one, Talking Heads’ “(Nothing But) Flowers”:

(Nothing But) Flowers from Pentagram on Vimeo.

(I was going to link to a performance of this song at the 2010 TED conference with David Byrne, Thomas Dolby, and a string quartet, but at the first chorus the TEDers start clapping along and it’s… well, you can imagine. It’s excruciating.)

The lyrics are clear if you watch the video, but if you’re not an online video-watcher (me neither!) here’s a sampling:

There was a shopping mall
Now it's all covered with flowers
You've got it, you've got it

If this is paradise
I wish I had a lawnmower
You've got it, you've got it

...

This used to be real estate
Now it's only fields and trees
Where, where is the town
Now, it's nothing but flowers
The highways and cars
Were sacrificed for agriculture
I thought that we'd start over
But I guess I was wrong

...

We used to microwave
Now we just eat nuts and berries
You got it, you got it

This was a discount store
Now it's turned into a cornfield
You got it, you got it

Don't leave me stranded here
I can't get used to this lifestyle

Byrne’s lyrics are far more evocative, brave, and tragic than Mitchell’s, not only in their specificity (and there’s also just more of them) but in their treatment of the world. Just how, he seems to ask her, are we to live in paradise? What’s a person supposed to even do there?

It’s also why I could never fully enjoy DeLillo’s White Noise (or Cosmopolis, which is guilty of similar missteps): he treats the mess of mass culture with this bewildered Boomer-esque gawking. Every other page he’s all “Can you believe this, folks?” I can believe it, Don. This world Mitchell laments and you shake your head at is the only world I’ve known.

DeLillo is 77. Mitchell is 70. David Byrne’s only 62. That’s not much younger. But for whatever reason he was able to treat the pop of our culture the way Wordsworth treated flowers and clouds—as a place where we might find salvation. That’s why he’s been my go-to guy since I wrote my college entrance essay on him.

 ::  Discuss  ::  2014-09-18  ::  dave

Tuesday 9 September
Who Pays for the MFA?

Filed under Endorsements

I can think of a variety of answers.

I. Your Students
Or their parents. If you are the kind of MFA student who has to teach writing or other courses to earn your tuition remission, then it’s the tuition money from your students that, for the most part, pays for you to be there. What does it mean when your students pay for you to do whatever you’re doing for 2, 3, or 4 years? What sort of duty or obligation do you have to those people?

II. Foundations
There’s lots of private money in the MFA game. Sometimes this is clear. At Alabama there was all kinds of Truman Capote money floating around. Also the McNair Foundation. When foundations pay for your MFA it’s like you’ve got a patron. What is it like to have, in 2014’s job market, a patron?

III. You
There are plenty of people who don’t get private money to fund their degree, or who don’t teach courses paid for by their students, and who thus have to either find the money to afford their education, or take out loans and make plans on how to pay it all back. These people pay their own way. What does it mean in a democracy for someone to pay his or her own way?

Despite what they say about lunch, there are lots of things to get for free, and more and more these days I’m thinking that an MFA degree might be one of the worst of them. The strings attached, you see, are like spider silk: you don’t see anything until you walk face-first into them, and then they’re stuck on you.

At stake here, or maybe just in question, are the writer’s obligations. If they should be to anything but the writing itself, to what, then? To whom? If the MFA is understood to be the start of a career, what does it mean to start that career owing it all to yourself?

2 comments  ::  Discuss  ::  2014-09-09  ::  dave

Monday 8 September
Should vs Ought

Filed under Grammar/Usage Nerdery

It’s a distinction I have a hard time making. While a quick scan through recent writing projects shows I opt for should over ought, I feel I do the opposite when talking. I feel I ought to say should, but I opt mostly for ought. So I took it, as I ought’ve months ago, to the dictionary to see for sure:

Reserve ought for expressing obligation, duty, or necessity, and use should for expressing suitability or appropriateness.

So I’ve got a new rule of thumb: “should is appropriate,” meaning that when given the choice should not only expresses appropriateness but is pretty much always the more appropriate word than ought.

It’s not that easy, though, in that doing one’s duty is doing what’s appropriate and often vice versa, so I’m afraid I’m going to continue to opt for ought because it sounds smarter and more literate. This is a boring post, I know, but here’s what I’m really trying to get at: I’ll always go for the higher-diction option in this situation. I feel bad when I say who when whom is appropriate, even when usage guides whose authority I trust tell me that whom is pretty much gone from any non-formal use.

Hypercorrecting up might be the clearest marker of pretentiousness. It’s, like, its definition maybe. I’m sure I hypercorrect on ought, and I know I hypercorrect on further v. farther, but I think I’m good at not hypercorrecting to “[X] and I” models in non-subjective cases (e.g. My mailman never gave my dog and I much love). I grew up among friends where your grammar/usage errors became weapons for others to rhetorically destroy you with[1], and so over time it became important for me to be right, and that importance still lingers well into adulthood. It’s a problem I need to work on harder.

And yet, I don’t know: my voice is mine. Do I sound pretentious in conversation? Probably. Did I get Amazon Reader Reviews on my book that called out its arrogant tone? Yes. Do I have a choice on how I sound? Sure, but I’ve spent so long worrying that I wasn’t coming across the way I needed to in order for others to see me as normal/interesting and like me as a result, and all that worry still hasn’t made me normal or interesting. Not in the way I’d hoped. And I’m getting tired of worrying. It’s really important for me to be right. But it’s also fun to be wrong, I’m slowly learning.

Footnotes    (↵ returns to text)
  1. Despite what you may have been told or picked up, it’s 100% a-okay to end sentences in English with prepositions. It’s not grammatically possible to do this in Latin, however, and this non-rule comes from 100-year-old attempts by misguided philologists to make English operate more like Latin, which they thought, for whatever reason, to be ideal and perfected. Ditto with the split infinitive.

 ::  Discuss  ::  2014-09-08  ::  dave

Friday 29 August
Endings of Things

Filed under Endorsements

I had two models in mind when I had to start writing the ending to the taxidermy book. One was Eggers’s Frisbee-throwing soaring prose at the end of A Heartbreaking Work, and the other more pressing influence (my book’s practically dripping with it) was Rick Moody’s The Black Veil, with its incantations of black and blacknesses. I guess I wanted a huge buildup of feeling, and then a kind of slap in the face. I’m proud of it, the ending, but I don’t think I want to write those kinds of endings anymore.

Lately I’ve been thinking about this stuff in terms of songs on records. There are songs that are Grand Endings. Radiohead’s “Motion Picture Soundtrack” for instance. Most of Bill Callahan’s final tracks. I interviewed him once, Bill Callahan, and he admitted to being proud of his sequencing on records, and though I don’t think I’ve ever said a bad thing about the guy I’ll say his sequencing is a bit too spot on. It’s so stuffily perfected, like a short story in a literary journal that comes from someone’s MFA thesis.

What’s weird is that Callahan’s got worse at this over the years (cf. Julius Caesar’s “Stick in the Mud” to Apocalypse’s “One Fine Morning”) while Radiohead seems to’ve gotten better. This whole idea came to me while listening to Hail to the Thief’s “A Wolf at the Door,” which is a shockingly good final track. It ends the record like a leaf blowing quickly off screen, as opposed to a slow pan skyward or a slow fade to black.

Another model ending: the final shot of Grey Gardens, Little Edie’s face spinning out of the frame. I’m hoping to write more endings like this, ones that sneak up on you and leave you bereft of something. Endings that, if they were a poem, wouldn’t even signal to listeners at your reading that it’s time to sigh audibly.

The best final track ever sequenced is the Pixies’ “Brick is Red” off Surfer Rosa.

In related news, I’m a big fan of the Irish goodbye.

 ::  Discuss  ::  2014-08-29  ::  dave

Sunday 3 August
Hack Tweets

Filed under Comedy

Among other things going on in our apartment, we’re watching CMT’s broadcast of the Starsky & Hutch remake. I thought about tweeting this:

It seems the Ben Stiller character just messed things up irreparably. (Watching every Ben Stiller movie at once.)

It was unsatisfactory, because this is just a sarcastic and convoluted way of saying “Man, Hollywood movies are so formulaic.” So then I came up with this one:

This year for Halloween I’m going as that moment when the Ben Stiller character messes everything up irreparably.

More dissatisfaction. I mean, it performs a kind of cleverness in how it turns something abstract (hackneyed film trope) into something concrete (costume), and there’s a kind of surrealness to the tweet that (at least initially) feels nice. But that surrealness is just trumped-up artifice, and that’s why I think it’s a lousy joke tweet.

Then, when trying a third time, I realized why I wasn’t going to succeed: this is a tweet about watching TV and feeling like I’m smarter than the TV I’m, by choice, here watching.

This morning I read much of Mike Sacks’s And Here’s the Kicker, a collection of interviews with comedy writers. There were a number of refrains among these men (and two women) when it came time to give budding comedy writers advice, but the one that stands out now is how many people urged writers to get out in the world and write about what they find. That too often writers write jokes about the kinds of jokes they’ve seen before and know are funny.

It’s how I tend to tweet.

Twitter is neat, but too often it becomes a tool to socially enhance our (mostly) solitary TV watching. This is not the same as being social.

 ::  Discuss  ::  2014-08-03  ::  dave

Monday 21 July
Modelo: A Beer for Hyper-Insecure Boys

Filed under TV

I mean where do I start with this one?

I don’t need to point out what’s so odious about this one, but can you imagine how insufferable the kind of guy would be who calculates his every move from the time he enters a new bar?

What’s interesting here is that Modelo, a relatively shitty, low-rent beer, has invested some money in an advertising company that employs very smart people to help make it the new Tecate, which seems to’ve become the west-coast PBR owing to being inexpensive and never advertising. So Modelo might not be so smart, but these people they’ve hired? Very smart people. All commercials operate off our fears and anxieties, and nothing scares a twentysomething hipster more than not being cool. Or, more exactly, not being seen by others as cool.

I was always a nerd. I don’t remember how or when I learned that being an adult meant no longer needing to care what other people thought about me, but this is what I had faith in growing up. I understand worrying about whether you smell, or are pretty. But worrying about whether strangers in a bar you’ve never been inside think you have good taste? It’s maybe the definition of the hipster.

2 comments  ::  Discuss  ::  2014-07-21  ::  dave
