These days I’m trying to think more about how I might better use my imagination, or create something new in the world, or both, not just when I sit down to write, but when I go through my day-to-day life. I’m also listening almost exclusively to The Mountain Goats’ The Sunset Tree, for what it’s worth. I’m reminded of something my friend Peter asked a visiting writer back when we were both teaching in Alabama:
“As an artist, do you feel impelled to act artfully in everything you do?”
Peter mentioned he himself did, and wanted to see how another writer (I’m feeling pretty sure this was a fellow poet) thought about it. I passed it off at the time as poet/artist silliness, but now I see it as a pressing question.
Most of the work I do every day, other than read the books I picked to teach, is to sit with a pencil over double-spaced manuscript pages of 12pt Times New Roman and figure out what to write on them that both expresses my reactions as I read and encapsulates my response to what the piece has said or done as a whole. Also: I have to do something with my pencil to help the student learn how to develop either the piece in front of me (in revision) or just personally as a writer (when it comes time to write the next thing). I’m not complaining about my job. It’s very hard work.
It’s not creative work. In fact at times it seems like destructive work: Here’s a map of all my confusions about what you’ve done, also lots of corrections of things you neglected to proofread. I do what I can to be encouraging and to point out successes, but even that seems destructive. It emphasizes that the point of having made this thing is to see how teacher responds.
It can easily become work I dread. In fact, right now I’ve got 20 projects to start reading and marking up, and rather than get started (but also to give myself a sense of purpose when I do start) I’m here writing a blog post about it. Anything to get out of doing the actual work. What I dread isn’t my students’ writing—when I get into it their stuff is continually surprising and great and worth sharing—it’s the role I have to step into. It’s not a foreign role. I’ve been doing this for ten years now. But it’s not my role, or it’s not a role I’m eager to understand myself as.
Yesterday on Twitter I asked, “What if marking up student manuscripts were more like a collaboration between two interested writers?” and it got precisely one favorite. (Thanks, Chris!) It’s one way to bring a sense of the creative process to this job of mine. The danger, of course, is for Teacher to lord his creative self all over the student’s work, but this happens all the fucking time in creative writing workshops, doesn’t it?
Here, finally, are some notes on what this theory might look like in practice:
- Understand the student not as a subordinate but as a fellow writer asking for my help and advice.
- Continue using a pencil, because I will mess up and be wrong in some of my responses.
- Stop correcting grammar, usage, and typos, pointing an error out only when it leads to genuine confusion about what’s being said. I’m not an editor, I know so many great writers who fuck up in this regard, and there’s no correlation between proper grammar and moving art.
- Suggest in notes at the end that the student look up some business of grammar or usage s/he consistently gets wrong, and name a good and useful resource for him or her to do so.
- Point out what I’m jealous of and wish I’d come up with.
- With flat/boring parts, which we all end up with in our drafts, be honest and clear about why I’m bored and what I’d try as a writer (as opposed to what I need as a reader or what I expect as a teacher; that is: suggest tacks always out of a spirit of creativity and invention).
- Be clear on syllabi about my philosophy or rationale when it comes to responding to manuscripts, and invite students to stop by office hours to collaborate in person and address their concerns, or if that’s not convenient do it over email or via Skype—i.e. be available.
There’s more thinking to be done about this. But I’m done thinking that the chief way students learn to become writers is through the feedback they receive on their manuscripts. I did this last term and I’ll do it every term coming up: assign a revision plan to be turned in with each draft, and you’ll be amazed at how much of “your work” students will have done before you even get a chance to.
It was a wild coincidence that The Rumpus published an essay of mine the very day I derided what’s become its rallying cry to writers, but that’s how Monday unfurled.
It’s called “For Lack of Anything Better”, and it’s about yarn art, essay-writing, my mom’s dad, and loss.
A good amount of the behind-the-scenes stuff about writing the essay is in the essay itself—mostly about why I never wanted to write it (because personal essays about this stuff are often such clichés) and then how I tried to and probably failed. I think it’s a good essay, and I’m proud of it, but I don’t know that it’s a good personal essay because of feelings and lyricism.
I read this essay at the Bread Loaf Writers’ Conference this year, and though I had practiced it a half-dozen times in my office, alone, making sure I didn’t go over my 15-minute allotment, being completely fine with the piece as it sounded in my ears, when it came time for me to stand up in front of 100+ people and read it aloud to them, I almost couldn’t do it at the end. I got choked up. I don’t get choked up. Maybe two people have seen me cry. I wasn’t prepared for it at all.
I guess what I want to say here is that this is the first essay I’ve written and released into the world that said something critical about my family I hadn’t already said to them face-to-face. I’m mad, in parts of this essay. When The Rumpus said it wanted to publish the piece I knew I had to talk to my parents about it before what was between us suddenly became everyone else’s.
I sent my folks the essay in an email and heard back from my dad within a couple hours. It was a thoughtful take on the essay, and he told me he was proud of me, and that my mom was crying. But, she said, she always cries when she reads something of mine because it makes her miss me.
I miss my parents, too. When I sent them the email, I made sure to include a line that said something like “I just want you to know that this is a way that I feel about the situation. It’s not the only way, just a way,” because that was the truth. I feel my parents are proud of me, and I feel that they’re ashamed of me. I’m proud of myself, and I’m ashamed of myself. When I got choked up reading this aloud, I was feeling both of those things at once: I was so proud to be getting laughs and attention by that room full of incredible, dedicated writers, and I was so ashamed of what kind of grandson I’d been.
This essay taught me how to write creatively—which is to say fruitfully—about my life, and that’s paratactically, which is another post for a later time.
(You can read it here. Thanks to Mary-Kim Arnold for taking the piece and Paige Russell for the great illustrations.)
That’s the new president of my university: Fr. Paul Fitzgerald. He spoke yesterday with some of the Arts & Sciences faculty. Someone asked him about his teaching philosophy (Fr. Fitzgerald demanded he be given an appointment in the Theology and Religious Studies faculty). This isn’t a direct quote, but here’s what he said:
Number one, you have to love the student. Because if you give love, the student can learn and grow. Otherwise we just end up trying to impress or dazzle them with what we know.
I’m trying to learn these days about love, and this was good advice. It immediately brought to mind some work of contemporary essayists, and also a lot of millennial aphorisms like BE AMAZING and Dear Sugar’s now proverbial WRITE LIKE A MOTHERFUCKER.
Let’s mince words: a mother fucker is a disgusting person. Why would anybody want to write like one?
No mother fucker fucks out of love.
Before irony and winking became the chief way advertisers tricked you into wanting to buy their products, they relied—ages and ages ago—on rhetoric and persuasion. Like in this Subway ad:
Then came cable TV and MTV and commercials had to compete entertainment-wise with programs and so ads got performative and hammy. Like in this Subway ad:
I think the hammy quotient—and what I mean by this is the exaggerated or overly theatrical quality of the acting—is legible to us post-millennial viewers. There’s something immediately artificial about them. Like, no sandwich artist has ever presented or will ever present a sub to us this way:
And this guy is such a risible failure of a punker:
In this way, TV commercials have always been things we laugh at, which was lovely because (as I’m going to keep developing and thinking about in some blog posts to come) what you can laugh at has less power over you.
Lately, though (and yes, this isn’t a new phenomenon, just a newer one), TV commercials have become things we’re meant to laugh with. They’re aiming to be funny the way vids people forward from Funny Or Die are. Like this Subway ad from 2014 (which runs twice in the clip; no need to stay for the whole 30 seconds):
The elementary way of talking about what’s happening here is that it’s self-aware hamminess. Meta-ham. But I think something more complex is going on. This ad isn’t so much about the hamminess of commercial actors as it is a deliberate opting for hamminess as a way to keep entertaining viewers who have grown immune to hammy actors in commercials.
Here’s the thing, though: these bacon lovers are just as risible and inauthentic as the punker with a burger in his hand nosing that chain-link fence. The way they archly express their love for bacon widely misses the mark of how actual post-YouTube, foodie-culture folks archly express their love for bacon.
I can’t pinpoint yet where the difference lies, and Subway is just one example I’m picking on. This is, for certain products aimed at a demographic I’m slowly growing out of (or—horrors!—is this sensibility growing up alongside me?), the norm for TV ads now: a winking irony that claims to vault above advertising’s base cloyingness but, in its failure to spring from anything real, ends up reading just as inauthentically as every other ad in TV history.
All right, work to do. And don’t worry: I, too, hope I don’t turn into Jon Rosenblatt, 27, a Harvard University English graduate student specializing in modern and postmodern critical theory.
Taught Montaigne for the first time last night, and one thing we talked about was Montaigne’s comfortable doubt, which I got at somewhat in my last post. One student, interestingly, compared him with Tom Wolfe. We read The Right Stuff last year, and she wanted to compare his style to Montaigne’s. No doubt there, she pointed out, and it’s true. Wolfe is so certain about his subjects, so emphatic. It’s like night and day.
We thought about other stylists (particularly Gay Talese in the Sinatra profile and Zadie Smith in her “Man vs. Corpse” essay), and there was a consensus that we accept certainty when people are writing about others or the world, but expect doubt when writing about the self. And that the opposite wouldn’t work. Isn’t it weird? You’d think that writing about the self—i.e., writing about the one thing you can claim to have absolute authority over compared to anyone else on the planet—would allow for certainty, but we seemed to think it was poor form.
It is poor form, but why? Was it that Talese has so much detail to deliver through observation, so much of the world he’s been inside to recreate on the page, that we read his scene-building work as certainty and confidence? Whereas the world Montaigne has to build is that of his mind, a murkier and less stable setting? Some students posited that it was a matter of interiority: when you have to re-create the inner thoughts of other people (à la Wolfe) it’s poor form to be unsure and doubtful about what they think and feel, whereas when you delve into yourself you have access to your own unsureness.
I wondered whether it was a result of bifurcating yourself. If you’re only a narrator, you’re a stable entity in your book, and that stability can lead to a steady, certain voice. But as soon as you appear also as a character in a scene, you become two simultaneous people for the reader, often separated by time, and in this splitting of the self that stability goes out the window, and your certainty had better go with it.
Pet theories to keep developing. Who’s a good uncertain reporter, and who’s a good certain memoirist? Is it possible?
Very Good Paragraphs
I’m teaching some Montaigne essays next week. Reread this passage today, from his “On the education of children”:
Be that as it may; I mean that whatever these futilities of mine may be, I have no intention of hiding them, any more than I would a bald and grizzled portrait of myself just because the artist has painted not a perfect face but my own. Anyway these are my humours, my opinions: I give them as things which I believe, not as things to be believed. My aim is to reveal my own self, which may well be different tomorrow if I am initiated into some new business which changes me. I have not, nor do I desire, enough authority to be believed. I feel too badly taught to teach others.
From Screech’s superior translation. The maddening fact of becoming a creative writing teacher after getting a creative writing degree is that too few of us in graduate CW programs are taught how to teach—creative writing specifically, or even students generally. You have to do a lot of extracurricular work amid your harrowing first job, lest you end up treading the same water you saw your otherwise occupied professors tread.
Also this: I never learned in grad school how to do the work of writing things and more importantly how to enjoy it. How to enjoy the perseverance needed. So again: how do you learn to teach what you yourself weren’t taught? Montaigne: I’m an honest, open model, not a teacher.
You’ve heard the chorus to Joni Mitchell’s “Big Yellow Taxi” before:
Don't it always seem to go
that you don't know what you got till it's gone.
They paved paradise
and put up a parking lot.
It’s always seemed to me a childish, naive complaint. Not because of progress in the free-market enterprise sense, but because of art and culture in the progressivist, futurist sense. I always heard Mitchell’s song in the shadow of a better one, Talking Heads’ “(Nothing But) Flowers”:
[Video: “(Nothing But) Flowers,” from Pentagram on Vimeo]
(I was going to link to a performance of this song at the 2010 TED conference with David Byrne, Thomas Dolby, and a string quartet, but at the first chorus the TEDers start clapping along and it’s… well, you can imagine. It’s excruciating.)
The lyrics are clear if you watch the video, but if you’re not an online video-watcher (me neither!) here’s a sampling:
There was a shopping mall
Now it's all covered with flowers
You've got it, you've got it
If this is paradise
I wish I had a lawnmower
You've got it, you've got it
This used to be real estate
Now it's only fields and trees
Where, where is the town
Now, it's nothing but flowers
The highways and cars
Were sacrificed for agriculture
I thought that we'd start over
But I guess I was wrong
We used to microwave
Now we just eat nuts and berries
You got it, you got it
This was a discount store
Now it's turned into a cornfield
You got it, you got it
Don't leave me stranded here
I can't get used to this lifestyle
Byrne’s lyrics are far more evocative, brave, and tragic than Mitchell’s, not only in their specificity (and there’s also just more of them) but in their treatment of the world. Just how, he seems to ask her, are we to live in paradise? What’s a person supposed to even do there?
It’s also why I could never fully enjoy DeLillo’s White Noise (or Cosmopolis, which is guilty of similar missteps): he treats the mess of mass culture with this bewildered Boomer-esque gawking. Every other page he’s all “Can you believe this, folks?” I can believe it, Don. This world Mitchell laments and you shake your head at is the only world I’ve known.
DeLillo is 77. Mitchell is 70. David Byrne’s only 62, which is not much younger. But for whatever reason he was able to treat the pop of our culture the way Wordsworth treated flowers and clouds: as a place where we might find salvation. That’s why he’s been my go-to guy since I wrote my college entrance essay on him.
I can think of a variety of answers.
I. Your Students
Or their parents. If you are the kind of MFA student who has to teach writing or other courses to earn your tuition remission, then it’s the tuition money from your students that, for the most part, pays for you to be there. What does it mean when your students pay for you to do whatever you’re doing for 2, 3, or 4 years? What sort of duty or obligation do you have to those people?
There’s lots of private money in the MFA game. Sometimes this is clear. At Alabama there was all kinds of Truman Capote money floating around. Also the McNair Foundation. When foundations pay for your MFA it’s like you’ve got a patron. What is it like to have, in 2014’s job market, a patron?
There are plenty of people who don’t get private money to fund their degree, or who don’t teach courses paid for by their students, and who thus have to either find the money to afford their education, or take out loans and make plans on how to pay it all back. These people pay their own way. What does it mean in a democracy for someone to pay his or her own way?
Despite what they say about lunch, there are lots of things to get for free, and more and more these days I’m thinking that an MFA degree might be one of the worst of them. The strings attached, you see, are like spider silk: you don’t even see anything until you walk face-first into them, and then they’re stuck on you.
At stake here, or maybe just in question, are the writer’s obligations. If they should be to anything but the writing itself, then to what? To whom? If the MFA is understood to be the start of a career, what does it mean to start that career owing it all to yourself?
It’s a distinction I have a hard time making. While a quick scan through recent writing projects shows I opt for should over ought, I feel I do the opposite when talking. I feel I ought to say should, but I opt mostly for ought. So I took it, as I ought’ve months ago, to the dictionary to see for sure:
Reserve ought for expressing obligation, duty, or necessity, and use should for expressing suitability or appropriateness.
So I’ve got a new rule of thumb: “should is appropriate,” meaning that when given the choice should not only expresses appropriateness but is pretty much always the more appropriate word than ought.
It’s not that easy, though, in that doing one’s duty is doing what’s appropriate and often vice versa, so I’m afraid I’m going to continue to opt for ought because it sounds smarter and more literate. This is a boring post, I know, but here’s what I’m really trying to get at: I’ll always go for the higher-diction option in this situation. I feel bad when I say who when whom is appropriate, even when usage guides whose authority I trust tell me that whom is pretty much gone from any non-formal use.
Hypercorrecting up might be the clearest marker of pretentiousness. It’s, like, its definition maybe. I’m sure I hypercorrect on ought, and I know I hypercorrect on further v. farther, but I think I’m good about not hypercorrecting to “[X] and I” constructions in non-subject positions (e.g. My mailman never gave my dog and I much love). I grew up among friends for whom grammar and usage errors became weapons to rhetorically destroy one another with, and so over time it became important for me to be right, and that importance still lingers well into adulthood. It’s a problem I need to work harder on.
And yet, I don’t know: my voice is mine. Do I sound pretentious in conversation? Probably. Did I get Amazon Reader Reviews on my book that called out its arrogant tone? Yes. Do I have a choice on how I sound? Sure, but I’ve spent so long worrying that I wasn’t coming across the way I needed to in order for others to see me as normal/interesting and like me as a result, and all that worry still hasn’t made me normal or interesting. Not in the way I’d hoped. And I’m getting tired of worrying. It’s really important for me to be right. But it’s also fun to be wrong, I’m slowly learning.
I had two models in mind when I had to start writing the ending to the taxidermy book. One was Eggers’s Frisbee-throwing soaring prose at the end of A Heartbreaking Work, and the other more pressing influence (my book’s practically dripping with it) was Rick Moody’s The Black Veil, with its incantations of black and blacknesses. I guess I wanted a huge buildup of feeling, and then a kind of slap in the face. I’m proud of it, the ending, but I don’t think I want to write those kinds of endings anymore.
Lately I’ve been thinking about this stuff in terms of songs on records. There are songs that are Grand Endings. Radiohead’s “Motion Picture Soundtrack” for instance. Most of Bill Callahan’s final tracks. I interviewed him once, Bill Callahan, and he admitted to being proud of his sequencing on records, and though I don’t think I’ve ever said a bad thing about the guy I’ll say his sequencing is a bit too spot on. It’s so stuffily perfected, like a short story in a literary journal that comes from someone’s MFA thesis.
What’s weird is that Callahan’s gotten worse at this over the years (cf. Julius Caesar‘s “Stick in the Mud” with Apocalypse‘s “One Fine Morning”) while Radiohead seems to’ve gotten better. This whole idea came to me while listening to Hail to the Thief‘s “A Wolf at the Door,” which is a shockingly good final track. It ends the record like a leaf blowing quickly off screen, as opposed to a slow pan skyward or a slow fade to black.
Another model ending: the final shot of Grey Gardens, Little Edie’s face spinning out of the frame. I’m hoping to write more endings like this, ones that sneak up on you and leave you bereft of something. Endings that, if they were a poem, wouldn’t even signal to listeners at your reading that it’s time to sigh audibly.
The best final track ever sequenced is the Pixies’ “Brick is Red” off Surfer Rosa.
In related news, I’m a big fan of the Irish goodbye.