This month I planned to write a column in response to the Shonda Rhimes/Alessandra Stanley kerfuffle, which revolved around Stanley’s much-maligned (and deservedly so) backhanded categorization in The New York Times of one of Hollywood’s most successful TV producers as “an angry black woman.” This resurrection of the trope of the angry [insert minority identity marker here] in popular discourse got me thinking about the figure of the angry gay white man that was so much a part of the New York social landscape during my early years in the city—years that overlapped with the worst of the AIDS crisis of the late ’80s and early ’90s. Larry Kramer, David Wojnarowicz, Harry Kondoleon and so many others made enraged and engaged performances about the horrors engulfing the queer community then—horrors willfully ignored by the larger culture.
But ever since an HIV diagnosis stopped being an automatic death sentence in the mid-’90s, and ever since the Will & Grace paradigm of the squeaky-clean homo became ascendant, the image of grungy, leather-jacketed ACT UP queens has been expunged from our collective memory banks. At the very worst, the fictional gay man of today’s popular culture might throw a mild hissy fit over a mis-accessoried ensemble. Rage is just not a part of the meme anymore.
So I was going to write about this process of “eragure” until I happened upon Adam Epstein’s post here on the Clyde Fitch Report—Hello Broadway, I’m 40 and I Must Be Going—and thought to myself “Why blog about it when you can be it?” Because the initial effect on me of Epstein’s post was to get my white-trash-fag’s Irish up, if I may mix my slurs. As someone toiling in the trenches of the New York theatre for nearly 25 years, I felt there were certain assumptions made in Epstein’s post that needed challenging in a spirit of collegial discussion. So after much deep breathing and a few rounds of kundalini yoga, I now calmly offer the following.
First, let me be very clear that anyone grappling with failure, real or perceived, has my fullest sympathy. I wish Adam all the best as he figures out his next move. Second, to the best of my knowledge, he and I have never met, so there is no personal animus here, nor am I privy to the details of the New York experiences that caused him to decamp to L.A. and then London. Finally, the show for which he won his Tony Award, Hairspray: The Musical, happens to be one of my favorite commercial musicals so far this century. (Coincidentally, I directed and choreographed a production in 2011 for Lehman College in the Bronx. Thanks to the wonders of the YouTube, I dug up the enclosed video showing the adorable Jean Paul Morales singing “It Takes Two.”) As far as Broadway properties go, I think he’s got excellent taste.
But it would appear that for Epstein, Hairspray may be the limit of his taste, and that’s unfortunate but not unusual. Throughout his post, ideas of “theater,” “New York” and “Broadway” seem conflated in a reductive ontological exercise—all theater that exists is in New York; all New York theater that exists is on Broadway. Maintaining this fiction in popular discourse serves powerful monied interests by a) ensuring that the little coverage mainstream media gives to theater is primarily focused on the Great White Way, which in turn influences leisure-time purchasing decisions and drives ticket sales, and b) serving as an inflationary pressure on ticket prices to Broadway shows. After all, if the Broadway brand is the “real” product, it must be worth significantly more than anything “not Broadway.”
The reality is that theater in America, including theater in New York, is vast and diverse. Innovative companies flourish in every major city in the nation — Pig Iron Theatre in Philadelphia, The Neo-Futurists in Chicago, to name but two. Even by the limited standards of theater considered commercially viable, regional companies may claim two recent recipients of the Tony for Best Play: August: Osage County (Chicago’s Steppenwolf) and All the Way (Oregon Shakespeare Festival). And while there is a necessary conversation to be had in the institutional theater about a not-for-profit model in which regional or downtown houses are valued only insofar as they either develop projects that Broadway can monetize or serve as a source of ongoing royalty revenue for past Broadway shows, the fact remains that the true locus of creativity in the American theater is neither the Shuberts nor Disney. It’s in the nonprofit sector, particularly the sector of that sector that continually risks its own existence on new material, on non-famous artists and on challenging subject matter.
Beyond material concerns, if the definition of “success” is predicated solely on having hit Broadway shows, then, yeah, the vast majority of us are failures walking around with “thwarted ambitions.” This kind of thinking is damaging, however, because it perpetuates a success narrative that devalues and delegitimizes vital work in all sectors of theatrical production. People are attracted to, and dedicate their lives to, the theatre for all sorts of reasons, not just fame and money — fostering community; serving educational needs; having a forum in which to discuss political and social issues — and they influence the kind of theater that many artists choose to make. In these milieus, benchmarks for success are not tied to box-office receipts. In my own case, I know that my college-level production of Hairspray had a profound effect on the young people who took part in it because they told me it did, and I’m very proud of the work the cast and designers did on a show with very limited resources. While this Hairspray had no impact beyond the Bronx, certainly wasn’t going to win any awards, and didn’t pay enough for me to quit my day job, I’ll always treasure it as one of the most successful endeavors of my life.
I also feel there is an unspoken ageism at work in Epstein’s post. It’s hardly insightful to point out that our culture is obsessed with youth, and it’s really obsessed with young people who earn obscene amounts of money and become ridiculously famous by the age of 30. The assumption is that if you haven’t “made it” by 40, then you never will. Game over. Thanks for playing. Sorry, you lost.
In a recent essay in The Atlantic, Ezekiel Emanuel, the head of the Clinical Bioethics Department at the National Institutes of Health and a professor at the University of Pennsylvania, used recent research into aging and the brain to argue that human creativity peaks at 40; it’s basically a long, slow slide into senility after that. Emanuel acknowledges that there are late-blooming exceptions to this process, but he maintains that the data clearly show the pattern that the majority of people will experience. Maybe I’m one of his outliers (or maybe I’m delusional), but I feel that at 45, I’m just getting started. Ibsen didn’t have his great creative breakthrough with A Doll House until he was 51; Matisse began making his cutouts (the subject of a current MoMA exhibition) when he was 72; Verdi’s two great Shakespearean masterpieces, Otello and Falstaff, premiered when he was 73 and 79, respectively. When I struggle with my own sense of failure, I keep these greats in mind. I also remind myself that many of the AIDS generation of angry white men didn’t even make it to 45. Or 40. I owe it to their memory not to give up. More on that in the next column.