What is the beauty in the written word? I write this question on a summer afternoon, on a park bench. I had grown accustomed to being one of a few inhabitants at this park, and the only one reading/writing. But today I had difficulty finding a seat at all; around each corner, the next bench was occupied by an adolescent with a splayed book in lap. This solitary activity was excusable when I was doing it, but unbearably self-centered, solipsistic, when others were–particularly when they were in my desired seat.
All of which made me think: what makes reading/writing more desirable than oral conversation? I had in my defense the fact that in this time of quarantine I was in a new city with few friends, so I was reduced to communing with the world through the letter. But am I being unfair in even posing this question? Does it privilege oral conversation, an inter-face, over written dialogue?
Perhaps due to my current condition of isolation, I imagine being invited to a university as a guest speaker or scholar-in-residence. (Please bear with the narcissism.) The typical way of handling such engagements is not dissimilar to the history of preaching: a time of solitary reading/contemplation followed by the performance of that contemplation’s results before others. There is something to be said for the beauty of imagining, in silence, what others need to hear. But liturgists have traditionally played the pastoral role of listening to parishioners’ concerns, secrets, guilt. Hence, the performance is informed not only by the universal word of a Holy Book or of classical texts, but by whispering hearts. The task of the preacher is to make one’s performance personal enough to be meaningful, but without addressing such a specific audience that one delimits the accessibility of one’s performance, and without betraying others’ privacy. In homage to this pastoral tradition, I imagine as an avant-garde scholar-in-residence that I would refuse to read or write at all. What would happen if one were forced to dictate one’s thoughts (and we should be grateful for the input of the secretary/amanuensis/translator), and if one were forced to get one’s sources from people themselves rather than books?
I can imagine the fallibility in such an approach: the well-intentioned desire to communicate with others (and how many “scholars” have the requisite amicability even for this?) may be rightfully met with coldness and confusion on the part of one’s would-be interlocutors. Who is the scholar, to demand trust or time, even if the attempt at hand is to offer these very things? Such a project might warrant the sharp critique that it would become so personal as to lose any scholastic value: one might as well go around giving one person after another a back-rub. And why not? Maybe this would do more work than the scholarly presentation with which we are so comfortable, given that knowledge is an embodied practice. But could we bear this re-experimentation with the royal touch for our spiritual scrofula, or would we demonize this new class of celebrity masseuse, as we already demonize sex workers?
What makes us so sure that writing, particularly fictional writing, supersedes such simple pleasures as touch? Surely, the realms of writing and touch are already brought together in the desire for an autograph: among those whose bodies are marked by the celebrity’s signature, among those concert-goers who reach out to the stage, among those fans who have successfully made contact with their idols, and exclaim: “I’m not going to wash this hand!”
The question remains: what is the beauty of the written word?
Pertaining to the use of a “back-rub” as a disparaging moniker for overly “personal” work, I think also of several occasions upon which elder male social scientists referred to interdependence with the flippant phrase “you-scratch-my-back-I-scratch-yours.” As an example, picture yourself in a lecture hall, perhaps in an introduction to legal theory or to economics, where you hear: “A producer is rarely ever able to produce all the goods one requires. Hence exchange. Or, to put it differently: You-scratch-my-back-I-scratch-yours.”
Indeed, one finds that one’s back, one’s blindspot, is nearly always used as the point of reference for one’s dependence upon others; it is that locus of the body which is ultimately one’s own but which regularly exceeds one’s capacity for control and surveillance. Because I do not fully own it myself, I must ask: “who has got my back?”
While the familiar “you-scratch-mine-I-scratch-yours” is intended to demonstrate the quid pro quo nature of human affairs, while the phrase is oft-expressed with the tone of someone not unfamiliar with under-the-table deals and mafioso-type extortions, there remains something profoundly intimate within the phrase. The line may even be delivered in a coercive tone, depending on how sinister the context of this offer-not-to-be-refused, and still the mere mention of scratching one’s back upends the masculinist rhetoric, flusters any attempt to be intimidating; moreover, it satirizes the idea of independent individuals rationally engaging in a system of contractual exchange. The economist for whom behaviors are representable as discrete lists of options, as tables summarizing revenue and expenditure, even as those graphs which capture in neat lines every possible preference–such an economist betrays (this despite himself, right under his nose) an intimacy in the very phrase he has chosen. This intimacy becomes all the more apparent after a few tweaks, whereby I-scratch-yours-you-scratch-mine becomes: “please, will you scratch my back?”
Perhaps I am to read this event of back-scratching as a hygienic symbiosis–that is to say, back-scratching does not represent the profound need of living beings to touch and be touched; rather, it represents the unfortunate tendency for dirt to stick in those crevices which evade our reach and inspection. Perhaps you-scratch-mine-I-scratch-yours is an evolutionary symbiosis of no further interest than the event whereby an Egyptian plover becomes the croc’s dental hygienist. “Thank you for your service; your check’s in the mail.”
But I would rather imagine a white male educator who derives a great deal of pleasure from developing his slides, his contemporary examples of abstract principles, his quirky extra-credit questions (What color tie was the professor wearing during the review session? [2 points].) An educator whose pride stems from witnessing what he perceives to be his students’ growth in potential–even if the change in students’ behavior more often reflects their better capacity to meet his demands and match his vocabulary, rather than the establishment of some abstract potential which can be applied to any environment or market. I would further like to imagine that this educator has a distanced involvement in other people’s lives. That he shed tears on relatively few occasions, including: his wedding, the births of his two children, and the death of his mother. (He felt strangely calm at his father’s funeral, a few years later, perhaps because he had already experienced the death of a parent, perhaps because he was never as deeply attached to Father.) That, on the way to his own office, he passes by a photocopy of Michelangelo’s Pietà outside a colleague’s door. That he inexplicably thinks of the phrase “you-scratch-mine” whenever he sees this image, thinks of the image whenever he uses the phrase in lecture. Perhaps both image and phrase appear as the two remaining guests before last call, when his mind eventually retires to the tides of sleep: you scratch mine.
What can one understand in imagining such a person’s desire, if not the slip ‘twixt cup and lip, whereby language fails to ever satisfy needs that exceed a sonic or graphic form? What is writing, and what have I done by writing so much about my imagined character?
Of course, if we are to credit the written dialogue as more than a quiet version of conversation, there appear to be reasons for reading/writing which are more than just a (social) neurotic’s substitute. In some ways, the traveling masseuse/therapist may reach more profoundly into the tissues of one’s being, but it is also true that this reliance on proximity and touch may make one’s impact more glancing.
The written form is a monologue. For avid readers, it’s a faster expression than a one-“man” play. (The reference to one “man” is intentional, because, for most of history, the monologue has been the dick’s monologue.) Writing, then, marks a period of uninterruption; it requires–in the words of Virginia Woolf–a room of one’s own.
Who cannot relate to those conversations–often late at night, perhaps fueled by drugs or alcohol–which circle forgetfully around the same few questions and offer unwittingly contradictory responses? Hegel compares the amateur attempt at philosophizing to an argument between children who compete under the ruse of reason, when in fact their motivating principle is “whatever I am saying now is correct,” when their goal is to be the most recent to have spoken. It would not do to say that people are merely childish, exploitative or self-centered; what is relevant here is the event whereby forgetfulness overtakes intention. The condition is like that of an amnesiac held captive in a room without a lock.
What I’m trying to say is that the experience of the written word differs from conversation in that it holds a more particular relation to memory.
Imagine the following scenario: You are a counselor, and you have a client who is kind, respectful, and relatively mature; however, she suffers from severe anxiety. Her interactions indicate that she is often a comfort to others, but rarely does she express confidence or self-admiration (rarely is she a comfort to herself, assuming that self-soothing is possible). As her counselor, you recommend that she conduct an experiment in automatic writing, that she not delete or edit as the words fall on the page.
It would be easy to say that this is a mechanism by which some internal reality, once the lid is loosened, makes itself manifest. This explanation has been widely used, even as questions remain as to how precisely one opens oneself up to the repressed. (After all, isn’t your patient being asked to do this “opening up” in her writing, without your having explained how it is to occur?) So the result is not some sudden explosion of dormant feeling. Rather, your patient either invents the pathological writing to please you–the true placebo effect, the I-will-please effect. Or, I could even grant that your patient does have these thoughts independently of your interference, but that she has them in such a way that her noetic self can forget, in such a way that she regards them as infrequent. As the therapist, you then either linger on the self-abusive passages (distending them in their written form so that they can be re-cognized) such that the client no longer regards these episodes of self-abuse as negligible; or, you remind the client (from your perspective as someone who encounters many clients) that her occasions of self-critique exceed the norm.
Through the diary task, you induce the client to write such that experiences become memorable; and, in so doing, you lower her defenses against them. The relevance of this specific case to writing in general is that the therapist/client dyad and the writer/editor dyad alike work to induce or reduce memory.
We might also say that there is an active form of memory which we gain during reading, and which culminates at the end of a sentence. Age and youth are useful categories not only for comparing people, but also for comparing variations in a single person’s subjectivities. Even the oldest, most wizened person enjoys a special kind of youth when approaching the first word in a sentence. My hypothesis is that speech (and I mean ex tempore speech, not the lecture as a form of reading) not only has this youth as a characteristic, but requires it. To use Gadamer’s terms, we might refer to this ontological character as the fore-knowing of projection, a projection which is continuously disrupted when reading. Such youth is generative of insight, but often risks repetition.
Is it that we are always both young and old, both looking at what came before and predicting what comes after? This hypothesis seems so ecumenical–it’s both, silly!–as to be useless. One might rather establish an asymmetry between the ontologies of past and projection. Possibly the past is the direction which one can never face: even when attempting a retrieval/check, one is still in the modality of prediction with respect to what one hopes or expects to find. As such, I think that youth and wisdom never occur simultaneously, but instead oscillate. It seems that our elder selves are constitutively distinct from our seeking, youthful selves; a moment occurs when we are no longer reaching but are now holding, now comparing expectation and held object, as we might compare two swatches of the same color.
Which requires effort: inquisitive youth or synthesizing age? Is it that both youthful and aged selves undergo a qualitative change with the right impetus, or is it that our tired selves are more inquisitive than synthetic? It is likely that our synthetic selves rarely appear, and require great effort. Think about how many stages of sublimation one must undergo to form a valuable question. Relating this active form of memory to the task of writing, we might note that writing permits re-vision, and–in this way–its relation to the synthetic self is clear.
But memory, alone, cannot explicate the event of writing. The second explanation must be environmental; it must account for the fact that there are more stimuli requiring fielding in speech than in writing. To field a desire is to acknowledge it in such a way that the unmet desire will not, out of ire, induce harm or sever our faculties; in other words, to field a desire is to direct it to its outlet. One can field only so many desires (hence the recurrent metaphor of “juggling” tasks), and desires are often competitive rather than collaborative. The writer/artist must reduce the number of desires that are felt in order to produce art.
To put it succinctly: The things one says to oneself are necessarily dissimilar to those one says in the presence of others. This is not because one is inauthentic. In fact, authenticity might mean saying what others need to hear. (E.g., the therapist does not have some inner, authentic disclosure which will remedy the client’s situation; rather, the therapist must determine advice relevant to the client’s circumstance.) Or, in the less charitable sense, one can say that we all are mutually engaged in the process of fielding each other’s desires so as to avoid harm and conflict. The artist, then, exists in a privileged space where the only desire to be fielded is that of the art’s production. A painter carefully meditates on several shades of red, deciding which pigment is best applied to the widow’s brooch. This meditation enacts a prolonged exposure to a single desire, the desire of color, for the purposes of determining how to field it. The assumption is that the artist’s desires will later be shared by audiences.
(Humanities scholars after the linguistic turn have focused on the various contextual influences whereby words and symbols lose their meaning, whereby the artist’s desire is no longer shared by contemporary audiences. The more worthwhile task might still be to investigate how such a desire can be shared in the first place, rather than find historical examples which demonstrate that those desires change over time.)
All this to make what to some might already be the obvious claim that written communication separates itself from oral conversation, insofar as writing/reading expands one’s presence in time by shrinking one’s relation to one’s environment.
So the writer, then, holds a particular allegiance to memory and to independence. But how often do writers accept this allegiance–to what extent do they reliably devote their time laboring at the anvil of an uninterrupted thought?
When asked the highly banal and puerile question of how to write today, Martin Amis responds: “In the old days, an evening alone with a bottle of wine and an eight hour read of me seemed like the perfect evening. But now I only do it–[because] the future is getting smaller and the past is getting bigger, and you have to head forward–when the book comes out.” I should remark that I only came across this video because I had just read Lolita (without fully being able to understand the impact that Nabokov’s mere prose had on me) and had read an article which compared Martin Amis favorably to Nabokov. Additionally relevant for the timing: I had begun my own foray into academic writing, and had become addicted to it. Spending hours revising the same few sentences, cutting my work into the appropriate page-length, all in the hopes of receiving positive affirmation from professors–I could neither explain how to achieve success in writing, nor why such success became an obsession for me. So, given this context, Amis’s words struck a chord: “I think the sentence will give a sort of pleasure without you being able to tell why. That’s what I hope.”
Like Amis–spiraling into himself until landing at the bottom of a wine bottle, to reach the final sentence in a drowsy stupor at the end of an evening–Nabokov, too, had a highly self-involved process for the making of Lolita (a work which he said compelled him as it “had grown in secret the claws and wings of a novel”): he would write and re-write the same set of index cards over and again. As he put it: “my pencils outlast their erasers.” Or as yet another example of this writerly method: George Saunders says that he often starts by putting an idea, a sentence, on the page, then trusts himself to be able to write his way out of it. For Saunders, the process of writing a story might be more analogous to editing a sentence into a paragraph, into a page, etc.
To put myself in their favorable company: I, too, am unable to submit a written work until I have read it from start to finish in a single sitting, without hitting any breaks or brakes–neither stammers nor questions.
It seems, then, that the writer is a concatenist, a connoisseur of moments, who must chain sequences into a fluid event, a controlled flow. More than the cinematographer or even the chef, the writer is an organizer of data (from every possible sense that can be named) into some product called narrative–a product comprising that thing sometimes called language or melody or any other nickname besides its more domineering moniker: Time.
Troubles inevitably arise; obstacles pop up even as the writer attempts to order time so as to make it un-obstructive for the reader. The catch: the writer is never sure whether the flow from sentence to sentence develops as an idiosyncratic habit, like those in Sunday School who are able to say their prayers while contemplating lunch. Hence, in an essay on the craft of writing, Zadie Smith gives the following as her most important advice: “[Leave your work alone for years before you revise it, because t]he secret to editing your work is simple: you need to become the reader instead of the writer…You need a certain head on your shoulders to edit a novel, and it’s not the head of a writer in the thick of it, nor the head of a professional editor who’s read it in twelve different versions. It’s the head of the smart stranger who picks it off the bookshelf and begins to read.”
The cases of Smith and Amis interested me first because they demonstrate the narcissistic quality of writing, where one must enjoy oneself in order to spend time alone with one’s thoughts, and where one–in the style of Pygmalion–risks falling in love with one’s fallible creation. Or as another famous writer, Dante, put it: “For in every action what is primarily intended by the doer, whether he acts from natural necessity or out of free will, is the disclosure of his own image. Hence it comes about that every doer, in so far as he does, takes delight in doing; since everything that is desires its own being, and since in action the being of the doer is somehow intensified, delight necessarily follows.” This kind of delight can certainly be found whenever one has had the opportunity to delve into one’s own thoughts; as Amis puts it: “an evening alone with a bottle of wine and an eight hour read of me seemed like the perfect evening.” Clearly, there is innate within us a capacity for self-absorption, even as I remain curious, like Dante, as to just how “in action the being of the doer is somehow [quodammodo] intensified.” But there is a limit to Dante’s hypothesis, for how boring would it be if the world conformed entirely to our image, if we were thereby left entirely alone?
The excerpt from Smith demonstrates this tension between Self and Other, demonstrates a temporality which is stretched between the natural propensity to start anew, and the artist’s desire to perfect. Because I take this tension seriously, I find it highly ignoble when acclaimed artists disparage their own work (as Smith does), either because it betrays a narcissistic ideal which exceeds feasibility (I often think to myself: “It’s just writing; I’m not curing cancer”), or because such remarks warrant the logical criticism: why didn’t you keep working on it, then? Or at least withdraw the product once you recognized its flaws?
Smith’s advice, to save one’s work until one can react to it as a stranger, ultimately represents a desire to become one’s readers in order to pre-empt any criticism from them. It is a desire to obtain every possible perspective such that there would remain no Other, no outside person whose vantage point offers leverage for criticism. The advice turns out, to some extent, to be useless, at least insofar as it gives in to the fear that there will always be some later time when one knows better, just as there will always be some well-informed stranger who can develop a more educated opinion than one’s own. In saying this, I mean to suggest that the writer dances precariously between a narcissistic confidence which absorbs the world into oneself (the know-it-all) and a masochistic denial which risks denying one’s place within the world (the know-nothing). Importantly, Smith has not succumbed to the masochistic denial her advice could risk; indeed, it is a piece of advice that the prolific writer has herself not followed.
Despite my criticism of Smith, I admire that mentality within us which warns that there will be a later version of ourselves who will know better. My position is that no truly great art can have a sequel, and that the tendency towards fictional series demonstrates a vulgar escapism. The rupture from one work of art to the next should be total: either the original failed to meet its task, in which case an edit might be more appropriate than a sequel; or the succeeding work should be so distinct as to make the term sequel absurd.
Admittedly, my preference for artists who produce few works may be unfair; people have got to pay the bills somehow. (Even this innocent “pay the bills” assumes that one cannot build one’s own forms of sustenance and must therefore surrender to the machinery of capital, a circumstance which is common but by no means universal.) However, putting aside such pragmatic causes as needing to support oneself, it also seems that writers (and artists in general) have a tendency to lose themselves in the institutional maze-of-mirrors which continues to validate them. The scenario is like that of the average couple, who, despite their respective mediocrities, assure each other that they are the greatest, thus generating an addiction to a toxic relationship by means of inflated complacency. Such an addiction is often Romance. Readers gain pleasure in the company of their Romanticized writers (Thank God! There is someone out there who knows!), even as the same writers feel the relief that there are readers who believe in them. All this to say that writers need not be bad-faith money-grubbers when their work becomes so prolific as to have lost its quality, its coherence.
Relatedly, some historians have asked the pertinent question: “when does one have enough sources?” The rhetorical question reveals that historical writing is an arena where there exist neither probabilistic models nor the widespread p-hacking with which the historian can verify, or at least gain confidence in, a result. The institutional pressures to publish may therefore serve a social function; without them, the timid historian waits too long to share what would otherwise be politically expedient/meaningful research. But, insofar as literature is devoted to telling meaningful lies, it has divorced itself from any claims to historicity and timeliness. That is to say, there is a difference between the everyday person’s hesitancy to speak out in a social/political/romantic way, and the writer’s more general dilemma of knowing when a work is done. Everyday speech has as its target a desire to change/influence others’ behaviors (can you pass the salt?), which makes grammatical/aesthetic corrections meaningless so long as the desired behavior occurs (the salt is passed); whereas, the literary art separates itself from everyday speech through its allegiance to a symbolic form which warrants grammatical/aesthetic correction.
To explain with a different example, it is commonplace to refer to incomplete intellectual work as “half-baked.” The metaphor might only intend to say that additional time was required for an attempt to be successful, but it also assimilates writing to a reproductive, culinary art, one which faces a tension between freshness and completion, a search for perfect ripeness. So when intellectual labor is ‘overwrought,’ it has–to continue the previous metaphor–been left in the oven too long. However, “overwrought” to me is not a possible term for literary criticism, insofar as it’s impossible for an art that desires the most vivid description to be too descriptive. Rather, one could always put “half-baked” in place of “overwrought”: it was not that Joyce spent too long knitting his words for Finnegans Wake, but that he did not devote enough effort to incorporating recognizable structures within that ornamentation. We might similarly wonder why artists are not themselves the map-makers, and why we need critics to navigate art.
In support of artists as their own map-makers, Borges makes the controversial argument that prose requires greater intellectual complexity than poetry, and his reason for justifying this claim is akin to saying that the prose writer is a bowler with no bumpers. There are few modes with which I can meet the required rhyme of “fair,” in the required meter of a tredecasyllabic line, while expressing that “groggily, Timothy descends–a thump on each stair.” When the audience need only verify adherence to rhyme and meter, considerable pressure is taken off both artist and audience to cultivate the senses. While novelists can still utilize what Kristeva might call the semiotic pleasures of their text–its rhyme, meter, consonance, assonance and alliteration–their allegiance is ultimately to the referent, the symbolic meaning of their texts. Anyone who seriously attempts literature engages the impossible task of telling meaningful lies.
However, the contemporary writer faces even greater pressure than did novelists uncertain of their word choice; not only has the formal structure of the written art become an impossibly plastic, ungraspably Protean form, but today’s aesthetically inclined author no longer knows when the vocational call ought to drive one to the page or to the needs of others–and even then one must negotiate whose calls to answer. So the “death of the novel” might be lamented by the short-sighted author who is unable to see that the political/artistic modes of engaging with the world have been greatly expanded. Likewise, an author can decry a mass culture which has drained the audience’s sensibility and attention, when precisely the opposite has occurred, when a great deal of attention is paid to a greater variety of subjects than ever before.
When literati elegize “The Death of the Novel,” they do so with the intended implication that the arts are not sufficiently appreciated. I find such arguments loathsome, insofar as they often center some narcissistic myth wherein artists, despite their worth, cannot force their deserved recognition. Van Gogh is the standard exemplar of this modality in art, just as Giordano Bruno has lately become a beacon for philosophy and science. Often, the cultural elite assume that the art’s beauty outpaces the audience’s capacity for reception; yet, they do not put emphasis on the artist’s responsibility for framing the art, for giving the audience the capacity to receive it. Under the banner of “authenticity,” it therefore becomes possible to forgo all responsibility to the other. In order to answer the question here posed–“what is the beauty of the written word?”–one must first be willing to acknowledge all that writing cannot accomplish. Contemporary writing might therefore repeat itself in much the same way that a doctor can continue to give an improper cure to a dis-ease that fails to cease. What is the beauty of the written word?