This essay began as a footnote in a graduate school paper. My alma mater, Wheaton College, has a strict but standard policy on inclusive language. She states in her catalog:
We want our students to succeed in graduate school, in the corporate world, and in public communication, all settings in which gender inclusive language for human beings is expected and where the inability to use such language may well be harmful to the Christian witness.
This policy is vague. It expects inclusive language without further statements on what qualifies as such. Shrewdly, the college leaves the task of clarification to the faculty. Yet most syllabi (granted, most syllabi I handled were limited to a single department) simply restated what’s found in the catalog. Some syllabi don’t bring up inclusive language at all. These might assume that students already know what constitutes inclusivity, how to implement it, and how to resolve the grammatical difficulties it creates.
The footnote was in a paper for a class with one of these vague syllabi. I was writing about Origen of Alexandria’s anthropology. Origen analogizes the soul with the Bride in the Song of Songs and Christ with the Bridegroom. His anthropology is a spiritual romance between the soul and her Lord, Jesus Christ. For this reason, I defaulted to feminine singular pronouns throughout my paper. I thought this an inventive way to impress the terms of Origen’s thought more deeply on readers. Feminine singular pronouns are still non-inclusive, so to avoid any marks I figured an explanation was in order. Hence, the footnote. The present essay deals with the subject more fully.
This is a single essay in four parts. In the first part, I write on inclusive language generally; in the second, on inclusive pronouns specifically; third, I present failed solutions to our pronoun troubles; in the fourth, I chart a viable course out of these troubles. The fourth section is the real heart of the piece; the first three set it up. Characteristically, I am expending perhaps more intellectual energy than the subject deserves. If you are short on time, skip to the fourth section.
Throughout this essay, I adopt the outlook of a writer, as opposed to a grammarian, activist, etc. In this view, the main task of language is to convey clearly and creatively whatever point the writer is trying to make. My main concern with inclusive language is the same as with language of any sort: whether it supports this task. Consequently, you will find I evaluate this topic not in terms of “correctness” but in terms of “usefulness”—a craftsman’s ethic.
When given limitations, it is best to respond creatively whenever possible. We should ask: Why prefer singular pronouns, anyway? Is there a possible advantage to using plural pronouns, or to using feminine instead of masculine pronouns, and what advantage would that be? By the same token, we should be free to ask: Does extreme endorsement of inclusivity at times harm our ability to write effectively? In the best writing, every element of a piece works in tandem to bolster the thesis. This should include parts of speech as seemingly obscure as pronouns, over which the writer should possess some creative choice. How this can be done, with examples, will be shown in the final section.
I. On Inclusive Language and Its Merits
I generally don’t mind the enforcement of inclusive language. Exchanging man with person or mankind with humanity is quickly done and seldom bruises the aesthetics of the prose. More importantly, it never fails to promote clarity. In particular, the word man has multiple meanings and the inclusive alternative is more precise. English is unusual in using the same word for an adult male and for a human being of either sex—many languages have distinct words for each. Latin, for example, uses homō for human, vir for man, and fēmina for woman. Without care, the ambiguity of man in English, I think, can obscure the meaning of a text.
This is most apparent in translation work. For example, I have met many knowledgeable Bible readers who assume when God said in Genesis, “Let us make man in our image,” that God has Adam in mind (Gen. 1:26). However, in the original Hebrew, the sense is inclusive. The word (אָדָם) transliterated into the English alphabet is adam. This word has at least three meanings. As a proper noun, it refers to the first human, Adam. It can also refer either to an individual human or to the sum of all humans, men and women alike. The collective use is the most common in the Old Testament. Taking this into account, when God sets out to “make man,” He has the entire human project in mind, both “male and female” (Gen. 1:27), and not Adam alone.
If we assume adam refers exclusively to the first male in this passage, it threatens proper exegesis. We would have room to suppose, for example, that only Adam, not Eve, is made in God’s image, or that Eve is an afterthought whose existence adds nothing to the perfection of God’s creation. This confusion is easily avoided if we render the text: “Let us make humanity (or even a human being) in our image,” although I am aware many English Bibles do not do this. Even so, I strongly support using indisputably inclusive language when the text intends inclusivity. A biblical exegete with an elementary knowledge of Hebrew will interpret the text correctly, even if the translation of אָדָם is man. However, such textual details are seldom noticed by the lay reader.
A clarification: I am not suggesting that the words man or mankind are truly exclusive. The English language boasts an extensive body of literature that uses these terms in their inclusive capacity. Yet this is not, in fairness, intuitively apparent to modern readers. I endorse the exchange of mankind with humanity not because I find the former exclusive (which I do not), but because the latter is easier to interpret. You could say my advocacy of inclusive language has little to do with my valuation of inclusivity: It just so happens inclusive vocabulary tends to be more precise.
A second clarification: I do not think that human is preferable to man in all instances. As a writer, I like a varied cache of words in my storehouse. Beyond its dictionary definition, every word has proportion and texture, both when spoken and when viewed on a page, as well as flavor and density acquired through an aging process. The physics of aging differ for words and for bodies. Usage—where, how, and how often—rather than time is the force behind a word’s aging. For this reason, no two words age at the same rate: Some words are mayflies, others are mountains. Speak a word and it carries scents collected from contact with prior works of literature, music, poetry, as well as laws, speeches, lectures, studies, and historical events, places, and personalities. The work of the writer is to pair these qualities tastefully with other words and themes in a manner that supports the overall goal of the piece. Human is more precise, but a good writer does not judge any word on a single valuation. The quality of precision that makes human preferable to man in most instances also makes it less useful for poetry, song, or momentous occasions. How can we revise the words of Hamlet, “What a piece of work is a man!” without at the same time diminishing its luster, its concise elegance, the ponderous way the breath presses on that final consonant? That said, academic writing, particularly in the fields of philosophy, theology, law, and history, will always prefer the most precise vocabulary available. I am at a loss to think of an example in an academic or didactic piece where man is preferable to human.
I anticipate an objection: Man and mankind, it could be argued, already boast a great legacy of service in academic writing. This is true. Before the modern demand for inclusive language, man and mankind were default words. They served with distinction, but their status was secured by force of convention. Some academic writers today would argue that this heritage should be preserved; they may throw in a few comments about how man and mankind are more literary, historic, or lyrical than the inclusive alternatives, and therefore should be acceptable in academic writing. This is not a weak case, exactly (my prior statements would suggest it has some merit), but those academics who used mankind in a literary and historic way in centuries past did not realize they were being literary or historic: They thought they were being conventional, which is to say they didn’t think about it at all. What distinguishes early modern academic writers from the contemporary crowd is the former did not choose to use (or not use) mankind and the latter does. As shrewd and respectable as some academic conventions may be, they are not above reevaluation, and we should feel free to honorably discharge them for something better. As a writer, I think I am positioned to suggest that inclusive language improves on the conventions of academic writing.
Some see in the words man and mankind the residue of a historic pattern that systematically favors the male sex as the representative of choice for all human beings—and may even idolize the male as the ideal human being—in the English-speaking world. Whether intentionally or accidentally, as the argument goes, this vocabulary is sexist. The force of this charge tends to override the points made earlier, for how can we justify the use of these words in any context, once they are labeled sexist? What is there to do with them, except strike them from the dictionary and hope their memory follows suit? And what should we call advocates of this view, if not prudes? Prudes is a fitting title, insofar as it describes those who are above all concerned with the propriety of language. For them, words like m*n and m*nkind assume the stature of parasol-clutching profanities that ought to be censored or revised aggressively wherever they appear.
I am not qualified to judge the morality of words like man and mankind, or even of patent cuss-words like shit. Again, I wish to write as a writer, and as a writer it is not evident to me that the provenance of these words in English is reducible to an underlying patriarchalism. As suggested above, the qualities of the words must play some part in their success. The Germanic-based man not only has a more extensive pedigree in the English language (by virtue of appearing sooner), but its brevity and layers mean it can do things the Latin-based human cannot. We stand to benefit from looking at the word itself from time to time, rather than the bogeyman behind it.
At any rate, my endorsement of inclusive language is unattached to any desire to cleanse English of non-inclusive vocabulary. Mopping a color from my palette limits my options as a writer, yes, but more significantly it reflects an ideological perspective on language and our relation to her. Language is a complex entity, almost like a living creature, with memories and intentions and limitations of her own. She is bad at submitting to ideology and recasting herself in its image. Nobody stands over a language to do as he pleases, even a language as adaptable as English. This is evident from the fact that it is surprisingly easy to introduce new words into her vocabulary but nearly impossible to scratch out old ones. Words may fall from fashion, yes, but they have a habit of bouncing back. Even lost words leave traces in other words.
When it comes to language, we are in a better position to create than to destroy. Why else does censorship tend to have the opposite of its intended effect? Attention strengthens a word, even if that attention is negative. Especially in the modern age, a word can become a profanity simply by being declared such. Why else does shit carry so much more emotional capital than crap, despite the two meaning the same thing and taking the same time to say? Whatever the moral question, a program of censorship is impractical and likely to backfire. Finally, the more interchangeable two words are, the more likely one is to diminish. Like the twin founders of Rome, Romulus and Remus, one must kill the other for supremacy. As should be obvious to readers by now, man and human are far from interchangeable. Man and mankind are here to stay, so it seems to me best to make the most of them, to find suitable use for them, and even to learn to appreciate them, rather than to regret their existence.
II. On Inclusive Pronouns and Their Demerits
Up to this point, we have explored the merit of inclusive language. So far as the issue is a matter of word choice, the solution seems to me straightforward. In academic writing, inclusive speech is almost always more precise. We should prefer it, and this can be done without disparaging words like man or mankind. The issue becomes more complex when we leave the realm of nouns and consider pronouns. One side prescribes using English plural pronouns (i.e., they, them, their, theirs, and themselves) in all cases as though singular (e.g., “When a person is tired, a cup of coffee can perk them up.”). This, they argue, is already commonplace in colloquial speech, so why not in academic writing? We will call advocates of this view colloquialists. (I call them this instead of, say, inclusivists because the name colloquialist speaks to the nature of their solution rather than the values or belief systems undergirding it—which I consider better naming practice.) The other side, those we might call the grammarians, object that this practice frequently breaks multiple rules of English, such as antecedent-pronoun agreement and subject-verb agreement. The colloquialists, they say, impose a standard the English language cannot support. Each side accuses the other of great offense: one toward readers, the other toward language.
The grammarians might toss in a story: While speaking at the Council of Constance (1414), the Holy Roman Emperor Sigismund misused the word schisma (“division, dissent”). Presumably confused by the word’s -a ending, Sigismund treated the word as feminine when in fact it is neuter. (Many a Latin student over the centuries has fallen victim to similar stumbling blocks.) When those present called attention to the error, Sigismund invoked his authority as emperor and declared that, henceforth, schisma would be feminine. Unable to endure this, the elderly Archbishop Placentinus wobbled to his feet and replied: “Caesar nōn suprā grammaticōs,” which translates: “The emperor is not above the grammarians.”
We are inclined to interpret the story above as a boon for grammarians everywhere, as it places them above the highest authority in the land. The lesson is clear: Presumably, whatever issue may exist regarding language and its proper use, the grammarian has final say. But by this time, Latin was long dead. Everybody present at the Council had learned Latin as a second language. Latin was not only dead, but venerable. It belonged to no government, country, or people, but to the fields of theology, philosophy, economics, literature, and law. Sigismund might have been emperor, but he had ventured outside his jurisdiction. Concerning Latin, even the emperor is a student: His task is to learn and obey, not revise. Placentinus laconically put him back in his place. His statement—“Caesar is not above the grammarians”—has become a motto for the grammarian crowd, but the story bears little analogy to a controversy involving a language as lively as English. Here, the authority of the grammarian is less absolute.
A writer should have a strong grasp of grammar, but he is not a grammarian. The two work on the same floor, as it were, and work best when they consult each other, but they have different concepts of language. The interest of grammarians is the architectural integrity of language, whereas the writer takes up residence in language. The latter is concerned with its utility, fecundity, and meaning. This simplistic distinction will have to do for now: The point is writers and grammarians are distinct and may disagree. The decrees of the latter do not necessarily end a question for the former.
By the same token, the writer need not fall on the side of the colloquialists even if he sympathizes with their position. It may seem counter-intuitive with a word like inclusivity, but this word represents a limitation on what language is permitted. With inclusive policies, our vocabulary is not being “diversified to include more” but “limited to exclude fewer.” Whether justified or not, let’s keep in mind the actual nature of inclusivity is to censor, to set a limitation on language in an effort to prevent limitations on who is represented. Language is only inclusive if it is exclusively inclusive. We may find the intent noble and the limitations acceptable, but advocates are essentially “grammar Nazis” consulting a different book of rules. Like the grammarian, the colloquialist propounds his position dogmatically.
This prompts an evaluation of terms. We tend to oppose inclusive to non-inclusive—or to exclusive, if we like drama. How is this choice of terms framing the discussion? It means the alternatives to inclusive are defined either by their lack of qualities (i.e., non-inclusive) or by an aggressive quality that offends a value we hold dear (i.e., exclusive). Speaking this way not only predisposes us from the beginning to favor a certain set of words (because who wants to be exclusive?), but it fails to speak to the positive qualities of all words involved. (I hope by now my reader is convinced that a word like man is neither a featureless blob nor the verbal equivalent of a cancer cell—and no word can be both.) Rather than non-inclusive, it is more accurate in the case of singular pronouns to say they are particular. This is what singular pronouns do: They refer to a concrete individuation of something or someone, down to the details of gender. Inclusive language, on the other hand, deals in broad pluralities. Especially for pronouns, the categories of inclusive and non-inclusive correspond to general and particular, respectively.
Alternatively, the grammarians and the colloquialists can be appeased in a single stroke by simply converting every element in a sentence into the plural: “Humans are to themselves contrary creatures; unlike beasts, they can act against their better judgments.” Both parties should be pleased now that the sentence is both grammatically correct and inclusive, but something else has been lost. If you read carefully, you will see the subject matter has changed. To speak conceptually of the human being, as such, differs greatly from speaking in general terms of all human beings.
How is this harmful? Rendering all antecedents into the plural restricts our ability to speak philosophically, conceptually, theoretically, theologically, etc. It is one thing to say, “To err is human” and another to say, “All humans err.” The first declares a universal about the human condition; the second is an observation of general human behavior. The difference is equivalent to speaking of an entity or universal (at any rate, a single subject) instead of a collective (i.e., many subjects).
Beyond the subject, there is also a change in tone. The second—“All humans err”—is almost detached: The humans of which it speaks could be other humans. The first is existential, enlisting speaker and hearer alike under the substantive human. These separate statements obviously resemble each other, yet they are not interchangeable and belong in different papers.
In conclusion, inclusivity is an accident of general speech. Inclusive language pressures writers to choose plural language in all instances, and plural language suggests collective speech. This, by itself, is no case against inclusive language. It does give us pause, perhaps, to consider its full strain on the effectiveness of our writing. In addition to running the risk of clouding the identity of antecedents, exclusively inclusive language also restricts our ability to speak in particulars. This can thwart effective writing, particularly if the subject deals in hypotheticals, universals, or concepts. Naturally, the concern comes into greater relief in an academic context where this sort of subject is commonplace.
III. On Failed Attempts at a Solution
Before I arrived at my “solution” to the problems set out above, I experimented with a few others. Each of these has its own merit but proves too impractical to implement.
A Language Correspondent
In a paper for a theology class, I disclosed in a footnote that the gender of every pronoun in the paper corresponds to the gender of the Latin equivalent of its English antecedent. So the word church is a she, after the feminine ecclēsia. A human being is a he, but a person is a she. A poem is an it, but a poet is a he. Theology is a she, as is philosophy, art, wisdom, and reason. It goes on.
Only a narrow readership could appreciate this approach. Had I made a mistake, few of my peers would have been equipped to correct it, much less to apply this method to their own writing. In the end, it is too impractical to serve as a prescriptive solution. Moreover, it suits a narrow genus of writing: The thesis must work in close association with a language that has grammatical gender. For the paper in my theology class, this was appropriate since many of my sources were Latin.
Though impractical, this approach delights me, and I hope someday to find an audience who will be humored by it.
An Academic Neo-pronoun?
Is there a case for inventing a new pronoun that is both particular and inclusive? The appeal of plural pronouns is that they don’t presume the antecedent’s gender. Theoretically, neither does the neuter singular it. But it does not pass as an inclusive pronoun because it is impersonal. It suggests something neuter, a word which both in the original Latin and in English grammar means “neither male nor female.” (In biology, neuter means genderless. In grammar, neuter is a gender that is neither feminine nor masculine.) However, a pronoun that is uter, that is, “either male or female,” could fit the bill.
Supposing, for the sake of argument, we must invent a new pronoun, how would we spell it? We will use the Latin uter as the basis for an English derivative. Furthermore, given the etymological proximity of uter to neuter (the latter is simply uter with the prefix ne-: “not”), perhaps the neologism should model itself after the English neuter singular pronouns it, its, itself. This leaves us with something like: ut, uts, utself. This pronoun has no special plural form: Like the other English singular pronouns, ut, uts, utself is subsumed in the plural into they, them, their, theirs, themselves.
To delineate its proper usage: ut differs from it in that its antecedent is a personality. While it applies to an antecedent that lacks either biological gender or humanity, the antecedent of ut possesses a gender, but one of either (or any) gender, and whichever gender it happens to be is not significant for the piece. To be clear, ut does not convey uncertainty or confusion about the gender of the antecedent, only indifference. Examples of appropriate singular antecedents for ut include the concept of humanity, a hypothetical individual, or a random, nameless individual. You could not use ut for a person whose gender is apparent or significant. Unlike other neo-pronouns (e.g., sie, zie, hir, zir), ut is designed for academic writing.
I will demonstrate proper use with a brief paragraph on human nature I wrote for an academic paper, but I’ve incorporated the “appropriate” pronouns:
The Incarnation of Jesus Christ is the final stage in God’s creative human project. His self-imposed commission in Genesis, “Let us make man in our image,” is answered, as it were, on the cross: “It is finished.” Thus in the humanity of Christ, the rest of humankind sees what ut was created to become. In Christ, humanity perceives uts model, not only morally but ontologically. By becoming like Christ, the human being becomes utself truly. Emphasis must be placed on like, for a human being cannot become Christ, like a fourth member of the Trinity, but can only come to resemble him through imitation, just as it is not within the capacity of a mirror to assume the image it reflects; rather, its nature is to reflect. In this manner of speaking, it is the mirror that is assumed by the image. Similarly, the actual human being is the imitation of Christ. By becoming what we are, Jesus Christ reveals to us who we are. Thus the Incarnation is the first principle of Christian anthropology.
I admit I am being facetious. I’m trying my hand at a problem. Idle play. I don’t expect this neo-pronoun to catch on; you will hear no lament from me when it doesn’t. The nod to the Latin uter notwithstanding, the word is an obvious contrivance. You might think it sacrilege or just silly; still, through this ordeal I have come to appreciate the spongy character of English, which allows for this kind of invention while respecting her rules. If we must resort to neologism (a big if), I consider ut, uts, utself a candidate of some quality.
IV. The English Dilemma
Unlike many other languages, English lacks grammatical gender. In a gendered language, a pronoun’s gender is predetermined by the gender of its antecedent, with which it must agree; in English, the choice of pronoun instead rests on external knowledge of the antecedent. The neuter pronoun it, its, itself is used when the antecedent lacks biological gender (e.g., a stone, a machine, a corporation, etc.) or personhood (e.g., an animal). Exceptions are often arranged to express closeness or affection for something typically spoken of in the neuter, like a family pet (i.e., a dog is an it, but Spot is a he) or a personal item (e.g., “My first car was a blue Toyota pickup. That girl was so old, she had termites.”). While neuter nouns on occasion can be “gendered” in this way, neuter pronouns are never used for persons.
The English writer meets a dilemma. When speaking of a single person (or of the person) of undisclosed gender, what gender should be used for the pronoun? For centuries, the default has been singular masculine pronouns—particularly in academic writing—but there is no reason why this must always be the case. There is nothing in English grammar that would incline us to favor one grammatical gender over another; at least, not without seeming arbitrary. The dilemma creates an element of choice, a choice that presumably falls to the writer.
My recommendation is simple: Instead of attempting to decide on a default gender for all pronouns in all examples of academic writing, we should exploit the English dilemma to write more effectively. Earlier, I mentioned I used feminine pronouns in a paper to highlight a theological point found in a longstanding Patristic tradition that characterizes the individual soul as the feminine lover of Christ, her Heavenly Bridegroom. For that paper, feminine singular pronouns lent the greatest support to my thesis, but I might have chosen different pronouns for a different paper. The point is that instead of seeking a gold standard, writers should carefully select whichever pronoun forms best support the goal (or thesis) of their writing. The result will be the most effective writing possible.
I am not the first to employ this outlook. Nancy Caciola, Professor of History at the University of California, San Diego, defaults to feminine pronouns in her landmark study of divine and demonic spirit possession in the Middle Ages, Discerning Spirits (2003). She does this to highlight the fact that during the period in question women were far more often the subjects of such discernment than men. Caciola does not mean to suggest that all cases were women (a substantial minority were men), but her thesis that “the testing of spirits focused with particular intensity on women” stands.* Instead of following the dictates of convention or the “new prudishness,” her choice of pronouns complements her work.
A second precedent is Umberto Eco’s How to Write a Thesis (2015). This is a handbook, not a study, but the audience is academic. Eco favors the masculine forms when referring to a student, candidate, professor, or advisor “because I drew on personal memories and experiences, and I identify better with the male counterpart.”** This is acceptable given the tone of the work: The reader feels like a student paying a personal visit to his academic advisor. (The reason I use masculine forms in this essay is identical to Eco’s.) As with Caciola, the choice of pronouns is deliberate and deeply expressive of Eco’s goals. His example also highlights the principle that a certain gender may better suit certain authors, genres, audiences, or literary styles as well as goals or theses.
Caciola and Eco are examples of published academics who have succeeded in their fields. They write for the academic world, but rather than yielding unthinkingly to either the new prudishness or to historical conventions, they chart a forthright and reasonable course, one that avoids offending the sensitivities of their readers while making their writing more effective.
Much depends on the thesis, but the writer should not leave readers to guess whether there is special intention behind his pronoun forms. Rather, following the examples of Caciola and Eco, the writer should be upfront, explaining early in the piece the significance of his choice and how it supports the piece overall. This can be done in a footnote: Eco uses an endnote; Caciola uses a paragraph in her introduction. The point is the explanation should appear early and prominently.
If frequently writing a statement explaining one’s choice of pronouns seems laborious, keep in mind the use of they as a singular pronoun often requires a similar practice. Although they have made inroads, inclusive pronouns are not yet universally accepted in the academic world. Institutional policies on the precise use of pronouns are usually vague and left to the discretion of individual professors or editors, who are seldom any clearer. Many will still mark them as grammar mistakes. If a student of conviction wishes to use inclusive pronouns, he often must explain his choice clearly in the opening paragraph of his paper anyway. It seems to me that stealing a few lines to explain one’s choice of pronouns is becoming a new academic convention, and I’m all for it.
Inclusive language can be viably suspended to make room for stronger writing. To be clear, in most cases I would argue inclusive language is the better choice, especially in an academic context. In my book, I draw the line at the enforcement of exclusively inclusive language to the point that all pronouns are pluralized. By the same token, I think writers who resort to the historical convention—that is, exclusively singular masculine pronouns—may deny themselves an opportunity to write more effectively. While I am not convinced the historical default is patriarchal, it is in any case arbitrary.
My “craftsman’s ethic” is simple: Not only the research but all elements of a piece, including the pronouns, should rally to support the overall goal or thesis, and it seems to me that both mandated inclusive pronouns and conventional defaults can thwart this. Instead, I endorse a more creative outlook on what language is “appropriate”: Namely, the most appropriate language is that which makes the writing most effective. From this outlook, pronouns ought to be curated and meaningful, arising from the writing process rather than dictated to writers in advance.
Exploiting the English dilemma and deciding for yourself which pronouns are best should be an option given to students in classrooms and syllabi. Naturally, it takes a skilled writer to use pronouns effectively. I suppose the appeal of a default is that it presumes excellence of nobody. Exclusively inclusive language can be required without any confidence on the part of institutions in the abilities of students. Why else enforce a policy on a matter that has so little basis in grammar? The incompetent cannot be instructed, so they must be restricted. A final point in favor of my outlook: It thinks more highly of writers by seeking to give them liberties to write better.
*Nancy Caciola, Discerning Spirits: Divine and Demonic Possession in the Middle Ages (Ithaca, NY: Cornell University Press, 2003), 16.
**Umberto Eco, How to Write a Thesis, trans. Caterina Mongiat Farina and Geoff Farina (Cambridge, MA: MIT Press, 2015), 225.