Writing Rules and Genre

This semester I’m having my students write in a number of genres. That’s not a bad thing (actual results may vary…). But, as happens every semester, we’re struggling. We’re struggling because my class isn’t my students’ first exposure to writing (my students are adults–they’ve actually been writing for a LONG time, whether they realize it or not) and they’ve been taught a LOT of “rules.”

Getting the writing done.
Image by Kristin Hardwick via StockSnap

One of my favorite first-year composition textbooks that I’ve taught from, Writing About Writing by Wardle and Downs, includes an excellent essay by Mike Rose that discusses how writer’s block can result from treating writing advice as rigid rules. As he notes, student writers can easily be confused by all the “rules” they pick up from various instructors, rules that seem contradictory (because they are) or are genre-inappropriate and therefore actually wind up getting in the way. He argues that successful student writers are those who are able to see the writing advice they’ve been given as a “heuristic” rather than a hard rule. These students are able to modify or discard rules as they become problematic or less useful for their projects.

The challenge, then, becomes how we can foster this flexible approach in our own students. As I often argue for questions like this, I think the answer comes down to one word:

Genre.

Students need genre awareness, but perhaps even more than they do, we as teachers need genre awareness when we teach our pet “rules.”

We teach our rules as though all writing is just “writing.” But it isn’t; writing is always genre-specific in some way, especially in the conventions we use. Grammar, diction, formatting, all of it–it’s genre-dependent. Not merely context-dependent, but genre-dependent.

When we teach “don’t use passive voice,” what we’re really teaching is “write like a journalist or a fiction author.” That won’t work if they’re trying to write an academic paper on a scientific topic. In fact, it directly conflicts with another rule we teach, “don’t use first person,” which is appropriate for an academic paper on a scientific topic but not for many other genres, because avoiding first person often triggers passive voice.

Since genre is directly tied to the audience’s position and the author’s purpose, we need to make genre the first and foremost thing we teach when we teach writing rules.

In fact, I would argue that any “rules” that exist for writing are actually genre features. Genre is basically the “game” we play when we write.

When we teach rules absent genre, it’s as if we told all athletes that they need to focus on throwing round balls. That’s generally useful for several sports, but football players (who do also throw balls) find themselves at a disadvantage because round balls don’t act like footballs, soccer players go “umm, hands? No,” hockey players don’t use balls at all, and so on. Likewise, if we tell an athlete “always run fast,” that rule probably doesn’t apply to, say, weight lifters or rowers, who are still definitely athletes doing sports.

Or, consider role-playing games. In Dungeons & Dragons, a high roll generally means you succeed. In GURPS, which is in many other ways familiar to a D&D player, a high roll generally means you fail. As my students might say when discussing the rules of writing: “Which is it? Why don’t the rules make sense?”

Rules in writing, like rules in games, are arbitrary, but they’re genre-specific. So we need to make sure we aren’t carelessly suggesting, by omitting discussions of genre, that rules are universal. If we fail, our students might just end up playing baseball by trying to score a touchdown.

Nostalgia and Christianity

Christianity shrouded in longing for the past isn’t very Christian.
Photo by the author, taken in a cemetery in Arkansas.

Although certainly many people who call themselves Christian today cling to the idea that the past was better than the present, I suggest that nostalgia is incompatible with Christian theology. Nostalgia requires a belief that things were better in the past than they are now or will be in the future, mistrusts youth and newness, and revels in pride in human accomplishments. At a certain level, it rejects God’s promise of a new creation and a new birth.

I recently read this article by Amy Hale on how neopaganism’s aesthetics align with fascism’s aesthetics. One of the things that stood out to me was the emphasis on a yearning to return to a past “golden age.” Of course, as a Christian, I’m painfully aware that, like the pagans invoked in Hale’s piece, my own faith has been used as a banner for fascism and the alt-right, a veil for racism and evil, for a very long time–probably even longer than that.

Most people, when they hear “Christian,” think of conservative evangelical Christianity–the sort with the purity pledges, the conversion therapy, the charismatic preachers in “non-denominational” churches, and all the other stuff that has become a hallmark of conservative politics in the United States. But I make little secret of my feeling that that culture is not really biblical or theologically sound; the more I learn about the history of Christianity, the more certain of that I am. But it definitely is this conservative culture that is used as an excuse for so much evil and hatred, and it definitely needs to be addressed. Just as Hale calls for pagans to address the parts of their theology and culture that allow fascism and racism to thrive, so too do I want Christians to do the same soul-searching and pruning.

To that end, as I sat in church today listening to a pretty good sermon on 1 Timothy 6:6-19 and the nature of greed that wasn’t really related to what I was actually meditating on (a fairly common occurrence for me–sorry, pastors), it struck me that nostalgia is a core aspect of conservative culture–and one that is in many ways incompatible with the theology of hope that Christ calls us to.

Conservatives (at least in the US, although it’s a feature of fascism and similar movements elsewhere) generally mistrust youth unless they’re being obedient to “tradition.” They generally are in love with a past that didn’t really exist–that’s more or less the definition of nostalgia. They see the past as a golden age; they see the way they remember things having been as the way things should be; and they want to (if possible) preserve or restore things to that past. We look at the campaign slogan “make America great again” and see that view encapsulated right there: America was great, is no longer great, but maybe if we repress the youth and deny their ideas, we can make it great again. Inherent in this particular brand of nostalgia is the notion that things are generally decaying–that the present is bad, the past was good, and the only hope is to return to the past. It’s also a fear that what was valuable has been, is being, and will be lost (unless someone somehow stops time from moving forward).

This nostalgia is not compatible with Christian teaching (or shouldn’t be). Christianity posits the past as lost, dark, sinful, and brutal, while offering the promise of a future that is bright, redeemed, and gentle. Moreover, it is a future that will be that way not because of humanity’s past, but in spite of it. As Christians, we look forward to the coming Kingdom of God. When we look backward, we see only a confused past, one in which humans have erroneously and repeatedly tried to obtain salvation and eternity through their own power, often resulting not in mercy or grace, but in pain and suffering. Not because humans were wrong to try, but because humans are really bad at being humane, life is complex, and even our best efforts fall short.

When we survey scripture, reading the Old Testament, we are faced with one tale after another of great forebears who nevertheless fall short of God’s perfection in a thousand ways. We see Noah, after the flood, getting embarrassingly drunk and cursing his own son over it. We see Abraham lying to one ruler after another about being married to his own wife because he fails to trust God’s protection, and we see him and Sarah abusing their servant Hagar in their fear that God’s promise isn’t sufficient. We see David (God’s anointed! The shepherd king!) arranging to have one of his loyal men unjustly killed just so he can steal his wife. The list could go on and on. This is not a great, bright golden age to long for wistfully and try to bring back. This is a never-ending cycle of striving to be better and failing miserably, falling into the worst tendencies of humanity time and again.

When we yearn to keep things the same or to restore them to the past, we yearn to keep Christ nailed to the cross. But when we trust in hope, we trust in His resurrection.
Photo by the author, taken in a cemetery in Illinois.

Even when we read the New Testament, we are primarily faced with humans who are not so much exemplars as they are just more humans, trying to understand and fathom the depths of God’s grace and often failing anyway. But at least here it’s okay, because we know that God’s redemption for them has already been accomplished.

There is, for Christians, no true “golden age” to yearn for, although I acknowledge that many theologians and pastors have certainly constructed one for themselves and taught about it. With no glorious past to remember, nostalgia becomes empty and hollow.

Likewise, nostalgia rejects the contributions of the young and mistrusts anything that is new. But Christianity embraces the teachings of the young, acknowledging that they might be filled with the Spirit as much as anyone else, and urges its adherents to look to the new, not the old.

Right now, the conservatives, many of them claiming to be Christians, are attacking children who have only dared to suggest that we might do better in being good stewards of God’s creation. Children.

But to do so is to reject Jesus. Jesus came as a child, and although He didn’t start teaching officially until He was 30 or so, that was still considered quite young for a teacher (for reference, I’m often mistaken for the undergraduates I teach, I’m a much-maligned millennial, and I’m 32!). Furthermore, Jesus rebuked the older, wizened teachers around Him, and instead said “let the little children come to me” and admonished that no one become a stumbling block for the “little ones.”

And if you want further proof that God values the voices of the young upsetting the status quo, look at how often in the Old Testament a younger child is favored over an older child, who culturally should have had the authority.

Nostalgia suggests that the past remembered by those who are old enough to have known another time was better, and that children “these days” just don’t understand, are “missing out,” or are somehow inferior and living in an inferior time. But in Christianity, we are told we must be “born again”–to become as children again, not looking backward through the lens of a wizened old generation, but forward with the hope and trust of a little child.

Trust children when they teach.
Photo by Samantha Sophia via StockSnap

We must not denigrate the present and coming generations; we must listen to them, for did not Jesus quote scripture saying that God might speak through children? Instead, we must trust that their hope is our hope, and that every member of the Body of Christ has value and is filled with the Spirit.

Finally, nostalgia is, like despair and other sins, at its root a kind of pride–a pride that puts our own achievements ahead of God’s plan. Nostalgia requires ruminating on past glories and our own works; it is comforting to remember the things we have done that have made us, by our own power, seem great. This is the temptation–and one that, I will acknowledge, I often fall prey to myself. We look back at our “glory days” and say “things were better then.” Nostalgia whispers to us, when we complain about “kids today” and boast of our own golden age, that we alone were great, and that no one after us will be better unless they do as we have done.

But that is all pride and hubris. That is all vanity, echoing Ecclesiastes. Rather than trusting in the glory of our own past and achievements, we are called to look forward to the greater glories that will be as God’s plan unfurls and we are led forward by the Spirit. Paul writes that, in his own life, he had done the things that might make him be considered “great,” but that he counts “them all as rubbish” (Philippians 3:8) in comparison to the greater glory that is God’s promise of salvation in Christ. He then urges his readers not to look back, but to look forward and press on toward the goal.

I am not saying that occasional reminiscing is a bad thing. Memory is a gift of God, and there is research showing that the occasional bout of nostalgia can be restorative and beneficial for our minds, spirits, and bodies.

But to cling to nostalgia is to look back, not forward. To cling to nostalgia is not to trust the Holy Ghost, but to long for human works and achievements. When we cling to nostalgia, we trust in ourselves, not our Lord. Nostalgia says that the world has decayed and will continue to decay, rather than trusting that it has been redeemed (not condemned!) and will be transformed into a new creation in Christ. Clinging to nostalgia denies that God is at work still and fails to trust in God’s plan.

Therefore, do not cling to the past, and do not let nostalgic narratives of a lost great golden age turn you away from the very real and present work of loving our neighbors and trusting in hope and God.

My Problem With National Novel Writing Month

I love National Novel Writing Month! It’s fall, so it’s time to start thinking about what I’ll write in November! (so far I’m as far as “no idea,” “probably fantasy but idk,” and “definitely not my main fantasy series”)

NaNoWriMo participant badge from 2017
Image courtesy of National Novel Writing Month

I’ve done NaNoWriMo every year since 2005. I’ve won NaNoWriMo every year since 2005. That’s at least 50k words (one year it was 100k!) every November since 2005. That’s… a lot of words. And that’s not even counting all the non-November NaNo-style events I’ve done in the meantime (my record for those is… not great. I don’t know what it is about November that works better for me).

NaNoWriMo is popular. According to the official website for National Novel Writing Month, over 307,000 writers participated in 2017. Moreover, it is increasingly being used as a tool for teaching early drafting strategies, narrative structure, and writerly discipline in schools at all levels. The same data claim over 95,000 participants in the Young Writers Program, a scaled-down NaNoWriMo challenge for writers in grades K-12. Many of these participants are engaging in the program at their teacher’s behest, either through simple encouragement or through structured, graded classroom activity.

Obviously, we need to know if this is working. And given that NaNoWriMo turns 20 this year, we probably should know by now, shouldn’t we?

I have attempted conducting formal research on NaNoWriMo several times since 2012. As a teacher of writing, albeit primarily academic writing, I’m keenly interested in knowing the effects of participating in NaNoWriMo on developing writers, especially given its popularity. I say attempted, though, because despite having been able to secure IRB approval and participants several times, my data has been… not very useful. It’s been small samples, sometimes compromised by timing, and otherwise not really worth writing home about (or, you know, writing journals about).

This is the data that got me curious. Yes, it’s a little old–but what is going on with the plateau of winning?
Image by the author.

This is not my problem alone. There is very little objective research on NaNoWriMo available. We have, of course, the nonprofit organization’s own annual reports, which tell a very interesting story of growth in participants, but a curious recent plateau in winning. And we have just one peer-reviewed study that I can locate, which does indicate that the write-in model of writing often used by NaNoWriMo participants is especially effective at boosting word count, in comparison to simply doing NaNoWriMo without a write-in (Watson, 2012). (If I am missing any significant studies, please comment and let me know!)

Most of the published scholarship on NaNoWriMo seems to consist of personal anecdotes, such as Larry Burton’s “Lessons from NaNoWriMo” (Burton, 2009). And while there are now some naysayers speaking out about the event, such as this piece by Angus Kidman, the majority of personal narratives regarding NaNoWriMo (including the naysayers) seem to be fairly positive.

In fact, although I admit that this is largely an overall sense I’ve gotten and I haven’t done the specific discourse analysis (yet!), many of these personal narratives take on a sort of evangelical tone, resembling in some ways religious conversion narratives. This may account for what I suspect is the largest barrier to understanding whether NaNoWriMo is a net good for writing students, or whether there may be some harm in it. That barrier is:

What happens to the people who disappear?

We know what happens to the people who try and win. They often go on to do it again, and they are often quite vocal about it. They often try to recruit others, too.

But generally, you start with a lot of interested people, and about halfway through the month, people start just disappearing. When I ran write-ins on campus last year, attendance dropped to less than half of initial participation by the end of the month. When I attempted a pre-test and post-test survey of participants recruited online through social media, I received about 20 initial participants, and only about 10 post-tests, all from people who won the event. What I was hoping for was to hear from people who didn’t.

So what happens to the people who write perhaps ten thousand words and disappear? We don’t know. They disappear. They don’t go around talking about their experience. Do they come out of it convinced they can’t write, they just aren’t writers, and should never write again? We don’t know.

We do know that NaNoWriMo definitely works well for some people, and those people should continue participating. But if, in fact, NaNoWriMo only benefits a certain kind of learner–likely those who already identify as writers–are we in fact doing harm to the other kinds of learners when we push it in the classroom?

We don’t know.

And that’s the problem.

Unfortunately, I’m not sure at this point how to design a study that can identify and describe the experiences of the people who disappear from the sample. I’m working on it. And I’d love to hear your suggestions or experiences if you have anything that might help!


List of scholarly sources cited:

Burton, L. (2009). Lessons from NaNoWriMo. Journal of Research on Christian Education, 18(1), 1–10. https://doi.org/10.1080/10656210902752006

Watson, A. P. (2012). NaNoWriMo in the AcadLib: A case study of National Novel Writing Month activities in an academic library. Public Services Quarterly, 8(2), 136. https://doi.org/10.1080/15228959.2012.675273

Yes, I Took My Groom Dress Shopping

Here’s the scene: I walk into a bridal boutique that deals in new, used, and vintage gowns. It is beautifully crowded with lovely things. I’m greeted by two friendly ladies, and I explain that I’m looking for a wedding dress. Behind me, my fiance. This wasn’t the first time I’d taken him dress shopping.

Dresses on the rack.
Photo by the author.

“And this is the groom, we assume?” the ladies ask.

“Yes.”

“And you’re ok with…?” The question doesn’t even need to be finished.

“Yes. It’s his wedding too. And I’m going to be working on the dress at home–how wouldn’t he see it? I want his opinion. I don’t really have anyone else in this state, anyway.”

For the record, the shop was very welcoming. I wound up buying a dress.

But I want to talk about this assumption that the groom shouldn’t see “The Dress.”

Perhaps it worked in a time when the bride’s family was expected to cover the entirety of the wedding, and the bride could plan the wedding in secret that way. Honestly, I don’t know if it ever really worked.

But I do know what the reality of now is. The reality of now is that my fiance and I are planning the wedding together. I know that I’m marrying him because I like to share things that make me happy with him–I don’t think I could not share a pretty dress with him.

A few months ago, I made an appointment at a national bridal wear chain and tried on dresses there. I knew I wanted to make my dress, but I wanted to try things on and see what the prices and styles were. And I wanted my fiance there; I was unbending on that. And it was a great decision, even if he was the only groom there. As it turns out, he has a great design sense, and it reminded me once again why I love him.

Why shouldn’t I want someone who thinks I’m beautiful helping me choose the dress that’s supposed to make me feel the most beautiful? Why shouldn’t I bring my partner when we’re making a major financial decision? Why would I keep a secret from the person I want to share my life with?

Seriously, I strongly recommend taking your intended with you when you go formal wear shopping. This is the person who thinks you’re beautiful even when you’re under the weather. This is the person you are planning to make major financial decisions with. This is the person you want to share good and bad news with first. This is the person you are trusting with your very life, not to mention your household and family. Why wouldn’t you want their input?

Bring your partner. They’re your best accessory.
Photo by the author.


Starting off with a big secret seems like it’s sort of antithetical to the goal of marriage, even if it is traditional to do so. Starting off with open communication about something that makes you feel vulnerable and emotional–even something as seemingly trivial as a dress–seems like a good foundation.

Sure, wedding planning may take more time if you always check in with your partner on every major decision. And maybe you don’t get that “first look photo.” But it sure feels less lonely this way.

One Easy Trick To Write More Words

Yeah, sorry for the clickbait headline again. I’m kind of enjoying it, though. But I promise I won’t make you read through thirty slides. I don’t do that.

Anyway, today I want to talk about the trick I’ve used to write pretty much anything I’ve ever finished: dissertation, novel drafts, articles, you name it.

Write-ins.

Or, as one of my colleagues calls it, “proximal writing.”

Dr. Cox’s actual, literal desk. While editing this post. Meta, right?
Photo by the author.

I don’t know why it works, but it works this way: You make a commitment with some group of people (it can be as small as just one other person) to meet at a certain time and place, and at the beginning of the session you declare a goal. Then you more or less ignore each other and work!

If you are getting distracted, combine it with the pomodoro technique, in which you set a timer for a small amount of time (I usually use 10 minutes for first drafts, 15 or 20 for more complex things). I learned this as “word wars” from doing National Novel Writing Month–in fact, I learned the whole concept of write-ins from National Novel Writing Month.
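
If you like to tinker, you can even script the timer. Here’s a minimal, hypothetical sketch in Python of a single “word war” sprint–just my illustration under those assumptions, not an official NaNoWriMo tool–that counts down and asks for your word count before and after.

    # A hypothetical "word war" timer: one sprint, then a word-count check-in.
    import time

    def word_war(minutes=10):  # 10 minutes is my usual first-draft sprint length
        start = int(input("Word count at the start? "))
        print(f"Go! Write for {minutes} minutes...")
        time.sleep(minutes * 60)  # more or less ignore everyone and work
        end = int(input("Time! Word count now? "))
        print(f"You wrote {end - start} words this sprint.")

    if __name__ == "__main__":
        word_war()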

I am, as of writing this, freaking out about being inundated with work. (Seriously, my thought process was: what is the easiest post I can write right now so I can check this off the list?) But I’m pleased to say that I wrote 700 words on a novel this week. Last week, it was 1,000. That’s… not zero. And that’s saying something. Likewise, I’ve already put more work into cleaning up some data into a presentable article this semester (in week 5) than I did in the entire previous semester. How?

Once a week, per project, I meet with some people and we declare our goals, write, and then declare how close we made it to those goals.

Moreover, I really treasure that time because at least at those times I know what I’m supposed to be doing. They’re anti-stress times even though they’re work times.

How do you make time to write? Have you tried “write-ins” or “proximal writing”? Do you have any other ways of making it a more productive time?

Easy Meringue Cookie Recipe

I’m a little inundated with work this week, but I did get in a little baking over the weekend (for a charity bake sale fundraiser, or I probably wouldn’t have gotten around to it). So I thought I’d share the recipe for my all-time favorite cookies!

These are not the actual cookies, but the swirl of the frosting here looks a little like them. I forgot to take a photo while I was making them. Sorry!
Image by Lukas via StockSnap

This recipe takes a little practice to get just right, but it’s super cheap, and great if you’re low on ingredients and need to make some cookies.

Be warned, though, it’s not a fast recipe. Prep time is short(ish), but baking time is looooooooong (we’re talking like 4 hours). If you need cookies quickly, make chocolate chip cookies instead.

Equipment:

  • Glass or metal mixing bowl (do not use plastic!)
  • A large decorating tip and decorating bag
  • Metal spoon and/or silicone spatula
  • Hand or stand mixer
  • Cookie sheets lined with parchment paper (paper bag paper can also work!)
  • Oven (225 F)

Ingredients:

  • 2 egg whites (you need to separate them yourself; don’t use purchased egg whites, because these are pasteurized and won’t fluff up right)
  • 1/4 tsp cream of tartar
  • 1/4 tsp salt
  • 1/2 tsp vanilla extract
  • 1/2 cup sugar

Directions:

  • Preheat the oven to 225 F. This won’t take very long 🙂
  • Prepare your other equipment. Make sure your bowl and all utensils are COMPLETELY grease-free. I often hand-wash with Dawn soap in this step just to make absolutely sure. Ready your decorating bag; there won’t be much time to do this later. Fold down the upper edge for easier loading.
  • Separate your eggs and put the egg whites in the glass/metal mixing bowl and let them come close to room temperature. I usually mix the yolks in with another egg or so and have an omelet, fried rice, or something else that can use some beaten eggs.
  • Add the cream of tartar, salt, and vanilla to the egg whites
  • Beat the egg whites until stiff peaks form. Not soft, but stiff. Your beaters should leave interesting ridges in the mixture, and the ridges shouldn’t soften immediately.
  • Gradually beat in the sugar, and then continue beating until you’re back to stiff peaks. You’ll need to work quickly now.
  • Using a spoon or spatula, gently and quickly transfer the entire mixture into your decorating bag. You may need to cover the opening of the tip while you work. Close up the top of the bag (just a twist should work, but some people like to clip it).
  • Pipe pretty cookie shapes onto the parchment paper. You can do shells, rosettes, or drops, for instance. Bear in mind these don’t really rise, so whatever you pipe is what you get. Have fun!
  • Bake the cookies in the oven at 225 F for 1.5 hours.
  • When the 1.5 hours is over, turn off the oven and open it a crack. This lets the cookies cool and dry slowly and results in better quality cookies.
  • Your cookies are done when the oven is at room temperature again.

Store your cookies in a dry place; moisture is the enemy of these cookies. But they are amazing–they just melt in your mouth and they have a very satisfying crunch.

Feel free to play with food dyes or with flavorings other than vanilla, too. This is the most basic form of the recipe, but as long as whatever you add doesn’t contain any kind of grease (use alcohol-based flavorings, for instance), it should work.

The Walled Garden

One day in graduate school, I told my thesis adviser (and later dissertation adviser) that I felt like I was being kept in a walled garden in the English department.

Your local English department, complete with English ivy.
Photo by Little Visuals via StockSnap

English as a discipline had appealed to me initially because it’s so diverse in its topics and methods of study. Here, I could borrow methods freely from nearly any field. Here, I was allowed to study nearly any medium. Here, I was allowed to look at any subsection of culture I could ferret out. Here I could do a project on, say, 19th century fiction, and then the next project on, say, the use of National Novel Writing Month as a pedagogical approach, and then another project on the representation of medieval cultures in video games, etc.

When I was an undergraduate, though, I had thrived on being able to take classes in any discipline. I took my statistics class through the philosophy department. I took on two minors. I delighted in the diverse core that was required of me.

But in graduate school, I was only taking classes in my own department, and I felt like that was what was expected of me. I loved my classes–everything I was learning was delightful and beautiful and useful. But I felt trapped.

There’s something I might need just over the other side of the wall, and I can’t even see what it is.

“I feel like I’m in a walled garden,” I said*. “Everything is pleasant inside, but I feel like there’s something I might need just over the other side of the wall, and I can’t even see what it is, much less get to it.”

“Then,” he told me, “take something outside the department.”

It was, dear reader, excellent advice. I started taking classes in the education department, for instance. Did you know they teach basically the same things? Probably not, because, well, departments generally don’t talk to each other very well at all.

Departments–and, by extension, disciplines–become basically walled gardens, with the occasional enterprising pollinator managing to fly over the wall.

And it becomes painfully obvious when you go to conferences and encounter papers claiming to be breaking new ground–ground that you had been reading about in interdisciplinary fields (such as game studies or medieval studies) for nearly a decade.

How many times do we have to invent the wheel before we know that wheels can turn?

My notes from one conference, where I attended nearly every panel on games and/or gamification I could find in the program, mostly read in large letters: “How many times do we have to invent the wheel before we know that wheels can turn?” This was, mind you, a conference for those in the field of English, not for those in game studies–who, of course, would be having much more dynamic discussions rather than trying to break ground on a thousand new foundations, but who are also vociferously interdisciplinary, generally not having departments to call their own.

Maybe if we can at least see through the fence?
Photo by Chris HP via StockSnap

I was reminded of this recently when I was looking through material on gamification in order to make sense of some rather surprising data I’ve been working with (more on that in future posts). Almost every discipline is talking about gamification, but almost every discipline is largely citing only its own work on it.

The walled garden is pleasant and familiar. It’s a nice garden. But it’s keeping us from seeing how the plants in our garden are like the plants in other gardens and, perhaps more importantly, what other plants might thrive in our own gardens.

When that same adviser asked me what kind of institution I would like to work at, some years later, I answered without hesitation: “One without an English department.”**

I don’t want to be trapped in a walled garden, even if it is a nice garden. I want to be in an open field, where pollinators, seeds, and critters can all come and go freely.

But for that to happen, maybe we all need to be focused less on publishing and more on listening. Less on tending our own gardens and more on appreciating other gardens that are just as nice.

*Note that this is a dramatic recreation of what was said. I have no memory if this is actually what I said. My memory kind of sucks, ok?

**Yes, I know I currently work in an English department. Yes, it’s a very good English department, one I think is exemplary for other departments in a lot of ways, a department which I will happily advocate for in most circumstances. But I still think it’s a little redundant for us to have an excellent English department, an excellent communications department, etc. etc.

My Cat’s Too Smart

This is Legend. She hunts.
All photos by the author for this post

Let me tell you a story about my cat Legend.

She caught a mouse. No idea where the mouse came from, but my house isn’t exactly airtight, and we’ve had mouse problems around the house before.

The face of retirement.

I have two cats, though, so it shouldn’t be a problem, right? One of them is a former feral tom (now thoroughly retired, mind you), so he should know his way around a hunt. But he’s actually pretty bad at hunting these days.

The little one, despite being a scaredy cat in nearly every way, is a surprisingly good huntress.

I wasn’t home at the time, so I just got some frantic messages from my fiance about Legend catching the mouse.

Catching it wasn’t the problem. The problem is that she doesn’t know what to do with it when she’s caught it. She’s very good at catching things. But then she lets them go so she can keep playing.

Legend, left, is ready to hunt (but not kill!). Porch Cat, right, will probably kill you if you disturb his nap, but that’s about it.

Reader, she let the mouse go.

Later that day I come home wondering if there’s still a mouse running around the house. So I say to my two cats, knowing that they’ll do ANYTHING for a Squeeze-Up treat (this is how I convince them to let me trim their claws), “A Squeeze-Up to the first one who brings me a mouse!”

Legend is on it and starts diving into her hiding spots. She brings me something gray and fuzzy–and for a moment I’m terrified that she’s actually done it.

Reader, this was the “mouse”:

That ominous shadow is Legend. She heard me put the toy on the ground. She’s ready to hunt again!

“No, that doesn’t count,” I tell her.

A few minutes later, she brings me another “mouse”:

No, your pink plushy mouse doesn’t count either.

I guess I needed to be more specific.

They and We: Ways We Talk About Students

One of the most important principles in my pedagogy is respect for students. Students are not a problem to be solved; they are complex human beings whom we are serving through pedagogy. It is not our job to impress upon them our own ways, but rather our job is to support them in becoming who they want to become using the wisdom we have from experience and training.

Lately I’ve been thinking–even in that last paragraph I just wrote–about the ways we talk about students when they aren’t present. Notice the division–We and They.

I’ve seen some people talking about the problem of referring to adult students–that is, most college students–as “my kids.” And I agree there are some problems with saying “kids,” even though I admit I fall into this trap frequently myself. Generally I think it’s meant affectionately, even as I concede that it infantilizes capable adults, but sometimes it’s meant curmudgeonly, as when you hear faculty complaining “Kids these days can’t…” (spoiler alert: they probably can).

But somewhat more subtle is this we/they distinction–and because it’s more subtle, it’s probably harder to address or solve.

We sit in our offices and write our lesson plans: “I will have them do…” “They will…” But they don’t get any say in it. And that seems odd. It treats students as the servants and us as the masters, when in reality we are the servants, because our goal is not to improve our own situations, but theirs. There’s something oddly colonial about it, as students become the mysterious Other, always “they,” collective and anonymous.

Students are hazy, faceless ghosts in our lesson plans
Photo by Alex Jones via StockSnap

In advice columns, whenever someone asks “How do I get them to understand?” the answer is always “You don’t. You can only control your own behavior.” So why do we, as teachers, behave like we can get them to understand?

I have no solutions or suggestions today. But it seems to me that learning is a collaborative enterprise. We say with ease the old familiar standby, that we learn more from our students than they learn from us. So why do we keep a wall (a desk?) between them and us? Why do we keep linguistic barriers between them and us?

The best solution I have is to see lesson planning as “We will…” but I don’t know how that might work. Perhaps a class in which, at the end of class, the whole class discusses together what the next goals should be?

What would a collaborative pedagogy of “we” not “they” look like?
Photo by Brodie Vissers via StockSnap

But these fancies seem incompatible with the realities of our syllabi, which must be completed absent the students and which assume that all classes are interchangeable, and with the bureaucracies that our syllabi feed.

How do you think we should be more respectful of students when discussing them when they’re not present? How do we make teaching more collaborative and less authoritarian?

One Big Reason Why Wedding Planning Sucks

I’m in the process of planning my wedding. There are some fun moments, but honestly? It kind of sucks. There are a lot of sucky parts of wedding planning. There’s a whole tangle of toxic sexist assumptions to work through, there’s the ridiculous marketing, and there’s a whole bunch of other problems. And I might talk about some of those here later. But today? Let’s talk about what I think may be the biggest reason why wedding planning sucks for most people:

Wedding dress shopping kinda sucks…
Image from the author’s collection

We don’t have enough big fancy parties.

But wait, you say, if planning a big fancy party sucks, why do you think the problem is that we don’t have enough of them?

Because I like wearing fancy dresses, that’s why 🙂

But in all seriousness, “because I like wearing fancy dresses” actually is a large part of why making fancy parties more normal for mainstream life would make planning a wedding suck less.

Wedding vendors will market to you by saying things like “This is the only chance you’ll have!” (One hopes), “If not for your wedding, when?” (Idk, Halloween?), or “You’ve waited your whole life for this!” (I have not). They’ll put a lot of pressure on the novelty of having a big party with all your friends and family where everyone dresses up fancy. And in some ways you’ll give in to that pressure, because when else will you have a chance to be so fancy?

So, let’s imagine a world in which you take turns with your friends planning, say, four events a year–one for each season. Have a winter ball! Have a fall masquerade! Have a summer promenade! Have a spring gala! You get to plan one, a friend plans another, etc. Did your cat turn ten? Throw a soiree!

In my experience, movie characters do things like this (because Plot!). But most people don’t live in that world. It’s a world we fantasize about an awful lot, though.

So, how would throwing fancy formal parties more often make wedding planning less painful?

To start with, it would bring down some of the cost of vendor services.

Vendors would need to charge less per event because they’d get more events overall. They’d also face more competition, since a larger demand for their services would draw in more vendors, so consumers would be able to be choosy about services.

On the other side of the equation, we’d all have a better idea of what we want and how much it should cost if we had some more experience with event planning before planning a wedding.

We would also have a clearer idea of what works and doesn’t work in such events. Many of my decisions right now are guided by the experience of having been my sister’s de facto wedding planner a few years ago. But, as the first of my family to marry, she didn’t have that advantage.

Experience helps us make better decisions. But most of us who are planning weddings have almost no relevant experience to draw on. We didn’t help our parents plan large parties as children, and we certainly aren’t planning formal parties with our friends now.

Sketching out ideas…
Sketch by the author’s mother.

Finally, if we make formal parties more normal, we’ll all feel a lot more comfortable getting formal. There are two ways this works:

Firstly, we’d be more used to wearing formal wear, so we’d know how to wear it. The bridal industry can upsell us accessories all over the place, because what do we know about which ones we need? Much of my knowledge of what kinds of undergarments are required to get the figure I want in my dress comes not from experience with parties, but from experience doing Renaissance fairs and trying to achieve historical silhouettes. But when I went dress shopping (once with my sister, once for myself), I noticed, both in the way the stylists at the bridal shops talked and in the way the other customers behaved, that, honestly, most of the women there had very little experience wearing formal gowns, and for most of them it was probably their first and last experience having something tailored to their body. How, then, should they know how they like things tailored? Or which accessories they’ll definitely need on the day of?

Secondly, it puts less pressure on us to choose The Dress.

My biggest conflict right now in figuring out a wedding dress (which I’m probably making, btw) is that I want All The Things. But some of the things I want, such as various ways to do sleeves, are mutually exclusive. This impulse comes from the cultural notion that this is the only time I get to dress up like this.

But what if I knew that, in a year, I could plan on having another formal gown of my choosing?

Fabric shopping: one place I do have a lot of experience!
Image by the author

Then I might say “Well, for this dress, I want these sleeves. I can try the others another time.” I might even cycle through formal gowns in such a situation, relying on some favorites that I know are fabulous on me. (Actually, I know this is true. I relied on about two dresses across all the dances in high school.)

The indecision comes from having too little experience, too many options, and too few chances, mixed with too much social pressure. That kind of high-stakes game is a perfect formula for regret or mental paralysis.

I find myself fantasizing about having a glamorous job in the entertainment industry just so I can take the pressure off having to have just one perfect dress in my lifetime, but instead go to a few galas and awards ceremonies each year wearing nice dresses. No other reason. Just so I can say “Well, maybe I can use that idea later instead,” so that I can focus on good design rather than trying to have it all at once.

And the historical view of wedding dresses, which were often simply the nicest dress you had among your formal dresses, would confirm that having more formal events makes wedding planning easier.

I understand some of the reasons–many of which are very good things!–why formal events have fallen by the wayside over the past century or so. Certainly removing a costly barrier to entry to social spaces is a good thing for opening up those social spaces, for instance. Although I like dressing up, I definitely agree with the newer “come as you are” approach to spaces such as church rather than the older “Sunday best” approach, in which these spaces were places to show off wealth and status. Likewise, removing the necessity to do formal wear for a lot of work-related or local social events helps remove some of the gender norms and make the space more inclusive in other ways.

However, dressing formally can be fun, and it can actually remove some of the stress of, say, gauging just how informal to be (if we set the dial to max, there’s no medium setting to misjudge). And in some ways, it can open up our spaces to more personal expression. We can still make these formal places welcoming places. Know a man who wants to come in heels and a sparkling slinky gown? Why not! A woman wants to wear a tux? Go ahead! We can play with themes, with seasonal activities, all the things.

Mostly, I want to have places to wear Nice Things. And I want other people to have Nice Things too.
