Back in February, I analyzed WordPress’s automated grammar checker, After the Deadline, by running some famous and well-regarded pieces of prose through it. I found the program lacking. What I wrote was:
If you have understood this article so far, you already know more about writing than After the Deadline. It will not improve your writing. It will most likely make it worse. Contrary to what is claimed on its homepage, you will not write better and you will spend more time editing.
I think my test of After the Deadline demonstrated its ineffectiveness, especially since I noticed that the program finds errors in its own suggestions. Talk about needing to heed your own advice…
A comment by one of the program’s developers, Raphael Mudge, however, got me thinking about what benefit (if any) automatic grammar checkers can offer. Mr. Mudge noted that the program was written for bloggers, so running famous prose through it was not fair. He is right about that, but as I replied, the problem with automated grammar checkers really lies with the confidence and capability of the writers who use them:
[The effect that computer grammar checkers could have on uncertain writers] is even more important when we think of running After the Deadline against a random sample of blog posts, as you suggest. While that would be fairer than what I did, it wouldn’t necessarily tell us anything. What’s needed is a second step of deciding which editing suggestions will be accepted. If we accept only the correct suggestions, we assume an extremely capable author who is therefore not in need of the program. As the threshold for our accepted suggestions lowers, however, we will begin to see a muddying of the waters – the more poorly written posts will be made better, but the better-written posts will be made worse. The question then becomes where do we draw the line on acceptances to ensure that the program is not doing more harm than good? That will decide the program’s worth, in my opinion.
As it turns out, after that review of After the Deadline, I was contacted by someone from Grammarly, another automated grammar checker. For some reason, they wanted me to review their program. I said sure, I’d love to, and then I promptly did nothing. In truth, I was sidetracked by other things – kids, work, beer, school, the NHL playoffs, more beer, and recycling. So much for that.
Now R.L.G. over at the Economist’s Johnson blog has a post about these programs and a short discussion of Ben Yagoda’s review of Grammarly at Lingua Franca, a Chronicle of Higher Education blog. I want to quickly review these posts and add to my thoughts about these programs.
First, R.L.G. rightly points out that “computers can be very good at parsing natural language, finding determiners and noun phrases and verb phrases and organising them into trees.” I’m happy to agree with that. Part-of-speech taggers alone are amazing, and they open up new ways of researching language. But, as he again rightly points out, “Online grammar coaches and style checkers will be snake oil for some time, precisely due to some of the things that separate formal and natural languages.”
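For the curious, here is roughly what that kind of tagging looks like in practice. This is a minimal sketch in Python using the NLTK library – my choice of tool for illustration, not anything R.L.G. or these grammar checkers are confirmed to use:

```python
import nltk

# One-time downloads of the tokenizer and tagger models
nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

sentence = "Online grammar coaches and style checkers will be snake oil for some time."

# Split the sentence into word tokens, then assign each one a part-of-speech tag
tokens = nltk.word_tokenize(sentence)
tagged = nltk.pos_tag(tokens)

for word, tag in tagged:
    print(f"{word}\t{tag}")
```

The tagger will happily label every word (“Online” as an adjective, “checkers” as a plural noun, and so on), and it does so impressively well. But notice what it doesn’t do: it says nothing about whether the sentence is any good. That gap between identifying structure and judging prose is exactly where the snake oil gets bottled.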
Second, Mr. Yagoda’s review of Grammarly is spot on. (I’m impressed by how much he was able to do with only a five-day trial. They gave me three months, Ben. Have your people call mine.) Not to take anything away from Mr. Yagoda, but reviewing these checkers is like shooting fish in a barrel because they’re pretty awful. A rudimentary understanding of writing is enough to protect you from their “corrections”. But it’s the lofty claims of these programs that make testing them irresistible to people like Mr. Yagoda and myself.
So who uses automated grammar checkers and who could possibly benefit from them? The answer takes us back to the confidence of writers. Obviously, writers like R.L.G. and Ben Yagoda are out of the question. As I noted in my comment to Mr. Mudge, the developer of After the Deadline, “a confident writer doesn’t need computer grammar checkers for a variety of reasons, so it’s the uncertain writers that matter. They may have perfect grammar, but be led astray by a computer grammar checker.” It’s even worse if we take into account Mr. Yagoda’s point that “when it comes to computer programs evaluating prose, the cards never tell the truth.”
We do not have computers that can edit prose, not even close. What we have right now are inadequate grammar checkers that may be doing more harm than good since the suggestions they make are either useless or flat-out wrong. They are also being peddled to writers who may not be able to recognize how bad they are. So there’s a danger that competent but insecure writers will follow the program’s misguided attempts to improve their prose.
It’s strange that Grammarly would ask Mr. Yagoda or me to review their program since Mr. Yagoda is clearly immune to the program’s snake oil charm and I wasn’t exactly kind to After the Deadline. But such bad business decisions might prove helpful for everyone. Respected writers will point out the inadequacy of these automatic grammar checkers, which will hopefully dissuade people from using them. At the same time, until these programs can really prove their worth – or at least not make their inadequacy so glaringly obvious – they will not receive any good press from those who know how to write (nor will they get any from lowly bloggers like me). In this case, any press is not good press since anyone reading R.L.G.’s or Ben Yagoda’s discussion of automated grammar checkers is unlikely to use one, especially if they have to pay for it.
[Update – Aug. 9, 2012] R.L.G. at Johnson, the Economist’s language blog that I linked to above, heard from Grammarly’s chief executive about what the program was meant for (“to proofread mainstream text like student papers, cover letters and proposals”). So he decided to put Grammarly through some more tests. Want to guess how it did? Check it.