As researchers, we hope that the quality of our research shines through when we make a presentation or submit an article for publication. However, how the quality of our research is perceived depends on the quality of our communication. I've been to a number of seminars and conference presentations where the underlying research might have been good, but the presentation was so bad it was difficult to tell. The same is true of research articles. As a journal reviewer, if I can't understand what has been done, then I'm much more likely to recommend that a submission be rejected.
So, quality of communication is important in research. But how important is it? That's the underlying question behind this new working paper by Jan Feld (Victoria University of Wellington), Corinna Lines, and Libby Ross (both plain language consultants at Write Limited). They undertook an experimental evaluation of the writing quality of articles written by economics PhD students. As they explain:
To find the causal effect of academic writing, we need to compare well-written papers with poorly written papers that are otherwise identical. This is what we do in this study.
We estimate the causal effect of writing quality by comparing how experts judge the quality of 30 papers originally written by PhD students in economics. We had two versions of each paper: one original and one that had been language-edited. The language editing was done by two professional editors, who aimed to make the papers easier to read and understand. We then asked 18 writing experts and 30 economists to judge some of the original and edited papers. Each of these experts judged five papers in their original versions and five papers in their edited version, spending around 5 minutes per paper. None of the experts saw both versions of the same paper. None of the experts knew that some of the papers were edited. The writing experts judged the writing quality and the economists judged the academic quality of the papers.
Feld et al. emailed PhD students and their supervisors at all eight New Zealand universities and invited them to participate. I honestly don't remember this invitation, but it arrived in the middle of pandemic-induced online teaching, when many things just passed me by. Anyway, they got a sample of 30 papers from 22 PhD students. The effect of the language editing is substantial when the papers are evaluated by writing experts:
Writing experts judged the edited papers as 0.6 standard deviations (SD) better written overall (1.22 points on an 11-point scale). They further judged the language-edited papers as allowing the reader to find the key message more easily (0.58 SD), having fewer mistakes (0.67 SD), being easier to read (0.53 SD), and being more concise (0.50 SD).
Feld et al. asked 30 Australian economists to evaluate the New Zealand PhD students' papers (so I didn't miss an invite to be a reviewer, at least!). The economists were less swayed by writing quality than the writing experts were, though:
Economists evaluated the edited versions as being 0.2 SD better overall (0.4 points on an 11-point scale). They were also 8.4 percentage points more likely to accept the paper for a conference, and were 4.1 percentage points more likely to believe that the paper would get published in a good economics journal.
Nevertheless, those results are statistically significant. Feld et al. also found a statistically significant decrease in economists' view of the probability that they would desk-reject the paper, and a marginally significant increase in their perception of the writing quality (as opposed to the paper quality, in the quote above). And in case you were worried that these results are purely subjective, Feld et al. include an objective measure (the Flesch-Kincaid readability score), and find that:
The language editing also affected the readability as measured by the Flesch-Kincaid grade level score. The introductions of edited papers have a readability score corresponding to grade level 14.7, compared to 15.3 of the introductions of original papers. This improvement of 0.6 grade-levels is statistically significant at the 1 percent level. For comparison, our introduction has a Flesch-Kincaid grade level score of 12.5.
As a matter of interest, the first paragraph of this post rates a grade level score of 10.8 (according to this online calculator). The whole post (excluding the indented quotations from the paper) rates a grade level score of 11.5.
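For readers unfamiliar with the metric, the Flesch-Kincaid grade level is just a weighted combination of average sentence length and average syllables per word. The sketch below (a rough Python approximation, with a naive vowel-group syllable counter, so its output will only approximate what the paper or the online calculator mentioned above would report) shows the standard formula:

```python
import re

def count_syllables(word: str) -> int:
    # Naive heuristic: count runs of consecutive vowels, with a floor of one.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    # Split into sentences and words using simple punctuation rules.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    # Standard Flesch-Kincaid grade level formula.
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)

sample = ("As researchers, we hope that the quality of our research shines "
          "through when we make a presentation or submit an article for "
          "publication.")
print(round(flesch_kincaid_grade(sample), 1))
```

The key point is that longer sentences and longer words both push the grade level up, which is why language editing aimed at shorter, plainer sentences shows up directly in this score.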
Importantly, when Feld et al. stratify their analysis into well-written papers (the top half of the original unedited papers, as assessed by the writing experts) and poorly written papers (the bottom half), the results are what you would expect. For poorly written papers:
All point estimates are statistically significant and large. For example, poorly written papers that have been language edited are judged 1.05 SD better written than poorly written papers in their original version.
That was the effect on the writing experts' judgment of quality. The economists rated the quality of the edited papers 0.29 standard deviations higher than the unedited papers (and 0.4 standard deviations higher in terms of writing quality). There were no statistically significant effects of the editing on the economists' evaluations of well-written papers.
So, writing quality matters. But my takeaway from the paper is that writing quality matters differently depending on the audience. If economists are trying to appeal to non-specialists, then writing quality matters a lot, and it matters both for high-quality and low-quality research. There is potentially a strong case to be made for economists (and no doubt, other researchers as well) to train in science communication. If economists can only speak to other economists, then the impact of our research on policy and practice is going to be somewhat limited.
On the other hand, if economists are trying to appeal only to other economists (rather than to professional writers or other non-specialists), then only papers with below-average writing quality seem to benefit from professional editing. This is an important finding for PhD students (whose papers made up the sample in this study). PhD students are typically trying to get their papers accepted for publication in economics journals, and will likely have economists as their thesis examiners. In those contexts, professionally edited writing is only of benefit for poorly written papers. For well-written papers, the quality of the research is already apparent to economist readers. However, in the New Zealand context, many of our PhD students are international students with English as a second (or third, or fourth) language.
It seems to me that there are definitely gains to be had for many students in having their PhD chapters (or articles) professionally edited. The challenge may be whether the PhD regulations allow it. For instance, the University of Waikato guidelines for proof-reading of theses suggest that professional editing is not allowed. It seems that, if any editing is to be done, it falls to the academic supervisors to do it. Perhaps these guidelines need reconsideration, given the potential gains in clarity of argument to be had, with no changes in substantive content (after all, the editors in this case were not academic experts). Students (and thesis examiners) would potentially benefit greatly from this.
[HT: Marginal Revolution]