Abstract
There are plenty of books and articles on research methods, but few discuss the nature and purpose of method sections in academic journals. Based on interviews with critical and interpretivist researchers, this short paper examines the nature and purpose of method sections in management and organization studies. We show how researchers make sense of, and struggle with, positivist expectations about the form and content of method sections. Ultimately, we call for greater openness about what method sections might look like and ask whether all academic articles need method sections.
Introduction
“Method” etymologically means “path” (from the Greek meta, “in pursuit of” and hodos, “path, road”). In research, a method is commonly understood as a path to knowledge. On a basic level, the method section in an academic journal article tells you about the path that researchers have taken to get to the results on which the paper reports. Of all the different functions a method section has, two stand out. First, knowing the path helps readers (editors, reviewers, fellow academics) decide whether they find the study’s results convincing. An unconvincing method section may result in methodological doubt—for instance, doubt that the path taken does indeed lead to the reported destination. Second, knowing the path enables other researchers to undertake the same journey. If researchers cannot reproduce the previous results with the same methods, then the original findings might be called into question.
Historically, method has been crucial for the social sciences. The social sciences draw on method for practical reasons (knowing how to proceed in the research process) and for legitimacy reasons (becoming, or at least seeming, as rigorous as the hard sciences). After the Second World War, US-based social sciences became aligned with the tradition of logical positivism, in particular the idea of the unity of science (Steinmetz, 2005). On this view, the social sciences would be just as scientific as medicine or the natural sciences if they followed the same scientific method: the hypothetico-deductive method. This method prescribes that all research starts by identifying or formulating a theory. Hypotheses are then deduced from this theory, and subsequently these hypotheses are empirically tested by means of an experiment or observation.
In the positivist tradition, the method section in academic journals is fundamental to the claim that “this is science.” On an implicit level, the method section reinforces the idea of the unity of science—the notion that all true knowledge can be derived from the application of a universal method, or a set of shared protocols. The method section not only tells the reader what the research process looks like, but also signals to the reader that the contents of the paper reach the standard of “scientific knowledge,” as opposed to any other kind of unsubstantiated claim.
But what about the method section in the qualitative tradition in the social sciences, that is, research that falls outside of the hypothetico-deductive method? Typically, critical and interpretivist research criticizes the positivist claim about the unity of science. In general, such a strand of research would not consider the social sciences to be “scientific” per se. The hypothetico-deductive method is one way of doing research, but there are other ways of doing research that are equally legitimate. A number of journals encourage heterodox, non-positivist research in management and organization studies (hereafter “MOS”), and Organization is one of the foremost examples. In the inaugural issue of the journal, for example, Organization’s editors call for “new analytical narratives” that reflect the changing landscape of the academic field, one that is less monolithic and more diverse (Burrell et al., 1994: 6). These new analytical narratives include not only novel theoretical approaches to the question of organization, but also unconventional modes of methodological engagement. For the last 30 years, Organization has remained at the forefront of experimenting with the style and content of academic articles (Parker and Thomas, 2011; Pritchard and Benschop, 2018). This is partly evident in the range of submissions that Organization welcomes, from activist-adjacent contributions (“Acting Up”) to polemical essays (“Speaking Out”). But it is also evident in Organization’s outright rejection of the hypothetico-deductive orthodoxy in some of its more traditional journal articles.
In deviating from the positivist tradition, interpretivist and critical scholarship has needed to re-invent the function of the method section. Yet, as we will see, it has done so in ways that cause discomfort, skepticism, and frustration among researchers. This leads us to wonder whether MOS researchers are truly able to escape from the underlying assumptions of positivism, despite more than 30 years of attempts.
The question we want to address is this: what is the point of method sections? This question has particular resonance for critical and interpretivist scholars in the field of management and organization studies, a field dominated by positivist scholarship that relies on hypothesis-testing. As critical and interpretivist researchers ourselves, we have our own answer to this question. But we were also intrigued to find out how other scholars in the same tradition make sense of method sections. Toward this end, the second author of this paper interviewed 11 academics about the nature and purpose of method sections. Each respondent has extensive experience as an author, reviewer, and editor. Unstructured and conversational, the interviews took place in person during the Academy of Management annual conference in Boston in 2019 (except for one interview, which took place online). All names have been changed.
The purpose of method sections
The positivist approach weighs heavily on the content and form of method sections for interpretivist and critical research in MOS, but it does so in complex and contradictory ways.
Method sections serve a dual purpose. First of all, the method section ideally ensures transparency and accountability in the research process:

The most basic level of methods section is sort of an account of how you did the research. . . It should serve a function of some degree of transparency. (Ben)

I’m quite fine with [method sections] being very descriptive: ‘this is what we did, and then we did this, and then we did this’. . . I want transparency. (Leon)
Here, the method section is meant to provide an honest account of how the research process was conducted—a role similar to the method section in the positivist tradition. If the methods are fully known, then the findings can be subjected to a “quality check” (Nils). In other words, the method section provides an “audit trail” (Monica) that will tell the reader how the conclusions were arrived at and whether the researcher’s claims hold up to scrutiny.
At the same time, the method section also has a “legitimacy effect” (Ben) for critical and interpretivist scholarship. This type of research requires a well-crafted description of method because it might otherwise be seen as unsubstantiated knowledge, or pseudo-science, from the dominant positivist perspective. As Christian notes, “attempts at legitimating knowledge through claims of our science are kind of important because nobody really takes this stuff seriously” – “nobody,” of course, referring to positivist management researchers. In effect, the method section is a way of saying, “‘this is a legitimate piece of science showing these canonical process’” (Ben). On this view, method sections increase the chances that critical and interpretivist research will be accepted in the broader field of management and organization studies.
In the positivist tradition, one of the aims of the scientific method is to legitimize social science precisely as a science—that is, to reach the same methodological standard as the natural sciences. But critical and interpretivist scholarship, of course, makes no such claims. Writing about methodology, Easterby-Smith et al. (2008) argue that “transparency” in interpretivist research means something different than it does in the positivist tradition. In the interpretivist tradition, a discussion of methods ought to depict the “open-ended, iterative, and contingent process” of research (Easterby-Smith et al., 2008: 424). The aim isn’t to produce a roadmap for scientific replication; the aim is to paint a rich, multifaceted portrait of what the research process looked like. The method section, in other words, ought to be just as messy as the social world it seeks to analyze.
In interpretivist research, the notion of “abduction” is sometimes used to capture the complex back-and-forth between theoretical considerations and empirical findings (Schwartz-Shea and Yanow, 2013). Such an approach stands in stark contrast to the neat linearity promised by both deductive and inductive methods. In practice, interpretivist research is even more chaotic and unpredictable than the notion of abduction suggests—it involves, we might say, finding a path through the mess. Given the nature of critical and interpretivist research, the key task is to capture this messiness in the method section of an academic article:

An ideal methods section. . . should communicate everything, from the mess and the practicalities of collecting the data. (Sam)

Methodology sections are really important in terms of surfacing the complex and inconsistent voices that we have within our work. (Bettina)
The implication, in these accounts, is that method sections ought to be upfront about the richness and complexity of the research process. Next, we will see why method sections often fail to capture the nuances of empirical work—and whether they are as transparent and reflexive as they claim.
An honesty problem
Method sections in interpretivist and critical research rarely give an accurate account of how the empirical work unfolded. Method sections typically present a “nice tidy account” (Nils) of research, even though the research process is usually “messy from beginning to end” (Nils). Sam elaborates:

The kind of coffee and corridor conversations that we have about this or that research project that went completely pear-shaped and had to be re-designed in the middle or even at the end – [these are] very rarely reflected in journal paper methods sections.
In journal articles, MOS researchers present an all-too-smooth account of their research process, one that fails to capture the twists and turns our scholarship takes on the road to publication. Consequently, method sections frequently fall short of transparency and accountability. Some of our respondents are troubled by this dishonesty:

Qualitative methods sections are done very poorly. . . [We’re] lying about what it is that we do and pretending that we knew what we knew at the end [of the research process], pretending that we don’t work as iteratively as we do. (Rose)

We have an honesty problem. . . We almost lie in the methods section to get published. . . The reviewer knows that you’re lying [and] you know that the reviewer knows that you’re lying, so it becomes a charade. (Leon)
A lie, by its very nature, is meant to deceive (it is not merely a statement that happens to be untrue). So the question is: who needs to be deceived, and why? On the face of it, the “lie” is directed at reviewers and editors—the gatekeepers who must be persuaded to accept the article as a legitimate contribution to the field. As Leon puts it, “We’re . . . doing methods sections to convince the reviewers, rather than to. . .tell the readers what happened.” A conventional method section, one that smooths out the rough edges, is simply “less risky for authors, less risky for reviewers, less risky for editors” (Nils). In other words, a method section that looks like any other method section is less likely to lead to the paper being rejected.
This is especially true for positivist-friendly journals like the Academy of Management Journal and other highly ranked, US-based (and US-centric) outlets. Editors and reviewers in such journals may be inclined toward a positivist presentation of methodology, even if the journal accepts more interpretivist research. In these journals, “the review process is likely to homogenize the methods section” (Sam) and result in a “formulaic” (Leon) description of the research process, mimicking the tone and style (if not the actual content) of positivist scholarship.
Yet even interpretivist MOS journals (like this one) suffer from the same honesty problem as their positivist counterparts. Here, the deception doesn’t arise from coating the method section with a surface-layer of scientific rigor. Rather, the deception arises from engaging in superficial kinds of methodological self-reflection:

You need to show that you’ve been reflexive. . . [but] a lot of methods sections that I’ve been reading recently show they’re reflexive by saying, ‘We have been reflexive’. . . which is not ‘reflexivity’ as I understand it. (Sam)
Treated as an obligatory passage point, reflexivity becomes little more than “virtue signaling” (Ben) or an empty “incantation of privilege” (Christian). Adding a dollop of reflexivity to the method section doesn’t necessarily result in a more transparent account of the research process; it may serve as another way to convince editors and reviewers that you are ticking the right boxes. To this extent, adding a few self-reflective remarks in a method section is the academic equivalent of “checking your hair in the mirror before you go out” (Christian).
The grip of positivism in MOS research has certainly loosened. But still, research in the critical and interpretivist tradition continues to churn out method sections that follow a familiar template, leading to scholarly discussions that are at worst misleading and at best plain dull. We now turn to possible ways forward.
The making of. . .
“What is the point of method sections?” is perhaps the wrong question to ask. It assumes that method sections, in their current form, exist for good reason in critical and interpretivist scholarship. But we’re not convinced that this is in fact the case. Method sections carry a lot of baggage with them, and this baggage contains positivist assumptions about the nature and purpose of research. So perhaps a better question for us to ask is this: how should we discuss our research methods and methodology, and where should this discussion take place?
One of our respondents makes an unfavorable comparison between the structure of journal articles and the structure of motion pictures:

Imagine a film where you get. . . the biggest action scene of the whole film at the beginning, and then it sort of goes back to the start. . . and then suddenly a third of the way through the film you go into a twenty-minute section on ‘this is how this film was made,’ and then you go to the big battle scene, and then to the denouement. . . That would just be the most unsatisfying film to watch in the world. (Ben)
The method section is an unwanted interruption of the story that the researcher is telling (“this is how the research was made”). We wouldn’t expect to find a making-of documentary in the middle of an action movie—that would ruin the flow and momentum of the narrative. So why, Ben wants to know, do we always find the method section sitting awkwardly in the middle of a journal article?
Of course, we’re academics, not film-makers. But a comparison between the craft of scholarship and the craft of cinema is instructive because it compels us to think about why method sections take the form they do. If MOS researchers are truly concerned about giving a true and accurate account of their methods, then perhaps we also need to break out of the confines of the conventional method section.
Our respondents propose a couple of alternatives to conventional method sections. One suggestion is to incorporate a discussion of methods in the main body of the article. As Rose suggests, “the best [method sections] are absolutely integrated into the story.” This would allow academics to write in a way that captures the true “messiness” of the research process without paying mere lip-service to issues around transparency or reflexivity.
History is one example of a discipline where reflections on method are often integrated into the narrative. Historical scholarship rarely includes a stand-alone method section, but this does not prevent historians from weaving methodological considerations—such as the challenges of archival research—into their empirical narratives. MOS researchers can learn a more eclectic way of approaching methodological questions from other disciplines, such as history, philosophy, ethnography, and media studies.
Another suggestion is to remove a discussion of methods from the article entirely and make it available elsewhere. To extend the metaphor of feature films, this might involve producing an academic equivalent of “DVD extras” or “Blu-ray special features”—additional material that can be accessed by curious readers. Nils proposes a “free and unencumbered methods section that can just be posted online. . .and that doesn’t have a word count on it,” so researchers can describe their research process without having to interrupt the empirical narrative.
A broader concern is how we describe our research process—that is, what our methodological reflections consist of. The exact content of our scholarly “making of” documentaries is by no means obvious. Take this paper as an example. The reader will notice that this paper does not have a method section, although we would have been more than capable of writing extensively about the gender composition of our respondents (seven men and four women, maybe you’d like to know more?), or about how we lost one interview because the second author’s recording device died in the rain (you probably don’t want to know the details). We could also have taken the opportunity to reassure you that we have been reflexive throughout the research process (with appropriate references, of course), and to inform you that one of the interviews took place in a bar (here’s an extract from the transcript: “Yes, I’d like a pale ale – what would you recommend?”). We might have continued by acknowledging the positionality of the authors (white middle-aged men with tenured positions at European universities), and how we reached saturation point with 11 interviews (that would be a lie, of course, but a lie we routinely accept). And so on.
We don’t want to suggest that all of the above is irrelevant to understanding the research process. But we do want to invite MOS researchers to think more about what we say about the research process, why we say it like we do, and whether the method section is really the best place for saying it. And we might go further in asking what is worth saying in the first place. For example, would this piece really be better if we had included a typical method section instead of a few lines here and there? We doubt that this is the case. A conventional method section may not add much to an article, and may in fact detract from its overall flow and coherence. If the aim is to stimulate thought, then we are unlikely to achieve this by giving a descriptive account of how one went about collecting and analyzing data. MOS scholars ought to make sure what they say about their methods is in the service of the paper itself, rather than in the service of reviewer demands and editorial expectations. There may be cases when a conventional method section is called for, but this should be the result of an informed choice—one guided by the needs of the paper—rather than the default position that we automatically adopt.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
