Several journals have already reacted to the p value debate. For example, an ASQ essay provides suggestions that deserve to be read not only by editors. Another example is the set of policies published by SMJ: SMJ “will no longer accept papers for publication that report or refer to cut-off levels of statistical significance (p-values)”. Instead, “authors should report either standard errors or exact p-values (without asterisks) or both, and should interpret these values appropriately in the text”. Furthermore, “[t]he discussion could report confidence intervals, explain the standard errors and/or the probability of observing the results in the particular sample, and assess the implications for the research questions or hypotheses tested.” SMJ will also require authors to “explicitly discuss and interpret effect sizes of relevant estimated coefficients”. It might well be that we are currently observing the beginning of the end of null-hypothesis significance testing. And it might only be a matter of time before other journals, including SCM journals, require authors to remove references to statistical significance and statistical hypothesis testing and, ultimately, to remove p values from their manuscripts.
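What such reporting could look like in practice can be sketched in a few lines of Python. The data below are simulated and purely illustrative (they do not come from any study discussed here); the point is simply to show an exact p value, a standard error, a confidence interval and an effect size reported together, without asterisks or significance cut-offs:

```python
import numpy as np
from scipy import stats

# Hypothetical, simulated outcomes for two groups (illustration only)
rng = np.random.default_rng(42)
treatment = rng.normal(5.3, 1.0, 40)
control = rng.normal(5.0, 1.0, 40)

t, p = stats.ttest_ind(treatment, control)  # exact p value, no threshold

# Effect size: Cohen's d with pooled standard deviation
n1, n2 = len(treatment), len(control)
pooled_sd = np.sqrt(((n1 - 1) * treatment.std(ddof=1) ** 2 +
                     (n2 - 1) * control.std(ddof=1) ** 2) / (n1 + n2 - 2))
d = (treatment.mean() - control.mean()) / pooled_sd

# Standard error and 95% confidence interval for the mean difference
diff = treatment.mean() - control.mean()
se_diff = pooled_sd * np.sqrt(1 / n1 + 1 / n2)
df = n1 + n2 - 2
moe = stats.t.ppf(0.975, df) * se_diff

print(f"difference = {diff:.3f}, SE = {se_diff:.3f}, "
      f"95% CI [{diff - moe:.3f}, {diff + moe:.3f}], "
      f"t({df}) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```

The reader can then judge the size and uncertainty of the effect directly, rather than being told only whether an arbitrary threshold was crossed.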
The p value debate has revealed that hypothesis testing is in crisis – also in our discipline! But what should we do now? Nature recently asked influential statisticians to recommend one change to improve science. Here are five answers: (1) Adjust for human cognition: Data analysis is not purely computational – it is a human behavior. So, we need to prevent cognitive mistakes. (2) Abandon statistical significance: Academia seems to like “statistical significance”, but p value thresholds are too often abused to decide between “effect” (favored hypothesis) and “no effect” (null hypothesis). (3) State false-positive risk, too: What matters is the probability that a significant result turns out to be a false positive. (4) Share analysis plans and results: Techniques to avoid false positives include pre-registering analysis plans and sharing the data and results of all analyses, along with any relevant syntax or code. (5) Change norms from within: Funders, journal editors and leading researchers need to act. Otherwise, researchers will continue to re-use outdated methods, and reviewers will demand what has been demanded of them.
Leek, J., McShane, B.B., Gelman, A., Colquhoun, D., Nuijten, M.B. & Goodman, S.N. (2017). Five Ways to Fix Statistics. Nature, 551 (2), 557-559. DOI: 10.1038/d41586-017-07522-z
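The false-positive risk in point (3) follows from a simple Bayes-rule calculation: among all "significant" results, how many come from true nulls? The sketch below uses conventional but illustrative assumptions (alpha = 0.05, power = 0.8); the prior probability of a real effect is the quantity researchers rarely think about:

```python
def false_positive_risk(prior_real, alpha=0.05, power=0.8):
    """Probability that a 'significant' result is a false positive,
    given the prior probability that the tested effect is real.
    Standard Bayes-rule calculation; parameter values are illustrative."""
    false_pos = alpha * (1 - prior_real)  # null is true, yet test is significant
    true_pos = power * prior_real         # effect is real and test is significant
    return false_pos / (false_pos + true_pos)

# With only a 10% prior chance of a real effect, p < 0.05 is far from proof:
print(round(false_positive_risk(0.10), 2))  # prints 0.36
```

In other words, under these assumptions more than a third of "significant" findings would be false positives, even though the nominal error rate is 5%. This is why stating the false-positive risk alongside the p value matters.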
Academics and students often have very different ideas in mind when they talk about case study research. Indeed, case studies in SCM research are not all alike, and several distinct case study research designs can be distinguished. A recent article by Ridder (2017), titled The Theory Contribution of Case Study Research Designs, provides an overview of four common approaches. First, there is the “no theory first” type of case study design, which is closely connected to Eisenhardt’s methodological work. The second type of research design is about “gaps and holes”, following Yin’s guidelines. This is perhaps the type of case study design seen most often in SCM journals. A third design deals with a “social construction of reality”, as represented by Stake. Finally, the reason for case study research can also be to identify “anomalies”, an approach represented by Burawoy. Each of these four approaches has its areas of application, but it is important to understand their unique ontological and epistemological assumptions. A very similar overview is provided by Welch et al. (2011).
Ridder, H.G. (2017). The Theory Contribution of Case Study Research Designs. Business Research, 10 (2), 281-305. DOI: 10.1007/s40685-017-0045-z
There has been a recent trend in several management disciplines, including supply chain management, to create knowledge by systematically reviewing the available literature. So far, however, our discipline has lacked a “gold standard” that guides researchers in this endeavor. The Journal of Supply Chain Management has now published our new article, Durach, Kembro & Wieland (2017): A New Paradigm for Systematic Literature Reviews in Supply Chain Management. Our systematic literature review process follows six steps: (1) develop an initial theoretical framework; (2) develop criteria for determining whether a publication can provide information regarding this framework; (3) identify literature through structured and rigorous searches; (4) conduct a theoretically driven selection of literature and a relevance test; (5) develop two data extraction structures, integrate data to refine the theoretical framework, and develop narrative propositions; and (6) explain the refined framework and compare it to the initial assumptions. We believe that these best-practice guidelines, although developed for the SCM discipline, can also serve as a blueprint for adjacent management disciplines.
Durach, C.F., Kembro, J. & Wieland, A. (2017). A New Paradigm for Systematic Literature Reviews in Supply Chain Management. Journal of Supply Chain Management, 53 (4), 67-85. DOI: 10.1111/jscm.12145
“Scale purification” – the process of eliminating items from multi-item scales – is widespread in empirical research, but studies that critically examine the implications of this process are scarce. In our new article, titled Statistical and Judgmental Criteria for Scale Purification, we (1) discuss the methodological underpinning of scale purification, (2) critically analyze the current state of scale purification in supply chain management (SCM) research, and (3) provide suggestions for advancing the scale purification process. Our research highlights the need for rigorous scale purification decisions based on both statistical and judgmental criteria. We suggest several methodological improvements. In particular, we present a framework to demonstrate that the justification for scale purification needs to be driven by reliability, validity and parsimony considerations, and that this justification needs to be based on both statistical and judgmental criteria. We believe that our framework and additional suggestions will help advance knowledge about scale purification in SCM and adjacent disciplines.
Wieland, A., Durach, C.F., Kembro, J. & Treiblmaier, H. (2017). Statistical and Judgmental Criteria for Scale Purification. Supply Chain Management: An International Journal, 22 (4). DOI: 10.1108/SCM-07-2016-0230
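The statistical side of such purification decisions can be illustrated with a short sketch (the judgmental side, of course, cannot be coded). A common reliability-based criterion is Cronbach's alpha together with alpha-if-item-deleted; the Likert-style responses below are invented purely for illustration:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def alpha_if_deleted(items):
    """Alpha after dropping each item in turn -- a purely statistical
    criterion that must be weighed against judgmental ones (e.g. whether
    the item is conceptually central to the construct)."""
    k = items.shape[1]
    return [cronbach_alpha(np.delete(items, j, axis=1)) for j in range(k)]

# Hypothetical 5-point Likert responses (6 respondents, 4 items);
# the fourth item deliberately runs against the other three:
data = np.array([
    [4, 5, 4, 2],
    [3, 4, 3, 5],
    [5, 5, 4, 1],
    [2, 3, 2, 4],
    [4, 4, 5, 2],
    [3, 3, 3, 3],
])
print(round(cronbach_alpha(data), 2))              # poor reliability for the full scale
print([round(a, 2) for a in alpha_if_deleted(data)])  # dropping item 4 lifts alpha sharply
```

The statistics alone would suggest purging item 4, but, as the article argues, such a decision should only be made after also applying judgmental criteria: an item that improves alpha may still be conceptually indispensable, and vice versa.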
You should all read this interesting article: Approaching the Conceptual Leap in Qualitative Research by Klag & Langley (2013), which is useful for researchers who build theory from qualitative data. Its central message is “that the abductive process is constructed through the synthesis of opposites that [the authors] suggest will be manifested over time in a form of ‘bricolage’.” The authors use four dialectic tensions: deliberation—serendipity, engagement—detachment, knowing—not knowing, social connection—self-expression. One pole of each dialectic has a disciplining character, while the other has a liberating influence: On the one hand, overemphasizing the disciplining poles “may result in becoming ‘bogged down’ in contrived frameworks (deliberation), obsessive coding (engagement), cognitive inertia (knowing) or collective orthodoxy (social connection)”. On the other hand, overemphasizing the liberating poles “can also be unproductive as researchers wait for lightning to strike (serendipity), forget the richness and nuances of their data (detachment), reinvent the wheel (not knowing) or drift off into groundless personal reflection (self-expression)”.
Klag, M. & Langley, A. (2013). Approaching the Conceptual Leap in Qualitative Research. International Journal of Management Reviews, 15 (2), 149-166. DOI: 10.1111/j.1468-2370.2012.00349.x
Like it or not: Our discipline is very much dominated by positivism and the application of the scientific method, which assumes that new knowledge can be created by developing and testing theory or, in other words, by induction or deduction. Another type of inference is abduction. Spens & Kovács (2006) present an overview of the deductive, inductive and abductive research processes.
Spens, K. & Kovács, G. (2006). A Content Analysis of Research Approaches in Logistics Research. International Journal of Physical Distribution & Logistics Management, 36 (5), 374-390. DOI: 10.1108/09600030610676259