Read the Editorial by the new Scandinavian Journal of Statistics editors, Sangita Kulathinal, Jaakko Peltonen and Mikko J. Sillanpää, from the first issue of 2022.
The Scandinavian Journal of Statistics (SJS) was founded in 1974 and has since evolved with the field of statistics. These changes over time have made SJS a truly international statistical journal, attracting submissions from around the world and providing valuable research articles to the global statistics community. By tradition, SJS’s publication process involves a thorough review of each submitted manuscript. Manuscripts are first reviewed by the editors, after which they are handled by an associate editor with expertise related to the manuscript and evaluated by subject-matter experts acting as reviewers. The board of associate editors consists of dedicated international experts in their respective fields, covering a wide range of statistical research areas and expertise. This publication process ensures the publication of high-quality research articles.
We took up editorial duties for new submissions already in July 2021. It has been a great pleasure to learn the day-to-day functioning of SJS from the outgoing editors Håkon K. Gjessing and Hans J. Skaug. These interactions, together with the Wiley support team, have made the start of this journey smooth. We would like to thank them and hope that they will continue to extend their valuable guidance to us.
We are committed to maintaining the current strong standing of SJS. Our wish as editors is to increase the visibility and impact of SJS even further. We encourage and value high-quality submissions equally from Scandinavia and the Nordic countries and from elsewhere.
We will constantly strive to serve the diverse and ever-growing field of statistics by attracting research that balances theory and innovative applications. Submissions focusing on methodological and theoretical development, as well as applied statistical work grounded in strong statistical methodology, are welcome. We also expect to feature special issues or special sections on timely topics.
We will continue to follow the strong tradition of the SJS publication process. The world-class expertise of our associate editors is the key component in ensuring the high quality of SJS research articles. We thank the associate editors for their excellent and continuing service to SJS. We continuously monitor the need for new expertise and will invite new associate editors as required. We have already invited seven new associate editors, who joined the board in 2021, and would like to extend them a warm welcome. Moreover, numerous expert reviewers have made invaluable contributions to the quality of the journal, and we are indebted to them for their commitment.
Since the 18th century, modern statistics has evolved from a collection of ad hoc methods for handling data and assorted mathematical tools. In the 21st century, we still ask questions similar to those Daniel Bernoulli asked in the 1700s about vaccination against smallpox, and we use mathematical and statistical modeling to answer them. Statistics has provided key tools for managing belief and uncertainty: one’s own prior belief about the question at hand cannot be taken as the ultimate truth but needs to be updated when new information comes in. This updating process synthesizes the prior beliefs with new observations to yield statistical evidence. Even after all available information has been taken into account, uncertainty remains; hence, the outcomes of potential actions should have probabilities attached to them. In the early 1900s, Karl Pearson had already emphasized methods for measuring and expressing uncertainty in conclusions.
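In its standard formulation as Bayes’ theorem, this updating step combines the prior belief $p(\theta)$ with the likelihood $p(y \mid \theta)$ of new observations $y$ to give the posterior

\[
  p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{\int p(y \mid \theta')\, p(\theta')\, \mathrm{d}\theta'} \;\propto\; p(y \mid \theta)\, p(\theta),
\]

which synthesizes prior belief with the new data; the remaining spread of $p(\theta \mid y)$ is precisely the uncertainty that persists after all available information has been taken into account.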
With rapid advances in measurement, sensor, and instrumentation techniques, prodigious amounts of data, in both volume and variety, have been generated. The rise of big data further necessitates statistical methods with new characteristics and has shifted the focus of current statistical development toward computational aspects. In particular, scalability and the simultaneous integration of information from multiple data sources in complex joint models have become increasingly important in modern statistics. Moreover, new areas of application are emerging, and with them a demand for sound statistical methods.
As a consequence of the new wave of, and hype around, methods in artificial intelligence (AI), their use, especially in nonlinear prediction problems, has increased substantially. The availability of massive data and the development of flexible computational approaches have not eliminated the need for proper statistical treatment of uncertainty and generalization. Rather, these developments have created a need for statistical modeling in ever higher-dimensional spaces and with more detailed models, supporting an increasing variety of predictive tasks. Statistical approaches in machine learning and data analytics benefit from developments in the underlying statistical machinery and theory, but they also require new considerations, such as an increased focus on robustness and efficiency. Several useful algorithms developed in the late 1900s and early 2000s still lack probabilistic justification; the main difficulty has been in constructing probability models in high dimensions. Hence, algorithmic development has grown rapidly, often without a strong footing in statistical inference frameworks. Statisticians must work to narrow the gap between algorithmic development and statistical inference. It is probabilistic methods that will provide the foundation for such advances.
The interest of the scientific community has always been in the interpretation of findings after statistical analysis. With the rise of complex models, interpretability has become a pressing concern. There is a strong demand for understanding the statistical roots of “black box” methodologies as well as for developing new explainable and interpretable AI methods. Often, explainability is obtained by building, in close dialogue with the black-box method, parallel statistical models that are more traditional and more readily explained.
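A minimal sketch of this parallel-model idea, assuming a random forest standing in for the black box and a shallow decision tree as the readable companion (all choices here, including the simulated data, are purely illustrative and not a prescribed workflow), is to fit the simple model to the predictions of the complex one and inspect how faithfully it reproduces them:

# Illustrative sketch: approximate a black-box predictor with a small,
# interpretable surrogate model (a shallow decision tree).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor, export_text

# Simulated data stand in for a real application (assumption for illustration).
X, y = make_regression(n_samples=500, n_features=5, noise=10.0, random_state=0)

# "Black box": a flexible predictive model.
black_box = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Parallel explainable model: fit a shallow tree to the black box's predictions.
surrogate = DecisionTreeRegressor(max_depth=3, random_state=0)
surrogate.fit(X, black_box.predict(X))

# How faithfully does the simple model mimic the black box?
fidelity = surrogate.score(X, black_box.predict(X))
print(f"Surrogate fidelity (R^2 against black-box predictions): {fidelity:.2f}")

# The fitted tree's rules are themselves a readable account of the black box.
print(export_text(surrogate, feature_names=[f"x{j}" for j in range(X.shape[1])]))

The surrogate’s fidelity score and printed rules give one concrete, if limited, handle on what the black box has learned; a richer dialogue between the two models is of course possible.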
Next comes the ever-debatable question of unifying the two statistical paradigms, frequentist and Bayesian. Throughout the history of statistics, the rationale behind statistical methods has been frequentist, Bayesian, or a combination of the two. But the question in front of us is: do we need yet another school of reasoning to answer the challenging and complex questions of today?
All of these recent developments will influence the future publications of SJS. The hallmarks of a good manuscript, such as novelty, impact, clarity, consideration of previous work, sufficient theoretical support, and experimental validation, remain even in the era of computer-age statistical inference. Three areas of interest continue to be relevant, namely, statistical design, statistical modeling and methods, and statistical evidence and interpretation. The rising needs of modern statistical applications should also be reflected in manuscripts, including consideration of scalability and computational aspects, and rigorous benchmarking of new methods against other recent methodologies.
In closing, we are excited about our roles as editors of SJS and encourage the statistics community to use SJS as a venue for sharing their best research. We look forward to your submissions over the coming years.