This year the Royal Statistical Society has awarded the Greenfield Industrial Medal to Professor Ron Kenett. The award is named after Tony Greenfield and was founded in 1991. It ‘is aimed at encouraging, and promoting the recognition of, the application of statistical methods to industrial processes. It is awarded for contributions to the effective application of statistical methods to the manufacturing and allied industries. Submissions for the award may be supported by published papers and/or industrial reports for which publication in whole or in part is not restricted. The emphasis will be on effective application’ (Royal Statistical Society).
Yesterday Professor Kenett completed his Presidency of the Israel Statistical Association. StatisticsViews.com talks to Professor Kenett about his Presidency of the ISA and of ENBIS, his business KPA, his statistical contributions to business and industry, working with Bill Hunter and George Box and the future of teaching statistics.
1. Congratulations on being awarded this year’s Greenfield Industrial Medal by the Royal Statistical Society for your ‘extensive work in the development and application of statistics in business and industry, and in recognition of (your) significant and sustained contributions as author, teacher and active practitioner.’ When and how did you first become aware of statistics as a discipline?
In September 1971, at the Old Huxley Building in South Kensington (which is now part of the V&A Museum), with Professor David Cox and Ms Snell teaching the Introduction to Statistics course, which was part of the mathematics undergraduate curriculum at Imperial College. At the time, I assumed that it was usual for senior professors to introduce young students to the subject. I found out later that Sir David was a unique example. I call his approach, which for me proved very effective, the Cox Model (of education…). In the third-year advanced courses on decision theory (Rodney Coleman) and experimental design (Ms White), Professor Cox ran the tutorial sessions, where we solved exercises and discussed problems in small groups. This is what created my interest in the field.
2. Yesterday was your last day as President of the Israel Statistical Association. What were your main priorities and objectives whilst President, especially with regard to this year being the International Year of Statistics?
Indeed, I am now Past President. The ISA was formed in 1975, with Roberto Bachi as its first president. Louis Guttman followed Bachi, and I was the 19th president of the society. The ISA's stated objective is to promote the methods and applications of statistics in Israel. During my two-year term we organised 16 events on topics including biostatistics, risk models, quality, data mining and statistical methodology. The goal was to reach a wide audience of statistics practitioners and put statistics in the driving seat as a discipline that can provide leading-edge tools and methods. In many disciplines today, people heavily use and develop statistical methods without declaring themselves statisticians. We wanted to change that. The International Year of Statistics provided many opportunities to promote such activities.
3. How do you think the Israel Statistical Association has evolved during your time there, overall, and adapted to the changing needs of the statistical community?
Organisations based on volunteers usually have difficulty adapting to changing needs. My approach was to involve as many members as possible and provide full visibility into the decision making of the executive committee. I had a similar experience as President of the European Network for Business and Industrial Statistics (ENBIS) where, during my term, we moved from an autocratic, centralised organisation to an open society with full visibility of the financial figures for the membership and active involvement of the executive committee. There, I was the 7th president, and the change I instituted was not trivial to implement. We found that the official papers used at the time of ENBIS's registration had to be redrafted, and basic entities, like an audit committee, had to be formed. For long-term survival and growth, volunteer-based organisations need to be inclusive and open to accepting a variety of contributions. On the other hand, members in office must recognise the responsibility they were given by being elected and must ensure continuity and proper handover of documents and information. Unfortunately, in such societies there is nothing to prevent people from acting irresponsibly, and some have.
4. You have an extremely impressive career path including being Director of Statistical Methods for Tadiran Telecommunications Corporation, researcher at Bell Laboratories, Professor of Management at SUNY and Professor in the De Castro Center for Applied Mathematics and Statistics and the University of Turin in Italy. What are your memories when you look back on your times in these roles and what were your main achievements?
The common denominator in all this is a lifelong interest in the challenge of solving real-world problems with mathematical tools and statistical thinking. To me, this combination is what makes statistics such a fascinating discipline. Our science feeds on problems from a variety of disciplines, and we must be able to interact with others in order to be relevant. Sometimes the solution involves something as basic as asking: what is the problem? After getting a PhD in mathematics from the Weizmann Institute, I took a position in the Department of Statistics at the University of Wisconsin-Madison.
My neighbour in the adjacent room was Bill Hunter. He told me once that a couple of chemists came to meet him for statistical advice and he simply asked them: what is the problem? They sat in his room arguing for an hour, figuring out that they had different views and, after clarifying what the problem was, they both realised that it was not really a problem anymore. They left after thanking him for a very effective consulting session. Bill knew how to practise active listening and provide an opportunity for thinking objectively. I learned a lot from people like Bill Hunter and George Box and was lucky enough to have been given the opportunity to apply some of it.
At Tadiran, in the mid-1980s, we were the first in Israel to implement Statistical Process Control and process improvement in an electronic assembly line. Using designed experiments, we reduced solder defects after wave soldering by a factor of 10,000 (yes, ten thousand), with dramatic impact on lead time, quality and costs. What we did was later emulated in hundreds of companies, so the impact was huge. At SUNY Binghamton, in the 1990s, I started a quality management program which received a special award from General Electric. We developed unique simulators that allowed for hands-on training of university students and employees of many companies in the neighbourhood, such as IBM, General Dynamics, Singer-Link, Universal Instruments and many others. We learned then that teaching statistics requires a different set-up from teaching mathematics. It is at Binghamton that I started working on Modern Industrial Statistics: Design and Control of Quality and Reliability with Prof Shelley Zacks. The book was eventually published in 1998 by Duxbury Press, followed by Spanish and Chinese editions. We are now working on a revised and expanded second edition, called Modern Industrial Statistics: with applications using R, MINITAB and JMP, to be published by Wiley, hopefully in 2013. I started teaching and supervising students at the University of Turin in 2002. The university was established in 1404 and now has 67,000 students. Together with my colleague and friend there, Professor Roberto Corradetti, we carried out many fascinating projects in applied statistics, including a competition on how to best analyse customer survey data, which we called the Customer Satisfaction Surveys Olympics.
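As a minimal illustration of the kind of two-level factorial analysis used in such process improvement work, the R sketch below fits a full 2^3 design; the factor names (preheat temperature, conveyor speed, flux density) and the simulated defect counts are hypothetical and are not the actual Tadiran wave-soldering experiment.

```r
# Illustrative sketch only: a full 2^3 factorial design with hypothetical
# wave-soldering factors. Data are simulated, not from the Tadiran study.
set.seed(1)

design <- expand.grid(preheat = c(-1, 1),   # coded low/high levels
                      speed   = c(-1, 1),
                      flux    = c(-1, 1))

# Simulated defect counts per 1000 boards, with a strong flux main effect
design$defects <- 50 - 15 * design$flux - 5 * design$preheat +
  3 * design$preheat * design$flux + rnorm(8, sd = 2)

# Estimate main effects and two-way interactions with a linear model
fit <- lm(defects ~ (preheat + speed + flux)^2, data = design)
summary(fit)

# The largest (absolute) coefficients point to the process settings
# that most affect the defect rate.
```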
5. In being awarded the Greenfield Industrial Medal, it was also noted that you have helped draw attention to statistical methods in business, such as Six Sigma quality systems, risk management and surveys of employees. How have you employed statistical methodology in your consulting?
What you refer to are examples of the challenge of solving real-world problems with mathematical tools and statistical thinking. Take, for example, Six Sigma, or what is now often called Lean Sigma. Some people refer to it using a mathematical definition, which works out to 3.4 defects per million opportunities for a process whose mean sits six standard deviations from the nearest specification limit (allowing for the conventional 1.5 standard deviation shift in the process mean). In reality it is much more than that. It is a strategy for making your company more competitive by improving products and processes and reducing the cost of waste. Quoting the late Robert Galvin, past chairman of the board of Motorola Inc., from the foreword of the 1998 edition of Modern Industrial Statistics: “At Motorola we use statistical methods daily throughout all of our disciplines to synthesize an abundance of data to derive concrete actions… How has the use of statistical methods within Motorola Six Sigma initiative, across disciplines, contributed to our growth? Over the past decade we have reduced in-process defects by over 300 fold, which has resulted in a cumulative manufacturing cost savings of over 11 billion dollars.” Again, I was fortunate enough to see this happen in several organisations with which I worked. They demonstrated how to improve their competitiveness by applying mathematical models and statistical thinking.
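As a worked illustration of the arithmetic behind the 3.4 figure, the short R snippet below computes defects per million opportunities from the normal distribution under the conventional 1.5 standard deviation shift assumption; it is a sketch of the textbook calculation only, not a statement about any particular process.

```r
# Defects per million opportunities (DPMO) for a process whose mean sits
# sigma_level standard deviations from the nearest specification limit,
# allowing for the conventional 1.5-sigma long-term shift in the mean.
dpmo <- function(sigma_level, shift = 1.5) {
  1e6 * pnorm(-(sigma_level - shift))   # one-sided tail beyond the limit
}

dpmo(3)   # roughly 66,800 defects per million
dpmo(6)   # roughly 3.4 defects per million

# Without the 1.5-sigma shift, a six-sigma process would show about
# 0.001 defects per million on one side:
1e6 * pnorm(-6)
```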
6. As a university professor, what do you think the future of teaching statistics will be? What do you think will be the upcoming challenges in engaging students?
The classical “Introduction to Statistics” courses have, in general, done much harm to the profession. They are usually taught by graduate students with no practical experience who focus on the mathematical properties of the tools without conveying any aspect of statistical thinking. One should have a minimum of three years of practical experience before being allowed to teach statistics, especially the introductory courses. The future must be different. We now need to address new data structures and “big data”. Modern statistical software allows for visual exploration of data through dynamic linking and multivariate views. We also need to focus on teaching concepts as opposed to techniques, and to assess students' level of understanding rather than simple problem-solving skills obtained through memorisation. I have been collaborating on this fascinating topic with Professor Uri Shafrir from the University of Toronto. Uri developed a unique approach for evaluating “deep understanding” based on concept science. He calls it MERLO (Meaning Equivalence Reusable Learning Objects). Also worth mentioning is the growing interest in distance education and massive open online courses (MOOCs). In the fall of 2011, Stanford University launched three MOOCs, each with an enrolment of about 100,000. If the level of understanding of MOOC participants is not managed, the approach will be short-lived. This impacts both what is taught in statistics and how it is taught. New courses will need to be designed to support these changes.
7. Over the years, how has your teaching, consulting, and research motivated and influenced each other? Do you get research ideas from statistics and incorporate your ideas into your teaching?
You are hitting the nail right on the head. The three elements of teaching, consulting and research feed each other. Teachers who do not do consulting, or consultants who do not do research, are, in my opinion, handicapped. Please keep in mind that my view of statistics is based on such an interaction. There are actually very few platforms that facilitate such interactions. ENBIS is one example, where attendance at the annual conferences is balanced between academia, business and industry. To make this happen, everyone has to be willing to learn from others. Researchers must be willing to learn from speakers presenting case studies, and practitioners must be open to hearing theory with no immediate relevance to the problems they work on. This approach led me to work on the concept of Information Quality (InfoQ), which assesses the utility of an analysis of a given data set with respect to specific goals. When I teach a course on statistical methods I usually start by referring to InfoQ, so that students don't forget that the objective is to generate information and knowledge, and not just to produce a report with a t-test.
8. You have authored over 160 publications including Modern Analysis of Customer Surveys with Applications using R, Statistical Methods in Healthcare, and Operational Risk Management: A Practical Approach to Intelligent Data Analysis for Wiley. Is there a particular article or book that you are most proud of?
You will be surprised. Joseph Juran asked me to contribute a chapter to a book he edited on the History of Managing for Quality (Quality Press, 1995). The second chapter in that book, which I am very proud of, is titled “Managing for Quality in Ancient Israel”. To prepare it I spent almost a year researching biblical and Talmudic texts and found fascinating examples of applications of statistics, quality improvement, specifications, six sigma quality, etc. The papers I am particularly proud of include the 2013 paper with Galit Shmueli on InfoQ in the Journal of the Royal Statistical Society, Series A; the 2012 paper entitled “On Assessing the Performance of Sequential Procedures for Detecting a Change” in Quality and Reliability Engineering International; the 2011 paper on “Modern Analysis of Customer Surveys: comparison of models and integrated analysis” in Applied Stochastic Models in Business and Industry; and, going back in time: “Two Methods for Comparing Pareto Charts”, Journal of Quality Technology (1991), “On Sequential Detection of a Shift in the Probability of a Rare Event”, Journal of the American Statistical Association (1983), and “A Test for Detecting Outlying Cells in the Multinomial Distribution and Two-Way Contingency Tables”, Journal of the American Statistical Association (1980). Each of these papers made a theoretical contribution based on observations derived from my consulting activity.
9. You are also Editor-in-Chief of the Wiley Encyclopedia of Statistics in Quality and Reliability. What are the main challenges that come with this responsibility?
Here I must give due credit to David Hughes from Wiley. At an early stage, David took the three editors-in-chief through a three-day training session that taught us what an encyclopaedia should look like in terms of format and cross-referencing. This, together with the excellent project management support we got, made the project doable. It was described as “the major reference work for industrial statistics in academia, industry and services in the 21st century.” The task of putting together a four-volume document with 2,173 pages containing 431 articles by 415 international experts was colossal. We started by mapping the body of knowledge of statistics in quality and reliability. We identified 12 domains and potential domain editors. We then discussed with the domain editors the topics to be included in their sections, leaving them room for the final decision while keeping the whole project coherent. The fact that the Encyclopaedia is also available in electronic format is a clear advantage and allows interested parties to prepare a tailored set of entries in the context of a course or workshop. A spin-off of the Encyclopaedia is the edited volume titled Statistical Methods in Healthcare, in which some of the healthcare-related entries in the Encyclopaedia were expanded into full-fledged chapters.
10. You are now CEO of KPA – please could you tell us more about your role at KPA and your work there?
I created KPA with a partner twenty years ago. Initially we focused on quality management consulting with an emphasis on the use of data and proper statistical analysis. In 1997 we formed a start-up called Babylon with an entrepreneur who developed a technology allowing point-and-click translation between 32 languages, using advanced OCR (optical character recognition) technology adapted for digital monitors. My partner took up the management of Babylon, which was later acquired by a venture capital fund and eventually listed as a public company. There are now over 150 million users of Babylon worldwide. Since 1997 KPA has grown steadily to a team of 30 professionals. Two years ago we rebranded the company towards generating insights through analytics and, among other things, started offering the technologies we developed as cloud-hosted web services. We provide our customers with a range of services including customer surveys, predictive analytics, industrial statistics, risk models, employee surveys, biostatistics and Six/Lean Sigma process improvement facilitation. A particularly meaningful offering is a cloud-hosted web service called SPClive365 that allows metalwork subcontractors of global firms to upload data directly from their CMM (coordinate measuring machine) systems and get a live, online control chart with special reports listing trends and measurements beyond the control limits, to support Statistical Process Control. The global firm can integrate the data and conduct a data-based tolerance analysis of its system. In deploying such a system, KPA combines knowledge and expertise in industrial statistics, cloud and web technologies, and change management to ensure effective and efficient deployment. This is one example where data drives decisions in process control and engineering design by exploiting new technologies. The system combines web technologies with R applications.
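To illustrate the control-charting logic behind such a service (a minimal sketch with simulated measurements, not the actual SPClive365 implementation, whose internals are not described here), an individuals chart in base R can flag measurements beyond three-sigma control limits as follows.

```r
# Minimal individuals (X) control chart: flag measurements beyond the
# three-sigma control limits estimated from the data. Simulated data only;
# this is an illustrative sketch, not the SPClive365 service itself.
set.seed(123)
x <- c(rnorm(40, mean = 10, sd = 0.2),      # in-control measurements
       rnorm(5,  mean = 10.8, sd = 0.2))    # shifted process at the end

center <- mean(x)
# Estimate short-term sigma from the average moving range (d2 = 1.128 for n = 2)
sigma_hat <- mean(abs(diff(x))) / 1.128
ucl <- center + 3 * sigma_hat
lcl <- center - 3 * sigma_hat

out_of_control <- which(x > ucl | x < lcl)

plot(x, type = "b", ylim = range(c(x, ucl, lcl)),
     xlab = "Measurement number", ylab = "Value",
     main = "Individuals control chart (illustrative)")
abline(h = c(lcl, center, ucl), lty = c(2, 1, 2))
points(out_of_control, x[out_of_control], pch = 19, col = "red")

out_of_control  # indices of measurements beyond the control limits
```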
11. What has been the most exciting development that you have worked on in statistics during your career?
Some of these are covered in the list of papers I referred to earlier. During my work with Prof Sam Karlin at Stanford, and then at the Weizmann Institute, we developed an approach called Structural Exploratory Data Analysis (SEDA) for analysing frequencies of biomarkers in various populations. The analysis combined migration patterns with various vector distances and was, in a sense, a precursor of data mining techniques. This was published in a series of highly cited papers in the American Journal of Human Genetics in the 1980s. Other contributions include: 1) the M-Test, developed with C. Fuchs, for identifying outlying cells in multinomial and contingency tables, published in the Journal of the American Statistical Association; 2) the definition of Conditional Expected Delay (CED) for tracking the performance of sequential procedures and surveillance of processes that cannot be reset, in work with M. Pollak, published in Quality and Reliability Engineering International; 3) the use of bootstrapping to analyse data from designed experiments with missing data (work with D. Steinberg), published in Quality and Reliability Engineering International; 4) the concept of integrated models for analysing customer surveys (work with S. Salini, published in Applied Stochastic Models in Business and Industry); 5) the Bayesian Estimate of the Current Mean (BECM) model, developed with S. Zacks; and 6) the Information Quality (InfoQ) and Practical Statistical Efficiency (PSE) concepts, published in Quality and Reliability Engineering International.
12. What do you think the most important recent developments in the field have been? What do you think will be the most exciting and productive areas of research in statistics during the next few years?
Big data analytics, Bayesian networks and causality models, bootstrapping, dynamic linking in data exploration, and online analytics.
13. What do you see as the greatest challenges facing the profession of statistics in the coming years?
As a profession, we have lost our Unique Selling Points (USPs). Other disciplines are offering tools and methods that are essentially elements of statistics. The big research labs that produced many innovations at the interface of theory and practice are now gone. Academia needs to adapt and move from a focus on developing mathematically oriented tools to methods that are more interdisciplinary in scope. This requires joint work with cognitive scientists, experts in education, computer scientists and economists. Universities are not necessarily an ideal place for doing that.
14. Are there people or events that have been influential in your career?
There are several people on the list. I already mentioned David Cox. We still correspond via email, and his advice and feedback have continued to be helpful. In 2005, as President of ENBIS, it was a privilege to award him the Box Medal, a small recognition for his gigantic contributions. My PhD advisor was the late Sam Karlin, who set the standard for me with his unending curiosity about many application domains and his deep mathematical knowledge. Sam saw in Ronald Fisher the ultimate example of a scientist, one that he tried to emulate. I also mentioned Bill Hunter and George Box, who showed, by example, what working with and for clients from other disciplines means. Then there is Deming. I got my job as director of statistical methods at Tadiran Telecom after I convinced the company's president and CEO to attend a Deming seminar. When he returned, he pushed hard for implementing the Deming principles and gave me the opportunity to gain first-hand experience in a position Deming recommended every company should have.
In 1987 I attended a special workshop Deming organised for statisticians, and I learned from the master what statistical thinking is all about. I would like to mention the many friends made at ENBIS, starting with Shirley Coleman and Dave Stewardson, who set the foundations of ENBIS in the Pro-ENBIS project. Finally, I would like to mention the impact Tony Greenfield has had on my career. Tony is a communicator and an “agent provocateur”. He made the communication of statistical results a professional area of expertise that deserves attention, requires special tools and needs to be properly taught. It has been an eye-opener to learn from him what communicating statistics is all about. In addition, Tony has been able to stir up discussions and force people to consider reality. This is another unique capability of Tony's, one that gives companies and organisations an opportunity to innovate and to engage in dealing with problems. In 2004 Tony received the Bill Hunter Award from the American Society for Quality. The award is presented annually to promote, encourage and acknowledge outstanding contributions to the creative development and application of statistical techniques to problem-solving in the quality field. Thank you, Tony, for your many contributions in these fields.
In summary, I wanted to share with you my journey in meeting the challenge of solving real-world problems with mathematical tools and statistical thinking. On this journey I worked with and met outstanding statisticians. I also had the opportunity to contribute to companies and organisations, which provided a perspective on what real life is about. With time, the gap I perceived between academic research and practical needs grew. Without closing this gap, statistics as a profession faces a bleak future. Some statistics departments are focused on highly theoretical work without any involvement in applications. Other departments are innovating with joint programs and an emphasis on practical projects and interdisciplinary education and research. As Deming said, “survival is not mandatory”. For survival, statistics as a profession needs to provide added value to fellow scientists and to customers in business and industry. To achieve this we need to be better at generating knowledge and creating impact.
Copyright: Image appears courtesy of Professor Kenett