"For me, the fun of working with scientists and engineers is helping them generate information-rich data through designed experiments": An interview with Bradley Jones

Features

  • Author: Ajay Ohri
  • Date: 27 Oct 2015
  • Copyright: Image appears courtesy of SAS

Dr Bradley Jones is the Principal Research Fellow in the JMP division of SAS, where he is responsible for the development of new methods in design of experiments (DOE). He built the JMP Custom Designer, a general and powerful tool for generating optimal experimental designs. He holds a patent on the use of DOE for minimizing registration errors in the manufacture of laminated circuit boards and is the inventor of the prediction profile plot for interactive exploration of response surfaces with multiple inputs and outputs.

Before joining SAS in 1997, he was the principal statistician at The MathWorks Inc., where he designed and implemented the MATLAB Statistics Toolbox. He was Chief Scientist and a founding partner of Catalyst Inc., a company created to support the use of computer-aided DOE in industry.

Jones is the 2012 recipient of the Statistics in Chemistry award from the American Statistical Association (ASA). In both 2009 and 2011, he received the American Society for Quality’s Brumbaugh Award for the paper making the largest contribution to industrial quality control. He also won the 2010 Lloyd S. Nelson Award for the article having the greatest immediate impact on practitioners. Jones is the Editor-in-Chief of the Journal of Quality Technology, a Fellow of the ASA and co-author, with Peter Goos, of the award-winning Optimal Design of Experiments. He holds a master’s degree in statistics from Florida State University and a PhD in applied economic sciences from the University of Antwerp, Belgium.

Ajay Ohri interviews Dr Jones on his career and work at SAS.


1. You are the Principal Research Fellow for JMP. Can you describe to us your current research interests, and what your JMP team does in statistical research and development?

Broadly speaking, my research is in the area of computer-aided design and analysis of experiments. Using computers to generate designed experiments requires efficient optimization algorithms, which has been the focus of several of my research papers.

Design of experiments (DOE) is a research area with multiple specialties. I have worked primarily in optimal design, creating algorithms for generating designs having special features.

One important practical example is the generation of optimal split-plot designs. These are useful in industrial applications because many processes have factors that are hard to change from one run to the next. Holding such factors constant over several runs creates a restriction in randomization, which forces the special split-plot structure.

I am also interested in space-filling designs that are useful in studies of deterministic computer simulations. Ryan Lekivetz, one of my JMP colleagues, and I created a new kind of space-filling design that we call a Fast Flexible Filling design. One advantage of these designs over other options is that they allow for constraints on the factor settings.
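The published Fast Flexible Filling approach builds a design by sampling many random points that satisfy the constraints, clustering them, and using the cluster centers as design points. The sketch below illustrates that idea in simplified form: it uses plain k-means rather than the hierarchical clustering of the published method, and the constraint x1 + x2 ≤ 1 is an invented example, not one from the interview.

```python
import numpy as np

def fast_flexible_filling(n_points, n_samples=5000, seed=0):
    """Simplified sketch of a space-filling design on a constrained region.

    Idea: rejection-sample uniform points inside the constrained
    region, cluster them, and return the cluster centroids as the
    design points.  (The published method uses hierarchical
    clustering; k-means is used here for brevity.)
    """
    rng = np.random.default_rng(seed)
    # Keep only samples satisfying the illustrative constraint x1 + x2 <= 1.
    pts = rng.uniform(0, 1, size=(n_samples, 2))
    pts = pts[pts.sum(axis=1) <= 1.0]

    # Lloyd's algorithm (basic k-means).
    centroids = pts[rng.choice(len(pts), n_points, replace=False)]
    for _ in range(50):
        d = ((pts[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for k in range(n_points):
            members = pts[labels == k]
            if len(members):              # guard against empty clusters
                centroids[k] = members.mean(axis=0)
    return centroids

design = fast_flexible_filling(8)
# Because centroids of points in a convex region remain inside it,
# every design point still satisfies x1 + x2 <= 1.
```

This is why such designs handle constraints naturally: the candidate points are filtered by the constraints before any clustering happens, so the final design can only live in the feasible region.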

Another member of my JMP team is Joseph Morgan, who does research and algorithmic work in covering arrays. These are special purpose designs that are invaluable in testing software systems.
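To make the software-testing connection concrete, here is a small illustrative example (not from the interview): a strength-2 covering array for three two-level factors needs only 4 test runs, yet every pair of settings appears together in some run, so any fault triggered by a two-way interaction is exercised.

```python
from itertools import combinations, product

# A strength-2 (pairwise) covering array: 3 two-level factors, 4 runs.
runs = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

def covers_all_pairs(array):
    """Check that every pair of columns exhibits all four level pairs."""
    n_cols = len(array[0])
    for i, j in combinations(range(n_cols), 2):
        seen = {(row[i], row[j]) for row in array}
        if seen != set(product((0, 1), repeat=2)):
            return False
    return True

print(covers_all_pairs(runs))  # True: 4 runs instead of 2**3 = 8 exhaustive tests
```

The savings grow rapidly with the number of factors, which is what makes covering arrays so valuable for testing configurable software.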

Recently, my colleague Chris Nachtsheim of the Carlson School of Management at the University of Minnesota and I have been working on expanding the applicability of Definitive Screening Designs (DSDs), which we invented in 2011. Originally DSDs could only accommodate continuous factors, but we have written papers that describe how to add two-level categorical factors to a DSD and also how to block a DSD orthogonally.
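A DSD for continuous factors can be built by taking a conference matrix C (zero diagonal, ±1 elsewhere, with mutually orthogonal columns), folding it over, and appending a center run. The sketch below uses one standard 6×6 conference matrix (the specific matrix is an illustrative choice) and verifies two signature DSD properties.

```python
import numpy as np

# A 6x6 conference matrix: zero diagonal, +/-1 off-diagonal, C.T @ C = 5*I.
C = np.array([
    [ 0,  1,  1,  1,  1,  1],
    [ 1,  0,  1, -1, -1,  1],
    [ 1,  1,  0,  1, -1, -1],
    [ 1, -1,  1,  0,  1, -1],
    [ 1, -1, -1,  1,  0,  1],
    [ 1,  1, -1, -1,  1,  0],
])

# Fold-over construction: rows C, -C, plus one center run ->
# 13 runs for 6 three-level continuous factors.
dsd = np.vstack([C, -C, np.zeros((1, 6), dtype=int)])

# Property 1: main effects are mutually orthogonal.
assert np.array_equal(dsd.T @ dsd, 10 * np.eye(6, dtype=int))
# Property 2: main effects are orthogonal to all quadratic effects,
# so estimated linear trends are never biased by pure curvature.
assert np.all(dsd.T @ (dsd ** 2) == 0)
print(dsd.shape)  # (13, 6)
```

The fold-over is what guarantees property 2: for every run there is a mirror-image run, so odd functions of the factors sum to zero against even ones.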

2. Describe your career journey from student to researcher today. What got you fascinated by statistics? How different is statistics education today as compared to your time as a student?

As an undergraduate, I studied chemistry at Caltech. In graduate school, I switched to statistics because, as John Tukey said, “The best thing about being a statistician is that you get to play in everyone’s backyard.” For me, the fun of working with scientists and engineers is helping them generate information-rich data through designed experiments, and providing powerful analytical and graphical tools to help them turn that information into know-how.

Statistics education is very different from the way it was when I was a student. My statistics classes were mostly about proving theorems. Today’s statistical methods are hugely influenced by the power of modern computers. University statistics departments are scrambling to create programs for the next generation of data analysts.

3. You are generally considered one of the gurus of design of experiments as aided by computers. What is your perspective on DOE? How has this view influenced your journey and your contribution to the field?

I started my career as an internal statistical consultant in industry. After five years, I switched from direct consulting to writing statistical software, including software for generating designed experiments, for use by scientists and engineers.

I became interested in computer-aided design of experiments because my personal experience in applying textbook designs in industry was often frustrating. Sometimes, it is impossible to find a textbook design that matches the constraints of a system or process under study. By contrast, a computer-generated design can tailor the runs to the constraints.

For the last quarter of a century, my aim has been to improve the generality of methods and software for DOE so that they can handle more than 90 percent of practical problems in industry. Doing this work required inventing new methodology as well as writing the code to implement it.
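One workhorse behind computer-generated optimal designs is the coordinate-exchange algorithm of Meyer and Nachtsheim. The sketch below is a minimal illustration for a main-effects model, not JMP's implementation: it visits one design coordinate at a time and keeps whichever candidate level most increases the D-criterion, det(X'X). Constraints on the factor space would simply remove candidate levels from consideration.

```python
import numpy as np

def model_matrix(D):
    """Intercept-plus-main-effects model matrix for a design D (runs x factors)."""
    return np.hstack([np.ones((len(D), 1)), D])

def coordinate_exchange(n_runs=8, n_factors=3, n_passes=10, seed=1):
    """Minimal coordinate-exchange search for a D-optimal design.

    Greedy: for each (run, factor) coordinate, try every candidate
    level and keep the one giving the largest det(X'X).
    """
    rng = np.random.default_rng(seed)
    levels = (-1.0, 0.0, 1.0)
    D = rng.choice(levels, size=(n_runs, n_factors))
    best = np.linalg.det(model_matrix(D).T @ model_matrix(D))
    for _ in range(n_passes):
        for i in range(n_runs):
            for j in range(n_factors):
                for lv in levels:
                    old = D[i, j]
                    D[i, j] = lv
                    X = model_matrix(D)
                    d = np.linalg.det(X.T @ X)
                    if d > best:
                        best = d          # keep the improving level
                    else:
                        D[i, j] = old     # otherwise revert
    return D, best

D, det_val = coordinate_exchange()
```

For this toy problem (8 runs, 3 factors, main effects) the Hadamard bound caps det(X'X) at 8**4 = 4096, attained by the full two-level factorial; the greedy search typically drives the design toward that optimum.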

4. How can computer-aided DOE be used for newer areas like the Internet of Things?

A company can equip its products with the capability to receive and send data through the Internet. By making small designed changes in the operating parameters of these products, the company could make performance improvements on individual units while they are in the field.
 
5. I took DOE as a grad school student at the University of Tennessee and found it quite tough compared to, say, data mining and regression (I audited). Even my fellow students felt that data mining could lead to better job prospects. Do you think DOE is underutilized as a statistical technique by industry or by certain sectors? What can help with educating more students interested in design of experiments, so they can help execute it better when they are in leading positions in industry?

DOE is potentially the most cost-beneficial statistical technique, but only if it is used. Unfortunately, engineers do not usually learn about DOE in school, and very few companies apply DOE as a standard operating procedure. So, yes, DOE is underutilized by industry. Typically, DOE is taught by rote using pre-packaged designs. This makes it hard for an engineer to see the practical applicability of DOE. In addition, most DOE texts devote the bulk of their pages to analysis rather than to the core principles of design. Students do not learn how to evaluate and compare prospective designs for their appropriateness to a specific problem. The textbooks (and professors) need to catch up with the software.

6. What other statistical techniques do you like? Why do you like them?

There are a number of new model selection techniques like the LASSO and its extensions that I think are promising. False discovery rate controlling procedures are important for avoiding the embarrassment of studies that fail to replicate their findings.
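The best-known FDR-controlling procedure is the Benjamini-Hochberg step-up method, sketched below (the example p-values are invented for illustration): sort the m p-values, find the largest rank k with p(k) ≤ (k/m)·q, and reject every hypothesis with a p-value at or below p(k).

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure.

    Controls the expected proportion of false discoveries at level q
    (for independent or positively dependent tests).  Returns a
    boolean rejection mask in the original order of `pvals`.
    """
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresholds = (np.arange(1, m + 1) / m) * q   # (k/m) * q for k = 1..m
    below = p[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])          # largest passing rank
        reject[order[: k + 1]] = True             # reject everything up to p_(k)
    return reject

# "Step-up" behaviour: 0.04 alone would fail a Bonferroni cut of
# 0.05/2 = 0.025, but is rescued because the smaller p-value passes.
print(benjamini_hochberg([0.04, 0.01], q=0.05))  # [ True  True]
```

This step-up behaviour is exactly what gives the procedure more power than family-wise corrections while still keeping the false discovery rate under control.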

7. How can we train statisticians in computer science, and how can we train computer scientists robustly in statistics, given that they are different departments at universities and industry needs cross-domain thinking?

I think that the necessary cross-domain collaboration is happening already. Many universities are creating new programs in analytics that couple statistics with computer science. This is in response to the demand for trained people who can handle “Big Data.”
 
8. Name some cool case studies where you’ve seen your research in action and the impacts they created.

My favourite case study is about Novomer, a green energy company. Using a series of Definitive Screening Designs, they have developed a catalytic reaction that converts carbon dioxide in smoke stacks into plastic. The VP of Catalyst Development at Novomer, Scott Allen, and I shared the 2012 Statistics in Chemistry Award of the American Statistical Association for that project.

9. How important is work-life balance for a long sustained career life for a researcher or statistician? What do you do to relax? How does SAS as an employer help with that balance?


Spending many hours a day in front of a computer screen is a fact of life for me. I really appreciate the fact that SAS maintains a recreation and fitness center for its staff. I go there every day to get aerobic exercise. I find that some of my best ideas occur to me while I am using an elliptical machine.


Address:

This website is provided by John Wiley & Sons Limited, The Atrium, Southern Gate, Chichester, West Sussex PO19 8SQ (Company No: 00641132, VAT No: 376766987)

Published features on StatisticsViews.com are checked for statistical accuracy by a panel from the European Network for Business and Industrial Statistics (ENBIS), to whom Wiley and StatisticsViews.com express their gratitude. The panel members are: Ron Kenett, David Steinberg, Shirley Coleman, Irena Ograjenšek, Fabrizio Ruggeri, Rainer Göb, Philippe Castagliola, Xavier Tort-Martorell, Bart De Ketelaere, Antonio Pievatolo, Martina Vandebroek, Lance Mitchell, Gilbert Saporta, Helmut Waldl and Stelios Psarakis.