Big Data, Atmospheric Dispersion Statistics and the Truth About Fukushima Daiichi

  • Author: Lillian Pierson
  • Date: 04 Dec 2013

Any recent Google News search on “Fukushima Daiichi” indicates that things are as scary as ever. In late October, Typhoon Francisco brought more than a foot of rain to Japan, adding to the volume and flow of groundwater being contaminated by radioactive leaks at the facility. On November 3rd, 2013, the Fukushima prefecture was rocked by a 5.0 magnitude earthquake. While the Japanese have finally agreed to allow the international community to help decommission the plant, the operation has been touted as the ‘most dangerous nuclear clean-up in history’. The clean-up is expected to take decades to complete and will involve removing more than 1,300 spent nuclear fuel rods (over 400 tons in total) currently stored in Reactor No. 4. The quantity of radiation at risk of release is 14,000 times that released at Hiroshima. Because the fuel rods being removed are already damaged, each rod must be isolated, submerged, and removed manually to avoid air exposure, explosion, and atmospheric dispersion. The slightest error, or an earthquake or tsunami during the clean-up, could cause an explosion and a radioactive release greater than that of Chernobyl or Hiroshima. In short, some two years after the 2011 earthquake that initially damaged the power plant, the Fukushima nightmare lingers on.

It sounds like utter doomsday and despair, but there is hope yet. In a recent interview, Dr. Andreas Stohl spoke frankly about his work modeling the atmospheric dispersion of the radioactive materials released at Fukushima, about his views on culpability for the disaster, and about the environmental and human health risks associated with Fukushima at this time. Dr. Stohl is a senior scientist at the Norwegian Institute for Air Research (NILU) and he led the most comprehensive research study ever completed on the atmospheric release of radioactive material from the 2011 Fukushima accident.


Using Sensor Data and FORTRAN Dispersion Models to Understand the Air

Dr. Stohl and his colleagues used FLEXPART dispersion modeling and an inversion algorithm to forensically determine how much radioactive material was released as a result of the damage to the Fukushima Daiichi nuclear plant during Japan’s 2011 earthquake. There are two main types of air pollution dispersion models: Eulerian grid-point models and Lagrangian models. FLEXPART is a Lagrangian particle dispersion model, originally developed by Dr. Stohl to model the long-range dispersion of radioactive substances in the event of a nuclear power emergency. A Lagrangian model is distinctive in that it has no fixed grid system; instead, it follows the motion of computational particles, which can represent either real particulates or gases. It moves the particles with the large-scale winds taken from meteorological analysis or forecast data and adds a stochastic description of the motions that are unresolved in the meteorological input. As the particles disperse, and within a moving frame of reference, the model calculates statistically probable trajectories based on particle motion. FLEXPART uses the Langevin equation, driven by a Wiener process, to model particle dispersion under turbulent conditions. It was designed to handle particle dispersion from local scales of just a few kilometers all the way up to global scales in the case of world-wide contaminant dispersion.
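To make the Lagrangian idea concrete, here is a minimal toy sketch, written in Python rather than FLEXPART’s FORTRAN and with entirely invented wind, turbulence, and time-step values. It advects a cloud of computational particles with a fixed mean wind while a discretized Langevin equation, driven by Wiener-process increments, supplies the unresolved turbulent velocities; the real model does this with full meteorological fields and a far more detailed turbulence scheme.

```python
import numpy as np

# Toy 2-D Lagrangian particle step: resolved mean wind + Langevin turbulence.
# All numbers (wind, sigma, tau, dt) are illustrative, not FLEXPART values.
rng = np.random.default_rng(0)

n_particles = 10_000
dt = 60.0                          # time step [s]
tau = 300.0                        # Lagrangian time scale [s]
sigma = 0.5                        # std. dev. of turbulent velocity [m/s]
mean_wind = np.array([8.0, 2.0])   # resolved wind from met. data [m/s]

x = np.zeros((n_particles, 2))     # particle positions [m]
u_turb = np.zeros((n_particles, 2))  # turbulent velocity component [m/s]

def step(x, u_turb):
    """Advance all particles one time step with a Langevin velocity update."""
    # Euler-Maruyama step of an Ornstein-Uhlenbeck (Langevin) process:
    # memory term plus a Wiener-process increment dW ~ N(0, sqrt(dt)).
    dW = rng.normal(0.0, np.sqrt(dt), size=u_turb.shape)
    u_turb = u_turb * (1.0 - dt / tau) + sigma * np.sqrt(2.0 / tau) * dW
    # Advect with the total (resolved + turbulent) velocity.
    x = x + (mean_wind + u_turb) * dt
    return x, u_turb

for _ in range(240):               # simulate 4 hours
    x, u_turb = step(x, u_turb)

print("mean downwind distance [km]:", x[:, 0].mean() / 1e3)
print("plume spread (std) [km]:", x[:, 1].std() / 1e3)
```

Averaging over many such particle trajectories is what turns the stochastic motions into the statistically probable concentration fields described above.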

Both the FLEXPART model and the inversion algorithm are written in FORTRAN, an imperative programming language well suited to numerically intensive scientific, engineering, and mathematical computing. In his modeling analysis, Dr. Stohl used sensor data that is continuously collected on a global scale by the United Nations as part of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) monitoring network. The large number of FLEXPART model calculations and of unknowns in the inversion process, as well as the large number of observations, made FORTRAN the practical choice for processing the data, rather than lighter-weight tools such as Hadoop, Python, R, or MATLAB.

The FLEXPART component of the analysis simulates the transport and spread of radioactive materials released at different times during the accident and at different release heights (e.g., during explosions). The inversion algorithm is then used to retroactively determine emission rates from the modeled transport patterns and the quantities of radioactive material detected at the CTBTO monitoring stations. The inverse model uses LAPACK, the linear algebra library for FORTRAN, to perform the required matrix computations and statistics quickly on the large volumes of model output and station time-series data.
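One way to picture the core of such an inversion is as a regularized least-squares problem: the dispersion runs provide a source-receptor matrix that maps unknown, time-resolved emission rates onto the concentrations seen at the stations, and the emissions are chosen to best reproduce the observations. The sketch below is only a toy illustration of that idea, in Python/NumPy (whose least-squares routine itself calls LAPACK); the matrix, observations, and regularization strength are synthetic stand-ins, not the published algorithm, FLEXPART output, or CTBTO data.

```python
import numpy as np

# Toy source-term inversion: solve M @ q ~= y for emission rates q, where
#   M[i, j] = modelled concentration at observation i per unit emission in
#             release interval j (a source-receptor matrix from dispersion runs),
#   y[i]    = measured concentration at a monitoring station.
# All values below are synthetic placeholders.
rng = np.random.default_rng(1)

n_obs, n_intervals = 500, 40
M = rng.exponential(scale=1e-12, size=(n_obs, n_intervals))  # [(Bq/m^3) / (Bq/s)]
true_q = rng.uniform(1e12, 1e14, size=n_intervals)           # "true" emissions [Bq/s]
y = M @ true_q * (1.0 + 0.1 * rng.normal(size=n_obs))        # noisy observations

# Tikhonov-regularized least squares: penalize wildly varying emission rates.
lam = 1e-13
A = np.vstack([M, lam * np.eye(n_intervals)])
b = np.concatenate([y, np.zeros(n_intervals)])

# numpy.linalg.lstsq dispatches to LAPACK (dgelsd) under the hood.
q_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
q_hat = np.clip(q_hat, 0.0, None)   # emission rates cannot be negative

print("estimated total release [Bq]:", q_hat.sum() * 3600.0)  # assuming 1-h intervals
```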

Who is to Blame for the Fukushima Daiichi Catastrophe?

While Stohl’s study determined that the initial radioactive release from Fukushima was much greater than the Japanese government publicly claimed, Dr. Stohl has stated: “I don’t think there was lying by the Japanese. I think they themselves did not know what was happening in the plant. Some things they still don’t know.” Despite complaints that much of the radioactive release could have been avoided by more responsible post-disaster operations on the part of the Japanese, Dr. Stohl counters that “I am not convinced that other people would have done it better. There is no real good way to handle the situation.”

Instead of placing blame on the Japanese government, Dr. Stohl believes that the risk of releasing radioactive material into the environment simply goes with the territory: if humans are going to build and run nuclear power plants, then catastrophes that endanger human health and the environment will continue to occur. “You can make them safer, but when you add more (nuclear) power plants, I am wondering how much ‘safer’ really compensates. You could not really anticipate the unexpected. We were lucky so far with Chernobyl and Fukushima being far from humans. Just imagine if it was New York.”

Current Risks Associated with Fukushima Daiichi

Dr. Stohl spoke extensively about the long-range and short-range implications of the ongoing problems at Fukushima Daiichi. The greatest risk to people living far away from Fukushima is the atmospheric dispersion of radioactive materials, and the good news is that this risk is short-lived once radioactive material has been exposed to the air. For instance, when tropical storms pass over Fukushima today, there is no immediate danger of an atmospheric release because, despite all the existing problems, radioactive materials are not exposed to the atmosphere at this time. That would only change in the event of some structural collapse or accident that left nuclear materials directly exposed to the air. Although monitoring stations world-wide have shown elevated levels of cesium in the Northern Hemisphere, this reflects the extraordinary sensitivity of the sensor network: when elevated cesium counts are read against an extremely low background, they appear more alarming than they should. According to Stohl, none of the stations outside Japan has recorded a cesium concentration anywhere near high enough to harm human health, and there has been no excess mortality outside Japan related to the atmospheric dispersion of radioactive materials from Fukushima. In fact, except for the interval immediately following the 2011 earthquake, places as close as Tokyo have not even been at risk of atmospheric exposure to radiation. (Tokyo was lucky to escape a radiation-rich precipitation event that could have occurred back in 2011.)

Currently, the only real and ongoing environmental damage is to the local environment near Fukushima Daiichi. The only route to additional atmospheric dispersion of radioactive materials would be another accident, structural collapse, or natural disaster that exposes radioactive materials to the atmosphere. While that could happen, the damage currently being caused by Fukushima is confined to the soil and water near the site, and the land and sea around Fukushima will remain contaminated for hundreds of years. As distressing as it is to think of radioactive material spilling into the sea, the sheer volume of the ocean tends to mitigate long-range risks to human health: while dilution is, of course, not the solution to pollution, dilution and natural attenuation will help to reduce the concentrations of the radioactive materials being spilled into the ocean. According to Dr. Stohl, with respect to the environmental damage being caused by the problems at Fukushima, “what’s required now are engineers and people knowledgeable about soil and water.”

Sources

Stohl, A., P. Seibert, G. Wotawa, D. Arnold, J. F. Burkhart, S. Eckhardt, C. Tapia, A. Vargas, and T. J. Yasunari (2012): Xenon-133 and caesium-137 releases into the atmosphere from the Fukushima Dai-ichi nuclear power plant: determination of the source term, atmospheric dispersion, and deposition. Atmos. Chem. Phys. 12, 2313-2343, doi:10.5194/acp-12-2313-2012.

Stohl, A., P. Seibert, and G. Wotawa (2012): The total release of xenon-133 from the Fukushima Dai-ichi nuclear power plant accident. J. Environ. Radioact. 112, 155-159, doi:10.1016/j.jenvrad.2012.06.001.

Bio

Lillian Pierson is an environmental engineer who specializes in spatial data science, data visualization, digital humanitarian response, and journalism. Her core focus is on building data-driven solutions to environmental health and development problems in less-developed countries. Her portfolio blog is at LillianPierson.com. She can also be found on Twitter (@LillianPierson) and on Google+ as Lillian Pierson.
