The dataset contains the post-processed results of simulations of the 2010 Deepwater Horizon oil spill at the Macondo well in the Gulf of Mexico, produced with the latest updated version of the oil application of the Connectivity Modeling System (CMS), or oil-CMS. In this version, the specified hydrocarbon pseudo-components are carried within the same droplet. The post-processing analysis assumed a simplified case with a uniform droplet size distribution at the release time; the analysis yielded 4-D spatiotemporal data of oil concentrations on a regular horizontal and vertical grid, as well as the time evolution of the horizontally-cumulative oil mass, all for the 167-day simulation period. CMS has a Lagrangian, particle-tracking framework that computes particle evolution and transport in the ocean interior. The CMS simulation started on April 20, 2010, 0000 UTC, and particles were tracked for 167 days. Oil particles were released at 28.736N, 88.365W, at a depth of 1222m, i.e., 300m above the oil well. 3000 particles were released every 2 hours for 87 days, for a total of 3,132,000 oil particles released during the simulation. Initial particle sizes were assigned at random by the CMS in the range of 1-500 micron. Each particle contained three (3) pseudo-components accounting for the differential oil density as follows: 10% light oil with a density of 800 kg/m^3, 75% intermediate oil with a density of 840 kg/m^3, and 15% heavy oil with a density of 950 kg/m^3. The half-life decay rates of the oil fractions were 30, 40, and 180 days, respectively. Ocean hydrodynamic forcing for the CMS model was taken from the HYbrid Coordinate Ocean Model (HYCOM) for the Gulf of Mexico region, on a 0.04-deg horizontal grid with 40 vertical levels from the surface to 5500m; it provided daily-averaged 3-D momentum, temperature, and salinity forcing fields to the CMS model.
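The oil module of the CMS is not open source, so the pseudo-component scheme described above can only be illustrated, not reproduced exactly. The following is a minimal Python sketch, assuming first-order (half-life) decay applied independently to each of the three pseudo-components; the function names and the mass-weighted density formula are choices made for this sketch, not part of the dataset.

```python
import math

# Pseudo-component mix from the dataset description:
# mass fraction, density (kg/m^3), and half-life (days) per component.
COMPONENTS = [
    {"fraction": 0.10, "density": 800.0, "half_life_days": 30.0},   # light oil
    {"fraction": 0.75, "density": 840.0, "half_life_days": 40.0},   # intermediate oil
    {"fraction": 0.15, "density": 950.0, "half_life_days": 180.0},  # heavy oil
]

def remaining_mass_fraction(t_days: float) -> float:
    """Fraction of the initial droplet mass remaining after t days,
    assuming independent first-order decay of each pseudo-component."""
    return sum(c["fraction"] * 0.5 ** (t_days / c["half_life_days"])
               for c in COMPONENTS)

def bulk_density(t_days: float) -> float:
    """Mass-weighted droplet density after t days: total remaining mass
    divided by total remaining volume of the three components."""
    masses = [c["fraction"] * 0.5 ** (t_days / c["half_life_days"])
              for c in COMPONENTS]
    volumes = [m / c["density"] for m, c in zip(masses, COMPONENTS)]
    return sum(masses) / sum(volumes)
```

Because the lightest fractions decay fastest under these half-lives, the bulk droplet density drifts upward over the simulation, which is consistent with the differential-density design of the pseudo-component scheme.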
The transport and evolution of the oil particles were tracked by the oil-CMS model during the 167 days of the simulation, recording each particle's horizontal position, depth, diameter, and density in the model output every 2 hours. The model data need to be post-processed to obtain oil concentration estimates. The post-processing algorithm took into account the total amount of oil spilled during the 87-day incident as estimated from the reports (730,000 tons), and the assumptions about the oil particle size distribution at the time of release as estimated in prior studies. The current dataset assumes the simplified case of a uniform droplet size distribution across the range of 1-500 micron. The oil concentration data are daily average values in ppb units. The horizontal 0.01-degree grid covers the Gulf of Mexico (25N-30.75N, 84W-93W), and the vertical grid extends from the surface to a depth of 2400m in 20m increments. Daily oil concentrations are also estimated for the following vertical layers: 0-1m, 1-20m, 20-50m, 50-200m, and 200-2400m; separate files are provided for the 0-1m layer and for the 0-20m layer. The oil mass data are horizontally-cumulative values in kg of crude oil, distributed in the water column on a vertical grid from the surface down to 2400m in 20m increments, estimated bi-hourly to match the oil-CMS model output interval. The post-processed NetCDF files were created with the Matlab software package, v. R2014b, using compression to keep file sizes small; maximum compression ('DeflateLevel' = 9) was used in most of the files. Numerical simulations and post-processing were performed on the Pegasus supercomputer at the Center for Computational Science, University of Miami, during 2016-2017.
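As a rough illustration of the unit conversions behind the concentration fields, the sketch below (in Python rather than the Matlab actually used) derives the per-particle oil mass and a mass-based ppb value for a single grid box. The seawater density and meters-per-degree constants are nominal assumptions for this sketch, not values stated in the dataset description.

```python
import math

TOTAL_OIL_KG = 730_000 * 1000              # 730,000 metric tons released (reported estimate)
N_PARTICLES = 3_132_000                    # total particles released over 87 days
KG_PER_PARTICLE = TOTAL_OIL_KG / N_PARTICLES  # ~233 kg of oil per particle

SEAWATER_DENSITY = 1025.0                  # kg/m^3 (assumed nominal value)
METERS_PER_DEGREE = 111_320.0              # approximate meridional scale (assumed)

def grid_box_volume_m3(lat_deg: float, dlat: float = 0.01,
                       dlon: float = 0.01, dz: float = 20.0) -> float:
    """Approximate volume of one 0.01-deg x 0.01-deg x 20-m grid box,
    shrinking the zonal extent by cos(latitude)."""
    dy = dlat * METERS_PER_DEGREE
    dx = dlon * METERS_PER_DEGREE * math.cos(math.radians(lat_deg))
    return dx * dy * dz

def concentration_ppb(oil_mass_kg: float, lat_deg: float) -> float:
    """Oil concentration by mass (parts per billion) within one grid box."""
    water_mass_kg = grid_box_volume_m3(lat_deg) * SEAWATER_DENSITY
    return oil_mass_kg / water_mass_kg * 1e9
```

For example, one full particle's mass (~233 kg) confined to a single 0.01-deg x 20-m box near the release latitude corresponds to a concentration on the order of 10 ppb under these assumptions.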


Supplemental Information

Time (seconds or days), Latitude (degrees north), Longitude (degrees east), Depth (m), Oil concentrations (ppb), Horizontally-cumulative oil mass (kg) |The output from the updated oil-CMS model contains the 3-D location of the oil particles together with their size and density. It is further processed to obtain the oil concentrations (or oil mass) in time and space. The information used for that purpose is the following: i) the total oil released during the case study, estimated at 730,000 tons of crude oil; ii) the total number of particles released, 3,132,000 particles; iii) the adopted droplet size distribution (DSD) at the time of release, for untreated oil with no dispersants added. Combining i) and ii) yields approximately 233 kg of oil represented by each released particle, under the assumption of a uniform distribution over 1-500 micron. The post-processing method for computing the daily average oil concentrations includes the following steps: 1) estimate the mass of oil represented by each oil particle (i.e., 233 kg of oil); 2) compute the scaling factor for each particle at the initial time, depending on particle mass; 3) multiply the evolving particle mass by the initial scaling factor unique to each particle; 4) sum the mass of all the particles found in a given 3-D grid box at a given time (2-h intervals); 5) determine the oil concentration from the mass of oil in the volume of an ocean grid box, taking into account the bathymetry for coastal areas; 6) compute daily average oil concentrations from the 2-h concentration estimates.|1. Numerical model used: updated versions of the open-source Connectivity Modeling System (CMS v.2.0) and of the oil module (not open-source), where the multi-fraction algorithm represents all the hydrocarbon compounds within a single droplet. 2. The Matlab software package, version R2014b, was used for post-processing analysis and data preparation. 3. 
All numerical work has been done using the Pegasus supercomputer at the Center for Computational Science, University of Miami. 4. The dataset files are in NetCDF format, with the *.nc extension; the headers of the *.nc files are also duplicated in text files with the corresponding names and *.txt extension. All the files have been archived in a *.zip file.|Time scale for the oil concentrations: daily averages. Horizontal grid: 0.01-degree spacing. Vertical grid: 0-2400m with 20-m increments. Vertical grid alternative: vertical layers as specified by layer boundaries. Time scale for horizontally-cumulative oil mass: 2-h intervals.||Reference for the CMS model: Paris, C.B., J. Helgers, E. van Sebille, and A. Srinivasan, 2013: Connectivity Modeling System: A probabilistic modeling tool for the multi-scale tracking of biotic and abiotic variability in the ocean. Environmental Modelling & Software, 42, 47-54. Reference for the previous version of the oil-CMS model: Paris, C.B., M. Le Hénaff, Z.M. Aman, A. Subramaniam, J. Helgers, D.-P. Wang, V.H. Kourafalou, and A. Srinivasan, 2012: Evolution of the Macondo Well Blowout: Simulating the Effects of the Circulation and Synthetic Dispersants on the Subsea Oil Transport. Environ. Sci. Technol., 46, 13293-13302. http://dx.doi.org/10.1021/es303197h
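The six post-processing steps described above can be sketched as follows. This is an illustrative Python re-implementation (the actual analysis was done in Matlab), using a hypothetical particle-record layout of (time index, latitude, longitude, depth, remaining mass fraction); step 5, the division by grid-box volume with bathymetry corrections, is omitted for brevity.

```python
from collections import defaultdict

KG_PER_PARTICLE = 233.0    # step 1: oil mass represented by each particle
LAT0, LON0 = 25.0, -93.0   # grid origin (25N, 93W), per the dataset description
DLL, DZ = 0.01, 20.0       # 0.01-deg horizontal and 20-m vertical spacing
SNAPSHOTS_PER_DAY = 12     # 2-h model output -> 12 snapshots per day

def bin_daily_mass(records):
    """Accumulate scaled particle masses per (day, i, j, k) grid box
    (steps 2-4), then average the 2-h snapshots into daily means (step 6).
    Each record is (time_index, lat, lon, depth_m, mass_fraction); the
    record layout is a hypothetical choice for this sketch."""
    acc = defaultdict(float)
    for t, lat, lon, depth, mass_frac in records:
        i = int((lat - LAT0) / DLL)      # latitude bin
        j = int((lon - LON0) / DLL)      # longitude bin
        k = int(depth / DZ)              # depth bin
        day = t // SNAPSHOTS_PER_DAY
        # Steps 2-3: with a uniform droplet size distribution the
        # per-particle scaling factor collapses to a single constant.
        acc[(day, i, j, k)] += mass_frac * KG_PER_PARTICLE
    # Step 6: daily mean over the 12 bi-hourly snapshots.
    return {key: mass / SNAPSHOTS_PER_DAY for key, mass in acc.items()}
```

Dividing each daily-mean mass by the corresponding grid-box water volume (step 5) would then yield the ppb concentration fields stored in the NetCDF files.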


This dataset provides state-of-the-art modeling data from far-field hind-cast simulations of the 2010 Deepwater Horizon oil spill, for the simplified case of a uniform oil droplet size distribution at the time of release. It is intended for comparative analysis with the "control" case of untreated oil, with observations, with other validation techniques, and with other modeling efforts.


oil modeling, Deepwater Horizon 2010 oil spill modeling, spatial oil distribution, deep sea oil distribution, Connectivity Modeling System, daily oil concentrations, subsurface hydrocarbon transport, far-field oil transport modeling, Macondo Well




July 2017

Point of Contact


Claire B. Paris-Limouzy


University of Miami / Rosenstiel School of Marine and Atmospheric Science

Funding Source




Rights Information

Creative Commons License
This work is licensed under a Creative Commons Public Domain Dedication 1.0 License.