Monday, February 28, 2011

The Curious Case of US Oil Production

I have lately been updating my oil production numbers and plots, first Oklahoma, Texas, and world (OTW) production, then Saudi Arabia. In the OTW blog entry I mentioned that I was still working on a US production update.

Digging through my thousands of files, I was able to relocate the data for US oil production from 1857 to 2006. This data had been carefully constructed from various sources and I wrote a Leading Edge Seismos column using it in May 2008. Here is the daily production figure from that column:

Figure 1. Daily US oil production plot from 2008 Seismos column.

Since I finally found the data behind the plot, it seemed a simple matter to add the last 4 years of production and format the plot like the others I had posted on this blog.

Naturally, the first place to turn in such matters is the venerable BP Statistical Review. But something was immediately confusing to me. My data in Fig 1 shows US production (including offshore and Alaska) averaging 5.14 MMBO/D in its final year, while the BP Review showed 6.84 MMBO/D for the same year, a whopping 33% difference. Confused, I then checked the Energy Information Administration and found values similar to BP's.

Was I going crazy? I'm not getting any younger, you know. Maybe this was a case of some mild stroke that left me in a confused world. But no, the Seismos column was there as evidence. So maybe I just used bad data back then. But no, Wikipedia shows the same thing, and I read a recent paper (Nashawi et al., 2010) containing a daily US production plot (their fig. 15) that syncs perfectly with mine. Here it is:

Figure 2. US production from Nashawi et al. (2010)

Well, research is what I do, so the chase was on. The result is summarized in Fig 3. The data come from my earlier research (green dots), the BP Statistical Review 2010 (BP, red dots), the Energy Information Administration (EIA, blue dots), and one data point each from the International Energy Agency (IEA, white triangle) and the American Petroleum Institute (API, white square). The IEA number (244,840 thousand tonnes per year) needs a bit of conversion. Wolfram|Alpha shows this represents 1.68 x 10^9 barrels of oil, or 4.603 million barrels per day.
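
The conversion is easy to reproduce. Here is a minimal sketch in Python (not Wolfram|Alpha); the barrels-per-tonne factor below is back-calculated from the quoted figures and is an assumption, since the exact factor depends on crude density (a commonly cited average for crude is about 7.33 bbl/tonne):

```python
# Convert the IEA figure of 244,840 thousand tonnes of crude per year
# into barrels per day. The 6.85 bbl/tonne factor is an assumed value
# back-solved from the numbers quoted above, not an IEA constant.
TONNES_PER_YEAR = 244_840e3   # 244,840 thousand tonnes
BBL_PER_TONNE = 6.85          # assumed conversion factor

barrels_per_year = TONNES_PER_YEAR * BBL_PER_TONNE
barrels_per_day = barrels_per_year / 365

print(f"{barrels_per_year:.3g} bbl/yr")      # ~1.68e9
print(f"{barrels_per_day/1e6:.2f} MMBO/D")   # ~4.59
```

Whatever the precise factor, the answer lands near 4.6 MMBO/D, far below the BP/EIA numbers.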

Clearly, I would be curious to see a full series of production numbers from IEA or API, but I have been unable to determine whether these groups keep historical data on the internet.

Figure 3. The mystery in a nutshell.  My green trend is supported by 2008 and 2010 numbers from API and IEA.  Meanwhile, BP and EIA are about 30% higher.  What the heck is going on?

Figure 4. Detail of Fig 3 from 1960-2010

At this point I have no resolution to the mystery; I just wanted to lay it out for all to see. It is hard to imagine this discrepancy is due to definitions of crude oil (e.g., is shale oil included? condensate? heavy oil?). Any of these would be in the accounting noise, not a 30% difference.

Comments are welcome.


Liner, C., 2008, To peak or not to peak, The Leading Edge 27, 610.

Nashawi, I.S., Malallah, A., and Al-Bisharah, M., 2010, Forecasting world crude oil production using multicyclic Hubbert model, Energy & Fuels, 24, 1788-1800.

Note: Along these same lines, North Dakota is producing more oil than ever.

Saturday, February 19, 2011

What is the SEG DISC?

As mentioned before in this blog (here, and here), I have been honored by selection as the 2012 Society of Exploration Geophysicists DISC presenter. My department chairman wanted a newsletter blurb. Here goes...

What is the DISC?

The SEG Distinguished Instructor Short Course (DISC) is an eight-hour, one-day short course on a topic of current and widespread interest. Sponsored by the SEG, it is presented at 30 locations around the world. Established in 1998, the DISC has attracted over 25,000 participants in its 14-year history.

Selection as the DISC instructor is viewed as a major honor and recognition of excellence by the SEG. The instructor is a prominent geophysicist whose work and presentation appeal to a wide audience ranging from students to mature professionals.

The DISC is recorded each year and developed into a multimedia presentation called the DigitalDISC, which features the video of the instructor, transcript of the lecture, and presentation slides. This DigitalDISC is distributed as a service to student sections, and is available for purchase from the SEG Book Mart.

The 2012 DISC, presented by Christopher Liner, is on the topic of Seismic Dispersion. Toward the end of 2012, it will be available in ebook form or print edition.

Friday, February 18, 2011

10000 Visitors

Today the Seismos blog passed ten thousand visitors, a mere 1 year and 17 days after passing the one thousand mark. That is a tenfold increase in just about a year, and the trend is accelerating, as can be seen in the plot below. This was generated in Excel using data from the StatCounter I have on the blog.

The blog started 4 October 2008, but the stat counter only has data from 1 Sept 2009.

I have added a trend line to the Page Loads curve, and the best fit I could get used a 4th-order polynomial (see equation). Extrapolating this trend predicts the 20k mark will be reached about 670 days after 1 Sept 2009, which works out to 3 July 2011. The next tenfold increase, to 100k, should come in Feb or March 2012.
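
The same trend-and-extrapolate exercise can be sketched with NumPy in place of Excel. The cumulative page-load counts below are made-up illustrative numbers, not the actual StatCounter data:

```python
# Fit a 4th-order polynomial to cumulative page loads vs. days elapsed,
# then extrapolate to find when the trend crosses 20,000. The data here
# are illustrative stand-ins, not the real blog statistics.
import numpy as np

days = np.arange(0, 501, 100, dtype=float)                  # days since start
loads = np.array([0., 400., 1600., 3600., 6400., 10000.])   # cumulative loads

# Days are scaled by 100 to keep the polynomial fit well conditioned.
coeffs = np.polyfit(days / 100.0, loads, deg=4)
trend = np.poly1d(coeffs)

# First future day on which the extrapolated trend passes 20,000.
future = np.arange(500, 1200)
day_20k = future[trend(future / 100.0) >= 20000][0]
print("20k mark predicted on day", int(day_20k))
```

One caveat worth keeping in mind: high-order polynomial extrapolations can diverge quickly beyond the fitted range, so the predicted date is quite sensitive to the chosen order.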

Thursday, February 10, 2011

Saudi Arabia Oil Production Update: 2009

Following an earlier post with updated Oklahoma, Texas, and world oil production plots, I was motivated to review and revise my Saudi Arabia production numbers. KSA (Kingdom of Saudi Arabia) production began in 1936 and my plots start there, but I am unable to identify the source of my early numbers (1936-1964). Nothing sinister; my computer files are growing at about 38 GB/yr and I can't find the original data source. For 1965-2009 I use the BP Statistical Review.

I must admit this was somewhat motivated by this recent story.  I have done my own estimates and may report that sometime.

For now, let's stick to the production data.  I was in KSA from 2005 through 2007 and note those years included the daily production peak (so far) and two years of decline.  As soon as I left, production went up.  Fascinating.

Tuesday, February 8, 2011

Dickman Project Overview

The goals of this DOE-funded project were to develop innovative 3D seismic attribute technologies and workflows to assess the structural integrity and heterogeneity of subsurface reservoirs with potential for CO2 sequestration. Our specific objectives were to apply advanced seismic attributes to aid in quantifying reservoir properties and lateral continuity of CO2 sequestration targets.

Our study area (Figure 1) is the Dickman field in Ness County, Kansas, a type locality for the geology that will be encountered for CO2 sequestration projects from northern Oklahoma across the U.S. mid-continent. Since its discovery in 1962, the Dickman Field has produced about 1.7 million barrels of oil from porous Mississippian carbonates (Figures 2 and 3) and basal Pennsylvanian sandstone (Figure 4) that locally develops on a regional Miss-Penn unconformity surface. The Dickman field includes a small structural closure at about 4400 ft drilling depth. Project data include 3.3 square miles of 3D seismic data and 142 wells, with logs, some core, and oil/water production data available. Only two wells penetrate the deep saline aquifer. Geological and seismic data were integrated to create a geological property model and a flow simulation grid.

We systematically tested over a dozen seismic attributes, finding that curvature, SPICE, and ANT were particularly useful for mapping discontinuities in the data that likely indicated fracture trends. Recently we have been studying spectral decomposition as a way to detect additional channel details and fracture trends.

Our simulation results in the deep saline aquifer indicate two effective ways of reducing free CO2: a) injecting CO2 with brine water, and b) horizontal well injection. A tuned combination of these methods can reduce the amount of free CO2 in the aquifer from over 50% to less than 10%.

Figure 1. Dickman field site and description of available data.

Figure 2. Annotated type log for Dickman project area (black circle in Fig. 1).

Figure 3. The chart on the left shows the stratigraphic rank accepted by the Kansas Geological Survey (Sawin et al., 2008); the chart on the right shows correlation to Dickman local stratigraphic units, with higher-confidence correlations in bold. The blue vertical bar indicates the target strata for Dickman research.

Figure 4. Time slice through 3D amplitude volume at 850 ms (approximate top Miss level) showing clear evidence of incised channel. Geological interpretation along highlighted line on right.

Monday, February 7, 2011

Spectral decomposition at Dickman (2)

(C. Liner and J. Seales)

In an earlier post, we showed spectral decomposition (SD) of the Dickman 3D data using simple bandpass filtering. This was done using both Seismic Unix and SMT filtering tools. Here we consider the sensitivity of the SD output to choices of the filter parameters.

A broadband time slice (847 ms) through the data shows a well-defined channel. Aside from a slight amplitude change to the SW, we see no evidence of overbank or secondary channel features.

Figure 2 is an SU plot created by a shell script using a filter centered on 41 Hz. The image shows a clear feature that trends away from the channel to the SW. The filter parameters are shown, along with a dashed yellow outline of the feature under discussion.

Equivalent 41 Hz time slices at 847 ms (Figures 3 and 4) were created using the trapezoid filter in SMT. Apart from grayscale color bar differences between SU and SMT, the time slice features are consistent between the two. Both filters created in SMT were centered on 41 Hz, but Figure 3 uses a 2 Hz top while Figure 4 is more constrained with a 0.2 Hz top. Visually, these results are equivalent, indicating the SD attribute computed in this manner is stable with respect to filter parameter choices.
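
The single-frequency filtering idea can be sketched in NumPy rather than SU or SMT. This toy version uses a boxcar pass band instead of SMT's trapezoid, and the synthetic trace and corner widths are illustrative, not the Dickman parameters:

```python
# Narrow bandpass filtering around a target frequency, the basic move
# behind single-frequency spectral decomposition. Synthetic example.
import numpy as np

dt = 0.002                                   # 2 ms sample interval
t = np.arange(0, 1.0, dt)
trace = np.sin(2*np.pi*41*t) + 0.5*np.sin(2*np.pi*15*t)  # toy trace

def narrowband(trace, dt, f0, half_width):
    """Zero all FFT coefficients outside f0 +/- half_width
    (a boxcar pass band, not SMT's tapered trapezoid)."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    spec[np.abs(freqs - f0) > half_width] = 0.0
    return np.fft.irfft(spec, n=len(trace))

# Analogous to the 2 Hz-top vs 0.2 Hz-top comparison in the text:
wide = narrowband(trace, dt, f0=41.0, half_width=2.0)
tight = narrowband(trace, dt, f0=41.0, half_width=0.5)
rel_diff = np.max(np.abs(wide - tight)) / np.max(np.abs(wide))
print(f"max relative difference: {rel_diff:.3f}")
```

Here the difference is essentially zero because the synthetic tones fall exactly on FFT bins; on real data, spectral leakage puts some energy in the extra bins the wider filter passes, which is consistent with the small differences described below.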

To study detailed variation between these SMT results, subtractive difference volumes were generated using functionality in SMT (Figures 5 and 6). These show there are, indeed, slight differences between the outputs of the two SMT filters. Both figures show the same result, but with opposite amplitudes due to the changed subtraction order. The difference plots show energy in horizontal bands representing acquisition footprint (source orientation), a large fault in the NW corner, a bright karst (sinkhole) feature near the center, and a network of curved lineations of unknown origin. In an absolute sense, the maximum amplitude difference between the 2Hztop and 0.2Hztop filters is on the order of 10% in the mentioned features; otherwise it is less than 2%.

If this kind of spectral decomposition is to be generally useful, it is important to quantify the sensitivity to filter parameters beyond the visual difference described above. To accomplish this, we set out a grid of 27 points in the 3D survey area about 2500 ft apart (Figure 7). At each location, the amplitudes of the 2Hztop and 0.2Hztop results were measured and cross plotted in Excel (Figure 8).

Figure 7. Test point grid for amplitude cross plot between the SMT filter results.

Figure 8. Amplitude cross plot between selected locations in the two SMT filter spectral decomposition results (diagonal line would be exact equality). No significant differences are indicated.
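
The grid-sampling comparison itself is simple to script. Here is a minimal sketch in NumPy (not the Excel workflow used above): the amplitude maps are random stand-ins, and a 36-point grid stands in for the 27 field locations:

```python
# Sample two time-slice amplitude maps at a coarse grid of points and
# quantify how close the cross plot is to the 1:1 line. The maps here
# are illustrative random arrays, not the Dickman filter results.
import numpy as np

rng = np.random.default_rng(0)
slice_a = rng.normal(size=(60, 60))                   # e.g. 2Hztop slice
slice_b = slice_a + 0.01*rng.normal(size=(60, 60))    # nearly identical

# Roughly even sampling grid (6 x 6 = 36 points)
rows = np.arange(5, 60, 10)
cols = np.arange(5, 60, 10)
ii, jj = np.meshgrid(rows, cols)
a = slice_a[ii, jj].ravel()
b = slice_b[ii, jj].ravel()

# Correlation near 1 means the points hug the equality line.
corr = np.corrcoef(a, b)[0, 1]
print(f"correlation: {corr:.3f}")
```

A correlation coefficient computed this way gives a single number summarizing what the cross plot shows visually.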

Finally, we notice that an 847 ms time slice through the SPICE attribute volume (Figure 9) seems to show the same channel feature seen in the 41 Hz spectral decomposition results. We are planning further investigation of this tantalizing relationship.

Figure 9. SPICE attribute time slice at 847 ms showing feature similar to the 41 Hz channel feature.

Thursday, February 3, 2011

Madagascar on Mac OS X

I am experimenting with Madagascar, a public domain processing system descended from SEPlib. My plan is to use seismic unix (SU) as usual for graphics and most analysis, but Madagascar brings some important new functionality that is not in SU: specifically, more 3D applications and parallel computing capability (my Power Mac has 8 cores, a mini cluster).

Here is an outline of how to install Madagascar on Mac OS X. This worked on my PowerBook and desktop Power Mac. The workflow assumes you have already installed the Xcode development tools and gfortran or g95.

(1) Download Madagascar source from sourceforge:

(2) In your downloads folder, double click on the zip file to unpack. A new folder will be created named madagascar-1.1 with all the stuff in it.

(3) Make a directory ~/usr/local (~ means your home directory)

(4) Move the folder from item (2) to ~/usr/local

(5) Install Java for Mac OS X 10.6 Update 3 Developer Package. You must have a mac developer account.

(6) As superuser, use MacPorts to install scons. Note: this will fail while installing db46 unless you have done step (5).

(7) In the directory ~/usr/local/madagascar-1.1 type configure at the command line. This will create a file named env.csh

(8) Assuming you are using the tcsh shell, you will have a file in your home directory named .tcshrc which is a resource file for tcsh. It is invisible to the finder (due to the leading dot), but can be seen and edited using the vi editor. Open the env.csh file and copy its contents (except the leading line) to the bottom of your .tcshrc file. Close the file and quit/restart the terminal application.

(9) As superuser in ~/usr/local/madagascar-1.1 type make and when that completes type make install

(10) Exit superuser and type rehash

(11) If all goes well, you can proceed to test your install as described on this page

Here is a simple test that worked (thanks to some advice from Sergey Fomel):

================ makefile ================

================ terminal output ================

================ Graphics result ================