Thursday, June 10, 2010

Teaching Statement

Trite as it may sound, the goal of teaching is for students to learn. The courses I now teach are graduate classes and tailored to that audience. When it makes sense for a subject, I like to emphasize project work with real data, teamwork, writing, and presentation skills. Below I discuss a couple of specific examples.

As an indication of quality, I was ranked 2nd for teaching in the 2010 internal faculty evaluation of the EAS Department, a large group (27) that includes three past winners of the college outstanding teacher award.

Not all teaching is in the classroom, of course. One-on-one work with graduate students is essential and I run two open meetings per week for students I advise. I also maintain a blog with helpful information and notices.

During my 15 years at The University of Tulsa I taught a wide array of courses. These included freshman-level Physical Geology and The Earth in Space (which I developed). Owing to the small faculty size, many courses were taught as mixed undergraduate/graduate classes. In this category were Petroleum Seismology, Advanced Petroleum Seismology, Advanced Global Seismology, and Environmental Geophysics. In addition, Advanced Seismic Data Processing and Seismic Stratigraphy were pure graduate classes. The Environmental Geophysics course included field exercises using electrical conductivity and ground-penetrating radar equipment. Most of these graduate courses had fewer than eight students, while at the University of Houston there were 49 in my spring 2010 Geophysical Data Analysis class (the largest graduate course in the department). One has to be flexible and patient to deal with these extremes.

At the University of Houston I have been involved in the 2009 and 2011 Geophysical Field Camps held near Red Lodge, Montana. My field work at camp has involved 2D and 3D seismic surveys using single- and multicomponent receivers and various sources. In the classroom at UH, I routinely teach the graduate classes 3D Seismic Interpretation I (synthetic seismograms, gridding, horizon tracking, faults, prospect generation) and Geophysical Data Analysis (the science behind seismic data processing).

3D Seismic Interpretation I is a good example of how I teach. It would be possible to teach such a class in a traditional lecture format, but that would leave students unprepared to do significant work themselves on, say, a thesis project. So I break the class into three major assignments. Each is a mini-workshop lasting 5-7 class periods, involves working with real data in a commercial interpretation system, and ends in a hard deadline. The hard deadline is intentional, to ramp up the sense of importance and simulate real-life deadlines for company projects, lease sales, and proposal submissions. The first assignment is not orally presented, but is graded for slide style, clarity, logic, and completeness. Assignment 2 is more ambitious, on a shorter deadline, and graded more strictly on slide elements. This process can lead to some truly remarkable improvement in the mechanics of learning sophisticated software and preparing a presentation.

The last assignment is to generate a prospect using industry Gulf of Mexico data. Two class periods are used to generate three leads, one of which is chosen at random as 'the prospect'. This project requires fault mapping, structural interpretation, amplitude analysis, reserve estimates, and economics. In the fashion of the industry, prospects are named and will be promoted to an audience.

Once the prospect presentations are locked down (deadline again!), each student presents his/her prospect to the class in random order and completes an evaluation form on every other prospect. In a class of 30, this process takes five class periods. At the end of each, the class votes for Best Prospect of the Day. When all presentations are complete, the daily winners give a brief review and the class votes for Best Prospect. To make it fun, daily winners get a small surprise gift and the Best Prospect is awarded a trophy I buy at a local charity thrift store. The fall 2010 Best Prospect was Ambrosia and the trophy was a ceramic owl signifying wisdom. The last requirement of each student is to turn in one evaluation form that I randomly specify. Not knowing which will be asked for, they must attend and stay engaged for all presentations.

This is a lot of detail, but the details make teaching effective or not. What does the student have at the end of this course? A respect for deadlines, the ability to jumpstart into complex software, knowledge of petroleum prospecting as well as structural, stratigraphic, and amplitude interpretation of modern 3D seismic data, and tangible evidence of this knowledge in the form of a presentation to show recruiters.

Some classes do not fit the workshop format. My other recurring graduate class is Geophysical Data Analysis, a survey course on the physics and computing methods behind seismic data processing. The class opens with each student receiving a blank sheet of paper on which to write their name and a number called out as we count off around the room. The papers are collected and I show a list of numbered topics (statics, fractures, velocity analysis, etc.). Thus every student is assigned a random subject, and I emphasize that anyone who has already studied their subject will get a different one. There are three assignments related to the subject: 1) write a four-page SEG-style abstract, 2) prepare a 15-minute presentation, and 3) prepare a one-panel poster. For full credit the presented work must include a computational aspect done by the student. Hard deadlines are staged for each item in the order shown. The first 70% of the course is in lecture format and the remainder is presentations (again in random order, with peer-evaluation forms and a single, randomly specified turn-in).

In this way the traditional material is taught along with the method and capability for quickly digging into a new subject (including bibliography, state of the field, recent progress), and students learn to present the new knowledge to a group of peers.

Another theme in my teaching is exposure to open source software. Research often uses powerful commercial software such as Matlab and Mathematica, as well as compiled languages. However, free, open source packages like SeismicUnix, ImageJ, and MeVisLab are launchpads for building custom tools in an advanced development environment. I maintain an open source blog to help students get started.

In summary, I enjoy the variety and constant challenge of teaching, and my methods will continue to evolve.

------------------------- Selected Student Comments -------------------------


Research Statement

At the University of Houston, most of my students work on some form of seismic interpretation. This can be fairly basic (horizon mapping in 3D seismic for CO2 sequestration) or very advanced (seismic simulation from flow simulator output, or deep amplitude anomalies in the Gulf of Mexico).

My published work includes a textbook, peer-reviewed publications, dozens of meeting abstracts, and many general-audience articles (I was a World Oil contributing editor for 2010). There are two central themes to my peer-reviewed body of work.

First is the broad field of wave propagation, including phenomena, numerical modeling, and inversion.  The second theme is digital data analysis including seismic data processing, multichannel time series analysis, and image processing.

My earliest work centered on a wave equation framework for prestack partial migration (also known as dip moveout, or DMO), a theory that formed important middle ground between poststack and prestack migration during the 1980s as computer power was ramping up to handle the full prestack imaging problem. Earlier researchers had established the kinematic, or traveltime, aspects of DMO, but amplitude had not previously been tied to the physics of wave propagation. A side benefit of my DMO work was an amplitude-preserving relationship for inverse DMO that can be used for interpolation or regularization of prestack data.

In the theory of DMO amplitude, one comes across the idea of 2.5-dimensional wave propagation. The concept is a wave that lives in the coordinate system of a 2D seismic line but has accurate 3D amplitude behavior. In the course of my investigations, I discovered a 2.5D wave equation with exactly these properties. It turned out to be a form of the Klein-Gordon equation well known in quantum mechanics. It is not often that one finds a previously unpublished, fundamental wave equation in classical wave theory.

Another discovery relates to Rayleigh waves. There were three ways of computing the wave speed for the constant-parameter isotropic case: analytically solve the Rayleigh polynomial, numerically solve it, or use the rough approximation that Rayleigh wave speed is 92% of the shear wave speed. The first two are tricky because the polynomial has multiple roots that may contribute to the solution, while the 92% rule is not accurate. I was able to expand the Rayleigh polynomial about the 92% rule and derive an expression for the wave speed that is accurate across the entire range of parameters encountered in seismic exploration. The new expression is far better than the 92% rule and more straightforward than solving Rayleigh's polynomial.
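For orientation, the exact speed can also be found numerically from the Rayleigh cubic, and compared directly with the 92% rule. The short Python sketch below is a brute-force check with illustrative velocities, not the closed-form expression described above:

import numpy as np

def rayleigh_speed(vp, vs):
    """Numerically solve the Rayleigh cubic for the surface-wave speed.
    With xi = (c/vs)**2 and g = (vs/vp)**2, the squared secular equation is
    xi**3 - 8*xi**2 + (24 - 16*g)*xi - 16*(1 - g) = 0."""
    g = (vs / vp) ** 2
    roots = np.roots([1.0, -8.0, 24.0 - 16.0 * g, -16.0 * (1.0 - g)])
    # keep the real root with 0 < xi < 1 (the Rayleigh speed is below vs)
    xi = [r.real for r in roots if abs(r.imag) < 1e-9 and 0 < r.real < 1][0]
    return vs * np.sqrt(xi)

vp, vs = 2500.0, 1200.0          # illustrative velocities, m/s
print(rayleigh_speed(vp, vs))    # exact numerical Rayleigh speed
print(0.92 * vs)                 # the rough 92% rule, off by a few percent here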

In 2002 I became interested in time-frequency methods and their application to seismic processing and analysis. This is the topic of my SEG Distinguished Instructor Short Course (DISC) to be given worldwide in 2012, along with a book in preparation under the working title Seismic Dispersion. Working in 2004 with PhD student Chun-Feng Li (now at Tongji University), I used the continuous wavelet transform to generate a seismic attribute we termed SPICE (spectral imaging of correlative events). The idea is to compute a pointwise estimate of singularity strength (Hölder exponent) at every time level on every trace in a 2D or 3D migrated data volume. The result is a remarkable view of geologic features in the data that are difficult or impossible to interpret otherwise. SPICE was patented by the University of Tulsa and commercialized by Fairfield Industries.
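The published SPICE algorithm has details not reproduced here, but the core idea, tracking how continuous wavelet transform magnitude scales with wavelet scale at each time sample, can be sketched briefly. The Python snippet below is a rough illustration using the PyWavelets package; the wavelet choice, scale range, and the 0.5 offset (which depends on the CWT normalization convention) are assumptions for illustration, not the patented SPICE recipe:

import numpy as np
import pywt

def singularity_attribute(trace, scales=np.arange(2, 33)):
    """Rough pointwise singularity-strength estimate for one trace.
    Near a singularity of Holder exponent h, an L2-normalized CWT behaves
    like |W(a, t)| ~ a**(h + 1/2), so h can be estimated from the slope of
    log|W| versus log(scale) at each time sample."""
    coefs, _ = pywt.cwt(trace, scales, 'mexh')       # shape (nscales, nsamples)
    log_mag = np.log(np.abs(coefs) + 1e-12)
    log_a = np.log(scales.astype(float))
    slope = np.polyfit(log_a, log_mag, 1)[0]         # one slope per time sample
    return slope - 0.5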

In a series of papers between 2006 and 2009 I worked with Saudi Aramco colleagues on layer-induced anisotropy, anisotropic prestack depth migration, and near-surface parameter estimation. In particular, I was able to show that modern full-wave sonic logs, which deliver both P-wave and S-wave velocities, are ideally suited to estimating anisotropy parameters through a method originally published by Backus in 1962. The estimated anisotropy parameters were shown to improve prestack depth migration results.
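For readers unfamiliar with the approach, the Backus average turns fine-layer isotropic moduli into the five elastic constants of an equivalent transversely isotropic medium, from which Thomsen anisotropy parameters follow. The Python sketch below is a minimal, generic version of that calculation; the window length, units, and variable names are illustrative assumptions, not the exact workflow from the papers:

import numpy as np

def backus_thomsen(vp, vs, rho, nwin=15):
    """Backus (1962) average of isotropic fine layers -> VTI constants,
    then Thomsen parameters. vp, vs in m/s, rho in kg/m**3; nwin is the
    running-average window in samples (an assumption, tune to log spacing)."""
    mu = rho * vs**2
    lam = rho * vp**2 - 2.0 * mu

    def avg(x):                                   # simple running mean
        return np.convolve(x, np.ones(nwin) / nwin, mode='same')

    c33 = 1.0 / avg(1.0 / (lam + 2 * mu))
    c44 = 1.0 / avg(1.0 / mu)
    c66 = avg(mu)
    c13 = c33 * avg(lam / (lam + 2 * mu))
    c11 = avg(4 * mu * (lam + mu) / (lam + 2 * mu)) + c33 * avg(lam / (lam + 2 * mu))**2

    eps = (c11 - c33) / (2 * c33)                                      # Thomsen epsilon
    gam = (c66 - c44) / (2 * c44)                                      # Thomsen gamma
    dlt = ((c13 + c44)**2 - (c33 - c44)**2) / (2 * c33 * (c33 - c44))  # Thomsen delta
    return eps, dlt, gam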

A fundamental area of current research is carbon capture and sequestration. In 2008 I stepped in as principal investigator on a CO2 sequestration site characterization project and have since been the lead CO2 sequestration researcher at the University of Houston. Our DOE-funded study site in Ness County, Kansas, involves 3D seismic data, over 140 wells, digital well logs, production data, and more. My CO2 research team has progressed from seismic interpretation through geological model building to scenario testing (cap rock integrity, fault and well bore leakage) and flow simulation spanning several hundred years. Site characterization and monitoring will rely heavily on geophysics as carbon capture begins on a large scale in the United States and worldwide. In 2010 I initiated research coordination with the CIUDEN carbon capture and sequestration project in Spain, one of the largest in Europe.

Time-frequency methods are central to ongoing areas of research.  In a recent paper with B. Bodmann, we re-examined a 1937 result by A. Wolf showing that reflection from a vertical transition zone results in a frequency-dependent reflection coefficient. Using modern analytical tools and methods, we showed how such phenomena could be detected and mined for important information. Another significant time-frequency application I developed is direct imaging of group velocity dispersion curves for observed data in shallow water settings around the world.  Phase velocity dispersion curves have been imaged since 1981, but this was the first reported method to image the group velocity curves that contain rich information about seafloor properties.

A theory of full-bandwidth signal recovery from zero-crossings of the short-time Fourier transform is now under development in collaboration with B. Bodmann. The first paper was delivered at a 2010 American Mathematical Society meeting.

My research will continue in the broad field of advanced seismic interpretation, wave propagation, signal analysis, and time-frequency methods to develop new tools and deeper insight into petroleum seismic data, CO2 sequestration problems, and near surface characterization.

Tuesday, June 8, 2010

House of Mirrors

[Note: A version of this blog entry will appear in World Oil (June, 2010)]

Let's take a ride on a seismic wave. The setting is offshore and a cylindrical steel airgun is just now charging up, the pressure building till a signal triggers the release of compressed air. The gas expands rapidly, generating a bubble and pressure wave in the water; a seismic wave is born. The wave takes off in every direction, but let's follow it straight down toward the seafloor.

This wave has a certain amplitude, the excess pressure above ambient conditions as it passes by, and a corresponding amount of energy. Striking the seafloor, it splits into two parts, an upgoing reflected wave and a downgoing transmitted wave. Whether the seafloor is hard or soft, not much of the wave passes through the water-sediment interface. This may seem surprising since we have all seen those beautiful 3D seismic offshore images, and that must come from waves that made it through the seafloor. But the fact is that much of the wave action does not get through.

Even in the case of a muddy, soft seafloor, something like 45% of the wave is reflected back up into the water. For a hard seafloor, we can expect 60% or more of the wave to reflect. Now remember, we are following a vertical wave so this reflection heads back toward the ocean surface. The trip upward is not very exciting till the wave hits the water/air interface (or 'free surface').

You might think that, with air being so compressible and so much less dense than water, the sound wave would pass right on through. In fact, the opposite is true. Nothing gets through: the wave reflects at full strength back down to the seafloor, where about half of it bounces again, then all of that reflects once more from the free surface, and so on. The multiples are periodic, each round trip taking the same amount of time. Have you ever been on an elevator with mirrors on two sides? Remember the infinite copies of yourself that peer back? That is the situation a receiver in the water sees as these waves ping back and forth between seafloor and surface. These are called free-surface multiples since they are bouncing off the air/water interface. Another kind of multiple can occur deeper in the earth, where extra bounces are taken in one or more layers. These are termed internal multiples.
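For reference, the normal-incidence reflection coefficient behind both bounces is just the impedance contrast, R = (Z2 - Z1)/(Z2 + Z1), where Z is density times velocity. A few lines of Python show the arithmetic with assumed, round-number properties; the exact percentages quoted above depend on what one takes for the sediment density and velocity:

# Normal-incidence reflection coefficient R = (Z2 - Z1) / (Z2 + Z1), with Z = rho * v
def refl(rho1, v1, rho2, v2):
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

water = (1000.0, 1500.0)         # kg/m**3, m/s
soft_mud = (1800.0, 1700.0)      # assumed soft-seafloor properties
hard_rock = (2300.0, 3000.0)     # assumed hard-seafloor properties
air = (1.2, 340.0)

print(refl(*water, *soft_mud))   # a few tenths; exact value depends on the mud properties
print(refl(*water, *hard_rock))  # larger, around 0.6 or more
print(refl(*water, *air))        # about -1: total reflection at the free surface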

Multiples of any kind are big trouble for seismic imaging, for a couple of reasons. First, it is not hard to imagine that our free-surface multiple will continue for as long as we record. That means eventually there will be a weak reflection we want from somewhere deep in the earth, and a multiple is likely to crash right into it. It is all too common around the world that our delicate, carefully nurtured reservoir reflection sits under a big, strong multiple. The second reason imaging suffers is more subtle. For half a century researchers have been devising ever better ways to image (or migrate) seismic data. But almost all of these methods have one thing in common: only primary (one-bounce) reflections are used for imaging, not multiples. So our data are full of multiples, but we consider them noise, not signal. That means multiples have to be removed before migration.

So how do we get rid of these multiples? The classic method is deconvolution, a word that covers a lot of ground. It can be used for spectral whitening, source signature removal, wave shaping, and many other tasks. The application of decon to multiple removal goes back to the earliest days of digital seismic processing. The method also goes by the more descriptive name of prediction error filtering.

As an analogy, imagine that you are walking across a parking lot on a completely dark night. As you pace along, you suddenly bump your toe on a curb. Stepping over it, you continue and hit another curb farther on, and another. After a while, you start counting steps and find you can predict the next curb; not well at first, but you get better and finally can avoid them entirely. In effect, this is what the mathematical machinery of deconvolution does when processing a seismic trace. Based on earlier data values, decon predicts what will come a few time steps ahead, then goes there, checks the value, updates the prediction, and keeps trying. Since our multiple is periodic, decon will be able to predict it and thus remove it from the data. Reflections due to deeper geology, however, are by nature unpredictable and will not be removed. Decon has the remarkable ability to find the multiple pattern, or patterns, embedded in a random sequence of geological reflections. Very clever.
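For the technically inclined, the essence of predictive decon fits in a few lines. The sketch below builds a toy trace whose events ring with a fixed water-bottom period, designs a gapped prediction (Wiener) filter from the trace autocorrelation, and subtracts the predictable part. All numbers are made up and the code is a bare-bones illustration, not production deconvolution:

import numpy as np
from scipy.linalg import toeplitz

# Toy trace: four "geologic" primaries, each ringing with a 120-sample
# water-bottom period (free-surface multiples with alternating polarity).
nt, P = 800, 120
primaries = np.zeros(nt)
for t, a in [(70, 1.0), (130, -0.6), (310, 0.4), (520, 0.3)]:
    primaries[t] = a
reverb = np.zeros(nt)
for k in range(4):
    reverb[k * P] = (-0.5) ** k
trace = np.convolve(primaries, reverb)[:nt]

# Gapped prediction-error filter: from the autocorrelation, design a filter
# that predicts the trace 'gap' samples ahead, then keep only the residual.
gap, nf = P - 2, 30
r = np.correlate(trace, trace, mode='full')[nt - 1:]    # one-sided autocorrelation
R = toeplitz(r[:nf]) + 0.001 * r[0] * np.eye(nf)        # prewhitened normal equations
f = np.linalg.solve(R, r[gap:gap + nf])

predicted = np.zeros(nt)
for k, fk in enumerate(f):
    predicted[gap + k:] += fk * trace[:nt - gap - k]
decon = trace - predicted    # periodic multiples attenuated, primaries largely untouched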

Going back to the dark parking lot, what if we change the situation so the repeating curbs are very far apart. So far, in fact, that as you walk across the lot you only hit one or two. In that case you would not be able to figure out the repeat pattern because you get too few looks at it. In the same way, decon is great for short-period multiples from shallow water, but fails for long-period multiples that occur in deep water. Over the last 20 years or so, a completely different multiple removal technique has been developed for exactly this situation. It has the cumbersome name of surface-related multiple elimination (SRME).

SRME is a vital tool in the modern exploration for deep water resources. It can handle much more complicated cases than the simple, vertical multiple we described above, including situations where decon breaks down, such as seafloor topography, non-vertical waves, and 3D scattering effects. The way SRME works is by considering the raypath of a particular free-surface multiple and breaking it down to look like two primary reflections glued together. This insight allows a free-surface multiple that shows up only once to be neatly and effectively removed. It has led to a vast improvement in detailed image quality in the deep water Gulf of Mexico and elsewhere.
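In one dimension, "two primaries glued together" literally becomes the recorded data convolved with itself, since the trace's own events supply both legs of each surface multiple. The toy sketch below (same spirit as the decon example, with made-up numbers) predicts multiples that way and removes them with a single least-squares scale factor; real SRME operates on full multi-source, multi-receiver data, accounts for the source wavelet, and uses windowed adaptive subtraction:

import numpy as np

# Toy 1D "recorded" trace: two hypothetical primaries plus their first-order
# surface multiples (free-surface coefficient -1, so multiples = -p*p).
nt = 800
p = np.zeros(nt)
p[150], p[400] = 0.3, 0.2
data = p - np.convolve(p, p)[:nt]

# SRME-style prediction: the data convolved with itself contains every
# combination of two primaries joined at the free surface.
predicted = np.convolve(data, data)[:nt]

# Crude adaptive subtraction: a single least-squares scale factor, which also
# absorbs the sign of the prediction. Real SRME uses short matching filters in
# windows (and often iterates) to handle the wavelet and higher-order errors.
alpha = np.dot(predicted, data) / np.dot(predicted, predicted)
demultiple = data - alpha * predicted    # multiples attenuated, primaries preserved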