
Hockley Fault in Houston


Over time, faults accumulate stress that is released as earth masses slide past one another. Even though this movement sometimes occurs at slow rates, the release of the accumulated stress has damaging effects on a city's infrastructure. Houston is the fourth most populated city in the U.S., so a thorough understanding of its fault systems is crucial to ensuring the stability of its infrastructure.

Introduction

The Hockley fault underlies Highway 290, the Houston Premium Outlet mall, and various neighborhoods, all of which will eventually be damaged by the fault's movement. In 2013, the AGL crew acquired seismic data across the fault with the help of a mini-vibe (a small vibroseis source).

Processing Steps

At first glance, …

The first processing step was a simple bandpass filter; after trial and error I chose the following parameters: 30 to 40 Hz on the low end and 70 to 90 Hz on the high end. Following the bandpass, I removed a spike in the signal at 60 Hz with a notch filter whose transition bands ranged from 55 to 58 Hz and from 62 to 65 Hz (both filters are sketched in code below).

My next step in processing was deconvolution. I applied a predictive deconvolution with a 120 ms prediction lag, but the resulting data were poor, so I decided to continue processing without deconvolution.

Figure 3: 120 ms predictive deconvolution removed valuable data.

Now that I had cleaner data, my next step was velocity analysis. I picked my velocities based on what flattened the data best while staying true to the principle that velocity should increase with depth. Figure 4 displays a section of my velocity picks, while Figure 5 shows the resulting velocity section.

Figure 4: Velocity picks.
Figure 5: Velocity section.

Once the data have been filtered and passed through velocity analysis, they are finally ready to be stacked. For this project a brute stack was used; this kind of stack adds all the flattened reflections across each of the CMPs to create one stacked section.
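For illustration, the filtering step above can be sketched in a few lines of Python with NumPy and SciPy. This is a minimal sketch, not the processing software actually used: the sampling rate and the synthetic trace are assumptions, while the corner frequencies come from the parameters chosen above.

```python
# Minimal sketch of the bandpass + 60 Hz notch filtering described above.
# The 1 kHz sampling rate and the synthetic trace are assumptions; the
# corner frequencies come from the report.
import numpy as np
from scipy import signal

fs = 1000.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)   # 2 s synthetic trace (assumed)
trace = np.random.randn(t.size) + np.sin(2 * np.pi * 60 * t)  # noise + 60 Hz hum

# Bandpass: pass roughly 40-70 Hz, with 30-40 Hz and 70-90 Hz ramps.
# A zero-phase Butterworth bandpass approximates that trapezoidal filter.
sos_bp = signal.butter(4, [40, 70], btype="bandpass", fs=fs, output="sos")
filtered = signal.sosfiltfilt(sos_bp, trace)

# Notch: reject the 60 Hz powerline spike. Q = 15 gives a ~4 Hz-wide
# reject band, matching the 55-58 / 62-65 Hz transition ramps above.
b_n, a_n = signal.iirnotch(60.0, Q=15.0, fs=fs)
filtered = signal.filtfilt(b_n, a_n, filtered)
```

Zero-phase filtering (filtfilt/sosfiltfilt) avoids shifting reflection times, which matters for the velocity analysis that follows.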
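The velocity analysis and brute stack can likewise be sketched for a single CMP gather. Everything here (sample interval, offsets, the constant stacking velocity, the random gather) is an assumed stand-in for the real survey geometry; the point is the hyperbolic moveout correction and the trace summation.

```python
# Hedged sketch of NMO correction and a brute stack for one CMP gather,
# assuming a constant stacking velocity; shapes and values are illustrative.
import numpy as np

dt = 0.002                                   # sample interval, s (assumed)
nt, noffsets = 1000, 24                      # samples, traces per CMP (assumed)
offsets = np.linspace(50, 1200, noffsets)    # source-receiver offsets, m (assumed)
gather = np.random.randn(nt, noffsets)       # stand-in for a real CMP gather
v = 1800.0                                   # stacking velocity, m/s (assumed)

t0 = np.arange(nt) * dt                      # zero-offset two-way times
nmo = np.zeros_like(gather)
for j, x in enumerate(offsets):
    # Hyperbolic moveout: t(x) = sqrt(t0^2 + x^2 / v^2)
    tx = np.sqrt(t0**2 + (x / v) ** 2)
    # Flatten the trace: the output sample at t0 takes the input amplitude at t(x)
    nmo[:, j] = np.interp(tx, t0, gather[:, j], left=0.0, right=0.0)

# Brute stack: sum (average) the flattened traces into one output trace
stacked_trace = nmo.mean(axis=1)
```

In practice the velocity varies with time and CMP position, as in the picks of Figures 4 and 5; a constant v simply keeps the sketch short.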

Further processing includes migration, which moves reflections to their correct locations in the x-y-time space of the seismic data. I found that the clarity of this particular fault was greatly improved after applying migration.

Figure 7: Migrated data, zoomed in to 800 ms.

After the data have been migrated, the next step is to apply a time-to-depth conversion. This converts the vertical axis from time in milliseconds (ms) to depth in meters (m). This process is extremely important in determining the depth and extent of geologic features such as this fault. (Code sketches of the migration and time-to-depth steps follow the interpretation below.)

Figure 8: Time-to-depth conversion, zoomed in to 800 meters.

Interpretation

Based on my processing, I only started to see faults after the brute stack, and even then only faintly. The main features that I interpret as faults are the diagonal discontinuities that I have marked with yellow arrows.

Figure 9: Fault-like diagonal discontinuities in the brute stack.
Figure 10: Faults become more visible after migration.
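To make the migration step concrete, here is a hedged sketch of the simplest variant, a constant-velocity, zero-offset Kirchhoff time migration. The report does not state which migration algorithm was used, so this is illustrative only; the velocity, trace spacing, and synthetic input are assumptions, and real implementations add aperture limits and amplitude weighting.

```python
# Hedged sketch of a constant-velocity, zero-offset Kirchhoff time migration.
# No aperture or obliquity weighting is applied; all values are assumptions.
import numpy as np

def kirchhoff_time_migration(section, dx, dt, v):
    """Sum each input trace along its diffraction hyperbola into the output."""
    nt, nx = section.shape
    t0 = np.arange(nt) * dt                   # zero-offset two-way times
    out = np.zeros_like(section)
    for ix in range(nx):                      # output image position
        for jx in range(nx):                  # input trace position
            h = (jx - ix) * dx                # lateral distance, m
            # Two-way diffraction traveltime: t = sqrt(t0^2 + (2h/v)^2)
            tx = np.sqrt(t0**2 + (2.0 * h / v) ** 2)
            out[:, ix] += np.interp(tx, t0, section[:, jx], left=0.0, right=0.0)
    return out / nx

# Illustrative use on a synthetic section (spacing and velocity are assumed)
section = np.random.randn(1000, 60)
migrated = kirchhoff_time_migration(section, dx=5.0, dt=0.002, v=1800.0)
```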
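The time-to-depth conversion is simpler: each time sample is mapped to a depth by integrating half the velocity over two-way time. The interval-velocity function and sample intervals below are illustrative assumptions, not values from the survey.

```python
# Minimal sketch of time-to-depth conversion for a stacked trace, assuming
# a simple interval-velocity function of time (values are illustrative).
import numpy as np

dt = 0.002
nt = 1000
t = np.arange(nt) * dt                      # two-way time, s
trace = np.random.randn(nt)                 # stand-in for a migrated trace
v_int = 1500.0 + 800.0 * t                  # interval velocity v(t), m/s (assumed)

# Depth of each time sample: integrate v(t)/2 over two-way time
depth = np.cumsum(v_int * dt / 2.0)

# Resample the trace onto a regular depth axis for display
dz = 2.0                                    # depth sample interval, m (assumed)
z = np.arange(0, depth[-1], dz)
trace_depth = np.interp(z, depth, trace)
```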
