
Array-based seismic waveform coherency measurement, simulation, and application in evaluating back-projections - & - The Importance of Paleobathymetry in Understanding the Long-Term Carbon Cycle through Variations in Carbonate Compensation Depths throughout the Last 100 Myr


Oct. 28, 2020, noon - 1 p.m.
---

Presented By:
Tong Zhou,
&
Matthew Bogumil


Tong Zhou: Array-based seismic waveform coherency measurement, simulation, and application in evaluating back-projections

Array-based seismic source imaging methods such as back-projection are routinely applied to large-earthquake investigation and earthquake hazard assessment. The back-projection (BP) method locates the high-frequency energy radiators in the fault region by aligning and stacking coherent seismic phases, which represent the fault rupture process. However, what the BP radiators actually image is still under debate, along with the accuracy and resolution of BPs, especially when waveform complexity is present. Two candidate controls to which BPs may be sensitive are slip acceleration and rupture-speed change. To test BP methods and determine which fault properties BP is most sensitive to, we need to simulate realistic incoherent waveforms. In this work, we first measure the fluctuation of coherence with time, frequency, and interstation distance across an array, using moderate-size earthquakes that can be treated as point sources at teleseismic distances. We then propose two methods, (1) multiple plane waves and (2) multiple Born scatterers, to fit the coherence fluctuation statistically. These semi-analytical methods capture the statistical features of the incoherent coda waves and save significant computation time compared to numerical simulations. With them, we are able to test the resolvability of various rupture features, including earthquake nucleation, bilateral propagation, fault jumping, and supershear transition, and to develop methods for improving BP images, e.g., slowness calibration and artifact mitigation, and even to build possible relationships between asperity sizes and coherence fluctuations.
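The alignment-and-stacking step at the heart of BP can be illustrated with a minimal delay-and-sum sketch. The station geometry, delay values, and noise level below are invented for illustration and are not the implementation described in the talk:

```python
import numpy as np

def back_project(waveforms, delays):
    """Toy delay-and-sum back-projection.

    waveforms: (n_sta, n_samp) array of station records.
    delays: (n_src, n_sta) predicted travel-time delays in samples,
            one row per trial source position.
    Returns the stacked energy for each trial source; a trial source
    whose delays align the coherent phase stacks constructively.
    """
    n_src, _ = delays.shape
    energy = np.zeros(n_src)
    for i in range(n_src):
        stack = np.zeros(waveforms.shape[1])
        for j in range(waveforms.shape[0]):
            # Undo each station's predicted delay, then stack.
            stack += np.roll(waveforms[j], -delays[i, j])
        energy[i] = np.sum(stack ** 2)
    return energy

# Synthetic test: one pulse recorded at 5 stations with known moveout.
rng = np.random.default_rng(0)
n_sta, n_samp = 5, 400
true_delays = np.array([0, 4, 8, 12, 16])            # samples
waveforms = 0.05 * rng.standard_normal((n_sta, n_samp))
pulse = np.exp(-0.5 * ((np.arange(n_samp) - 100) / 3.0) ** 2)
for j in range(n_sta):
    waveforms[j] += np.roll(pulse, true_delays[j])

# Compare the true delay pattern against a deliberately wrong one.
trial_delays = np.array([true_delays,
                         true_delays + np.array([0, -6, 3, -9, 6])])
energy = back_project(waveforms, trial_delays)
# Correct alignment stacks coherently and yields the larger energy.
```

Incoherent scattered coda, the focus of the talk, degrades exactly this constructive stacking, which is why simulating realistic waveform incoherency matters for assessing BP resolution.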
---

Matthew Bogumil: The Importance of Paleobathymetry in Understanding the Long-Term Carbon Cycle through Variations in Carbonate Compensation Depths throughout the Last 100 Myr

Seafloor spreading and cooling of oceanic lithosphere result in a constantly evolving bathymetry. First-order changes occurred throughout the Cenozoic as plate speeds slowed by a factor of two and major plates reorganized. We evaluate the role of period-accurate bathymetry distributions, at global and basin scales, on the carbonate compensation depth (CCD). To analyze the effects of bathymetry on the carbon cycle, we focus here on the Late Paleocene. Using the LOSCAR earth system model, we find a strong bathymetric dependence of the CCD at both global and basin scales. Steady-state snapshots at 60 Ma reveal that the Indian, Pacific, and Atlantic basin CCDs are ~1 km deeper than previous estimates, while the Tethys CCD is over 2 km deeper. Variations in the initial riverine flux, an uncertain climate parameter, can potentially reconcile global CCD predictions with ocean core sample data. Our study demonstrates the need to reconcile the interpretive climate parameters used in climate modeling with realistic bathymetric reconstructions. The addition of evolving bathymetry proves necessary when studying the long-term climate and carbon cycles through models such as LOSCAR, GEOCLIM, and DCESS. Consequently, the chosen bathymetry reconstructions need to be justified for, and among, climate studies to improve interpretations of these cycles.
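As a rough illustration of why the seafloor area-depth distribution matters for the CCD, consider a toy steady-state balance in which carbonate burial on seafloor shallower than the CCD must match the riverine carbonate input. The hypsometries and flux numbers below are invented for illustration and are not LOSCAR's actual parameterization:

```python
import numpy as np

def find_ccd(depths, area_frac, rain_flux, riverine_flux):
    """Toy steady-state CCD estimate.

    depths: depth-bin centers in km, increasing downward.
    area_frac: fraction of seafloor area in each depth bin (sums to 1).
    rain_flux, riverine_flux: carbonate fluxes in consistent units.
    Returns the shallowest depth at which cumulative burial above that
    depth balances the riverine input.
    """
    # Burial if the CCD sat at the bottom of each successive bin:
    burial = rain_flux * np.cumsum(area_frac)
    idx = np.searchsorted(burial, riverine_flux)
    return depths[min(idx, len(depths) - 1)]

depths = np.linspace(0.5, 6.0, 12)                      # km
# Two hypothetical hypsometries with identical total area:
shallow_hyps = np.array([2, 3, 4, 5, 6, 6, 5, 4, 3, 2, 1, 1], float)
deep_hyps = shallow_hyps[::-1].copy()                   # area skewed deeper
shallow_hyps /= shallow_hyps.sum()
deep_hyps /= deep_hyps.sum()

rain, river = 1.0, 0.6                                  # illustrative fluxes
ccd_shallow = find_ccd(depths, shallow_hyps, rain, river)
ccd_deep = find_ccd(depths, deep_hyps, rain, river)
# Skewing the same seafloor area deeper pushes the balancing CCD deeper,
# which is the qualitative effect of an evolving bathymetry on the CCD.
```

This is only a cartoon of the burial-input balance; LOSCAR additionally tracks carbonate chemistry, dissolution kinetics, and basin exchange, but the sketch shows why swapping in a period-accurate bathymetry shifts the predicted CCD.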