.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "tutorials/ImportLocalDatabase.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_tutorials_ImportLocalDatabase.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_tutorials_ImportLocalDatabase.py:


Import from Local Database
==========================

This tutorial illustrates how to import waveform data from a local database
of continuous waveforms into PyGLImER. As we don't have actual offline data
available, we will download some continuous data from two stations:
station IU-HRV (`Adam Dziewonski Observatory
<http://www.seismology.harvard.edu/hrv.html>`_) and the Dutch station NL-HGN.
These data are then sliced into the time windows in which arrivals from
teleseismic events are expected, which PyGLImER determines in a preceding
step.

Downloading Event Catalogue & Feeding in Offline Data
------------------------------------------------------

Here, we will again use the :class:`pyglimer.waveform.request.Request` class.
The first method from this class that we are going to use is the public
method :func:`pyglimer.waveform.request.Request.download_evtcat`, which
downloads an event catalogue containing all wanted earthquakes. This method
is launched automatically upon initialization (same as in the download
tutorials).

To initialize said class, we set up a parameter dictionary with all the
needed information. Let's look at the expected parameters:

.. GENERATED FROM PYTHON SOURCE LINES 32-35

.. code-block:: default


    # sphinx_gallery_thumbnail_number = 1
    # sphinx_gallery_dummy_images = 1


.. GENERATED FROM PYTHON SOURCE LINES 36-37

First, let's get a path where to create the data.

.. GENERATED FROM PYTHON SOURCE LINES 37-80

.. code-block:: default


    # Some needed imports
    import os
    from typing import List

    from obspy import UTCDateTime, read, read_inventory
    import obspy

    from pyglimer.waveform.request import Request

    # Get notebook path for future reference of the database:
    try:
        db_base_path = ipynb_path
    except NameError:
        try:
            db_base_path = os.path.dirname(os.path.realpath(__file__))
        except NameError:
            db_base_path = os.getcwd()

    # Define file locations
    proj_dir = os.path.join(db_base_path, 'tmp', 'database_sac_import')

    request_dict = {
        # Necessary arguments
        'proj_dir': proj_dir,
        'raw_subdir': os.path.join('waveforms', 'raw'),  # Directory of the raw waveforms
        'prepro_subdir': os.path.join('waveforms', 'preprocessed'),  # Directory of the preprocessed waveforms
        'rf_subdir': os.path.join('waveforms', 'RF'),  # Directory of the receiver functions
        'statloc_subdir': 'stations',  # Directory of the station information
        'evt_subdir': 'events',  # Directory of the events
        'log_subdir': 'log',  # Directory for the logs
        'loglvl': 'DEBUG',  # Logging level; 'DEBUG' is the most verbose option
        'format': 'sac',  # Format to save the database in
        "phase": "P",  # 'P' or 'S' receiver functions
        "rot": "RTZ",  # Coordinate system to rotate to
        "deconmeth": "waterlevel",  # Deconvolution method
        "starttime": UTCDateTime(2021, 1, 10, 3, 0, 0),  # Start time of your data
        "endtime": UTCDateTime(2021, 1, 10, 5, 0, 0),  # End time of your data
        # kwargs below
        "pol": 'v',  # Source wavelet polarisation. Def. "v" --> SV
        "minmag": 5.0,  # Earthquake minimum magnitude. Def. 5.5
        "event_coords": None,  # Restrict to specific event coordinates? Def. None
        "evtcat": None,  # If you have already downloaded a set of events
                         # previously, you can use them here
    }
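Before initializing the request, it can help to see where these settings will
place data on disk. The short sketch below is not part of the original
tutorial; it simply resolves the subdirectory entries of ``request_dict``
against ``proj_dir`` (both defined in the block above).

.. code-block:: python

    # Purely illustrative: print the absolute directories the request will use.
    for key in ('raw_subdir', 'prepro_subdir', 'rf_subdir',
                'statloc_subdir', 'evt_subdir', 'log_subdir'):
        print(f"{key:16s} -> {os.path.join(proj_dir, request_dict[key])}")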
.. GENERATED FROM PYTHON SOURCE LINES 81-83

Now that all parameters are in place, let's initialize the
:class:`pyglimer.waveform.request.Request` class.

.. GENERATED FROM PYTHON SOURCE LINES 83-87

.. code-block:: default


    # Initializing the Request class and downloading the data
    R = Request(**request_dict)


.. rst-class:: sphx-glr-script-out

.. code-block:: none

    2023-03-29 10:32:48,905 - INFO - Successfully obtained 1 events


.. GENERATED FROM PYTHON SOURCE LINES 88-90

The initialization will look for all events for which data is available.
To check whether the events make sense, we plot a map of the catalogue:

.. GENERATED FROM PYTHON SOURCE LINES 90-101

.. code-block:: default


    import matplotlib.pyplot as plt

    from pyglimer.plot.plot_utils import plot_catalog
    from pyglimer.plot.plot_utils import set_mpl_params

    # Setting plotting parameters
    set_mpl_params()

    # Plotting the catalog
    plot_catalog(R.evtcat)


.. image-sg:: /tutorials/images/sphx_glr_ImportLocalDatabase_001.png
   :alt: ImportLocalDatabase
   :srcset: /tutorials/images/sphx_glr_ImportLocalDatabase_001.png
   :class: sphx-glr-single-img


.. GENERATED FROM PYTHON SOURCE LINES 102-103

We can also quickly check how many events we gathered.

.. GENERATED FROM PYTHON SOURCE LINES 103-106

.. code-block:: default


    print(f"There are {len(R.evtcat)} available events")


.. rst-class:: sphx-glr-script-out

.. code-block:: none

    There are 1 available events


.. GENERATED FROM PYTHON SOURCE LINES 107-118

Preliminary steps
-----------------

This will be the most complex step because it requires you to write two
functions:

1. A function that yields ObsPy streams.
2. A function that yields ObsPy inventories.

**WARNING:** At each iteration, both generators have to yield information for
the same station; consequently, they also need to have the same length.

These functions could, for example, look like this:

.. GENERATED FROM PYTHON SOURCE LINES 118-130

.. code-block:: default


    def yield_st_dummy(list_of_waveform_files: List[os.PathLike]):
        for file in list_of_waveform_files:
            yield obspy.read(file)


    def yield_inventory_dummy(list_of_station_files: List[os.PathLike]):
        for file in list_of_station_files:
            yield obspy.read_inventory(file)


.. GENERATED FROM PYTHON SOURCE LINES 131-149

**NOTE:** This also requires you to convert/compile your information into
formats that ObsPy can read. To create StationXML files, follow ObsPy's
tutorial on creating a StationXML file from scratch.

The StationXML needs to contain the following information:

1. Network and station code
2. Latitude, longitude, and elevation
3. Azimuth of the channel/location to do the rotation
4. (Optional; only needed if you set remove_response=True) Station response
   information

The headers of the traces need to contain the following:

1. sampling_rate
2. starttime, endtime
3. Network, station, and channel code (location code arbitrary)

To convert seismic data from unusual formats to mseed or sac, we recommend
using Pyrocko or ObsPy.

We will use two hours of data that come with PyGLImER.

.. GENERATED FROM PYTHON SOURCE LINES 149-171

.. code-block:: default


    def yield_st():
        static = os.path.join(
            db_base_path, 'static_data', 'database_sac', 'waveforms',
            'local_db')
        networks = ['IU']
        stations = ['HRV']
        for net, stat in zip(networks, stations):
            st = read(os.path.join(static, net, stat, 'hrv10.mseed'))
            yield st


    def yield_inv():
        static = os.path.join(
            db_base_path, 'static_data', 'database_sac', 'waveforms',
            'local_db')
        networks = ['IU']
        stations = ['HRV']
        for net, stat in zip(networks, stations):
            inv = read_inventory(os.path.join(static, net, stat, 'hrv_stat.xml'))
            yield inv
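For a real local archive, the two generators would typically walk the same
directory tree in the same order, so that streams and inventories stay
paired. A minimal sketch, assuming a purely hypothetical layout
``<archive_root>/<NET>/<STA>/waveforms.mseed`` with a matching
``station.xml`` next to each waveform file (names chosen for illustration
only), could look like this:

.. code-block:: python

    from glob import glob


    def yield_st_local(archive_root: str):
        # Hypothetical layout: <archive_root>/<NET>/<STA>/waveforms.mseed
        for path in sorted(
                glob(os.path.join(archive_root, '*', '*', 'waveforms.mseed'))):
            yield read(path)


    def yield_inv_local(archive_root: str):
        # Walk the same files in the same sorted order, so that the n-th
        # inventory always belongs to the n-th stream.
        for path in sorted(
                glob(os.path.join(archive_root, '*', '*', 'waveforms.mseed'))):
            yield read_inventory(
                os.path.join(os.path.dirname(path), 'station.xml'))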
.. GENERATED FROM PYTHON SOURCE LINES 172-182

Import waveform data and station information
---------------------------------------------

The hard part is done! The actual import into PyGLImER is now easy.
To do so, we use the public method of ``Request``: ``import_database()``.
This method will do the following:

1. Find times with teleseismic arrivals of our desired phase.
2. Slice time windows around these arrivals.
3. Do a first, fast preprocessing.
4. Save traces and station information in the desired ``format`` (mseed or asdf).

.. GENERATED FROM PYTHON SOURCE LINES 182-185

.. code-block:: default


    R.import_database(yield_st, yield_inv)


.. rst-class:: sphx-glr-script-out

.. code-block:: none

    2023-03-29 10:32:49,190 - INFO - Computing theoretical times of arrival. Save Inventories
    2023-03-29 10:32:49,198 - INFO - Checking IU.HRV
    2023-03-29 10:32:49,198 - DEBUG - IU.HRV
    2023-03-29 10:32:49,221 - INFO - Slice data in chunks and save them into PyGLImER database.


.. GENERATED FROM PYTHON SOURCE LINES 186-187

Let's just check how many teleseismic arrivals were found in these two hours
of data.

.. GENERATED FROM PYTHON SOURCE LINES 187-197

.. code-block:: default


    from glob import glob

    # Path to where the miniseed files are stored
    data_storage = os.path.join(
        proj_dir, 'waveforms', 'raw', 'P', '**', '*.mseed')

    # Print output
    print(f"Number of found teleseismic arrivals: "
          f"{len(glob(data_storage, recursive=True))}")


.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Number of found teleseismic arrivals: 1


.. GENERATED FROM PYTHON SOURCE LINES 198-226

**NOTE:** From here on, the steps are identical to those in the download
tutorials.

The final step to get your receiver function data is the preprocessing.
Although it is hidden in a single function,
:func:`pyglimer.waveform.request.Request.preprocess`, a lot of decisions are
being made.

Processing steps:

1. Clip the waveform to the right length (tz before and ta after the
   theoretical arrival).
2. Demean & detrend.
3. Tapering.
4. Remove the instrument response, convert to velocity & simulate the
   Harvard station.
5. Rotation to NEZ and, subsequently, to RTZ.
6. Compute the SNR for highpass-filtered waveforms (highpass frequency
   defined in qc.lowco). If the SNR is lower than in qc.SNR_criteria for all
   filters, the waveform is rejected.
7. Write the finished and filtered waveforms to the folder specified in
   qc.outputloc.
8. Write an info file with shelve containing station, event, and waveform
   information.
9. (Optional) If we had chosen a different coordinate system in ``rot`` than
   RTZ, the preprocessed waveform information would now be cast into that
   very coordinate system.
10. Deconvolution with the method ``deconmeth`` from our dict is performed.

It again uses the request class to perform this. The ``if __name__ ...``
expression is needed for running these examples.

.. GENERATED FROM PYTHON SOURCE LINES 226-229

.. code-block:: default


    R.preprocess(hc_filt=1.5, client='single')


.. rst-class:: sphx-glr-script-out

.. code-block:: none

    2023-03-29 10:32:49,245 - INFO - ...Preprocessing initialiased...
    2023-03-29 10:32:49,245 - DEBUG - USING SINGLE CORE BACKEND
      0%|          | 0/1 [00:00


.. GENERATED FROM PYTHON SOURCE LINES 288-289

Let's zoom into the first 20 seconds (~200 km).

.. GENERATED FROM PYTHON SOURCE LINES 289-292

.. code-block:: default


    rftrace.plot(lim=[0, 20])


.. image-sg:: /tutorials/images/sphx_glr_ImportLocalDatabase_003.png
   :alt: ImportLocalDatabase
   :srcset: /tutorials/images/sphx_glr_ImportLocalDatabase_003.png
   :class: sphx-glr-single-img
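If you want to keep this zoomed view, one generic option is to grab the
current matplotlib figure after the plot call and write it to disk. This is
plain matplotlib, not a PyGLImER API; the output filename below is arbitrary.

.. code-block:: python

    # Save the figure produced by the last plot call (generic matplotlib).
    rftrace.plot(lim=[0, 20])
    plt.gcf().savefig(os.path.join(proj_dir, 'rf_IU_HRV_zoom.png'), dpi=300)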
.. GENERATED FROM PYTHON SOURCE LINES 293-298

Plot RF section
+++++++++++++++

Since we have an entire stream of receiver functions at hand, we can plot a
section:

.. GENERATED FROM PYTHON SOURCE LINES 298-301

.. code-block:: default


    rfstream.plot(scalingfactor=1)


.. image-sg:: /tutorials/images/sphx_glr_ImportLocalDatabase_004.png
   :alt: ImportLocalDatabase
   :srcset: /tutorials/images/sphx_glr_ImportLocalDatabase_004.png
   :class: sphx-glr-single-img


.. GENERATED FROM PYTHON SOURCE LINES 302-304

Similar to the single-RF plot, we can provide time and epicentral distance
limits:

.. GENERATED FROM PYTHON SOURCE LINES 304-311

.. code-block:: default


    timelimits = (0, 20)  # seconds
    epilimits = (32, 36)  # epicentral distance in degrees
    rfstream.plot(
        scalingfactor=0.25, lim=timelimits, epilimits=epilimits,
        linewidth=0.75)


.. image-sg:: /tutorials/images/sphx_glr_ImportLocalDatabase_005.png
   :alt: ImportLocalDatabase
   :srcset: /tutorials/images/sphx_glr_ImportLocalDatabase_005.png
   :class: sphx-glr-single-img


.. GENERATED FROM PYTHON SOURCE LINES 312-314

By increasing the scaling factor and removing the plotted lines, we can
already see trends:

.. GENERATED FROM PYTHON SOURCE LINES 314-320

.. code-block:: default


    rfstream.plot(
        scalingfactor=0.5, lim=timelimits, epilimits=epilimits,
        line=False)


.. image-sg:: /tutorials/images/sphx_glr_ImportLocalDatabase_006.png
   :alt: ImportLocalDatabase
   :srcset: /tutorials/images/sphx_glr_ImportLocalDatabase_006.png
   :class: sphx-glr-single-img


.. GENERATED FROM PYTHON SOURCE LINES 321-322

As simple as that, you can create your own receiver functions with just a
single small script.


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 1.844 seconds)


.. _sphx_glr_download_tutorials_ImportLocalDatabase.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: ImportLocalDatabase.py <ImportLocalDatabase.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: ImportLocalDatabase.ipynb <ImportLocalDatabase.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_