Commit

Adding Examples
miili authored and Marius Isken committed Jul 1, 2022
1 parent 49a5174 commit 8d0688c
Showing 15 changed files with 292 additions and 106 deletions.
20 changes: 8 additions & 12 deletions .pre-commit-config.yaml
@@ -1,15 +1,11 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.0.1
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
- repo: https://github.com/ambv/black
rev: 21.6b0
hooks:
- id: black
# language_version: python3.6
- repo: local
hooks:
- id: black
name: black
language: system
types: [python]
entry: black
# language_version: python3.6
20 changes: 14 additions & 6 deletions README.md
@@ -2,7 +2,9 @@

*Tools for distributed acoustic sensing and modelling.*

[![PyPI version](https://badge.fury.io/py/lightguide.svg)](https://badge.fury.io/py/lightguide) [![DOI](https://zenodo.org/badge/495774991.svg)](https://zenodo.org/badge/latestdoi/495774991)
[![PyPI version](https://badge.fury.io/py/lightguide.svg)](https://badge.fury.io/py/lightguide)
<a href="https://github.com/psf/black"><img alt="Code style: black" src="https://img.shields.io/badge/code%20style-black-000000.svg"></a>
[![DOI](https://zenodo.org/badge/495774991.svg)](https://zenodo.org/badge/latestdoi/495774991)

Lightguide is a package for handling, filtering and modelling distributed acoustic sensing (DAS) data. The package interfaces handling and processing routines of DAS data to the [Pyrocko framework](https://pyrocko.org). Through Pyrocko's I/O engine :rocket: lightguide supports handling the following DAS data formats:

@@ -29,12 +31,18 @@ pip install lightguide
The adaptive frequency filter (AFK) can be used to suppress incoherent noise in DAS data sets.

```python
from lightguide import orafk_filter
from lightguide import filters
from lightguide.utils import download_numpy, ExampleData

filtered_data = orafk_filter.afk_filter(
data, window_size=32, overlap=15, exponent=0.8, normalize_power=False)

das_data = download_numpy(ExampleData.VSPData)

filtered_data = filters.afk_filter(
das_data, window_size=32, overlap=15, exponent=0.8, normalize_power=False)
```

[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/pyrocko/lightguide/blob/master/examples/1-denoise-DAS-data.ipynb)

The filtering performance of the AFK filter, applied to an earthquake recording at an [ICDP](https://www.icdp-online.org/home/) borehole observatory in Germany. The data was recorded on a [Silixa](https://silixa.com/) iDAS v2. For more details see <https://doi.org/10.5880/GFZ.2.1.2022.006>.

![AFK Filter Performance](https://user-images.githubusercontent.com/4992805/170084970-9484afe7-9b95-45a0-ac8e-aec56ddfb3ea.png)
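
As an editorial sketch (not part of the README shown in this diff), the same kind of raw-versus-filtered comparison can be reproduced with the earthquake example file that this commit exposes as `ExampleData.EQData` in `lightguide/utils.py`; the filter parameters simply mirror the snippet above.

```python
# Sketch only: compare raw and AFK-filtered views of the earthquake example
# data set added to lightguide.utils in this commit.
import matplotlib.pyplot as plt

from lightguide import filters
from lightguide.utils import ExampleData, download_numpy

eq_data = download_numpy(ExampleData.EQData)

filtered = filters.afk_filter(
    eq_data, window_size=32, overlap=15, exponent=0.8, normalize_power=False
)

fig, (ax_raw, ax_filt) = plt.subplots(2, 1, figsize=(10, 10), sharex=True)
ax_raw.set_title("Raw Data")
ax_raw.imshow(eq_data, aspect="auto", interpolation="none")
ax_filt.set_title("AFK Filtered Data")
ax_filt.imshow(filtered, aspect="auto", interpolation="none")
plt.show()
```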
@@ -45,11 +53,11 @@ The filtering performance of the AFK filter, applied to an earthquake recording

Lightguide can be cited as:

> Isken, Marius Paul; Christopher, Wollin; Heimann, Sebastian; Dahm, Torsten (2022): Lightguide - Tools for distributed acoustic sensing.
> Marius Paul Isken, Sebastian Heimann, Christopher Wollin, Hannes Bathke, & Torsten Dahm. (2022). Lightguide - Seismological Tools for DAS data. Zenodo. https://doi.org/10.5281/zenodo.6580579
Details of the adaptive frequency filter are published here:

> Isken, Marius Paul; Vasyura-Bathke, Hannes; Dahm, Torsten; Heimann, Sebastian (2022): De-noising distributed acoustic sensing data using an adaptive frequency-wavenumber filter, Geophysical Journal International.
> Marius Paul Isken, Hannes Vasyura-Bathke, Torsten Dahm, Sebastian Heimann, De-noising distributed acoustic sensing data using an adaptive frequency-wavenumber filter, Geophysical Journal International, 2022;, ggac229, https://doi.org/10.1093/gji/ggac229
## Packaging

95 changes: 95 additions & 0 deletions examples/1-denoise-DAS-data.ipynb
@@ -0,0 +1,95 @@
{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Lightguide Adaptive-Frequency-Filter\n",
"\n",
"In this notebook we de-noise an awefully noisy DAS data set.\n",
"\n",
"The vertical seismic profile shot was recorded in an ICDP borehole in Landwüst, Germany. The fibre is interrogated by a Silixa iDAS v2 and is cemented behind the casing."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"!pip install lightguide"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"First we import the filter function from the `lightguide` module."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from lightguide import filters\n",
"from lightguide.utils import download_numpy, ExampleData\n",
"\n",
"\n",
"das_data = download_numpy(ExampleData.VSPData)\n",
"\n",
"filtered_data = filters.afk_filter(\n",
" das_data, window_size=32, overlap=15, exponent=0.8, normalize_power=False)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"fig, axes = plt.subplots(2, 1, figsize=(10, 10))\n",
"\n",
"ax = axes[0]\n",
"ax.set_title(\"Raw Data\")\n",
"ax.imshow(das_data, aspect=\"auto\", interpolation=\"none\")\n",
"\n",
"ax = axes[1]\n",
"ax.set_title(\"Filtered Data\")\n",
"ax.imshow(filtered_data, aspect=\"auto\", interpolation=\"none\")\n",
"\n",
"fig.show()\n"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3.9.2 ('venv': venv)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.2"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "d6c2f566028469748ec83127a93d0ab1f5f314b33991e2ad2ee6fe2fd330af93"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}
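
A possible follow-up step, sketched by the editor rather than taken from the notebook: band-limit the section with the `butter_bandpass_filter` helper that this commit moves into `lightguide.utils` before running the AFK filter. The 1000 Hz sampling rate and the 2–100 Hz corner frequencies are placeholder values, not read from the data.

```python
# Sketch only: bandpass the DAS section in time before AFK filtering.
# Sampling rate and corner frequencies are illustrative placeholders.
from lightguide import filters
from lightguide.utils import ExampleData, butter_bandpass_filter, download_numpy

das_data = download_numpy(ExampleData.VSPData)

# Per its docstring in lightguide/utils.py, the helper filters along axis 0
# of a [time, space] array; transpose first if the array is [space, time].
band_limited = butter_bandpass_filter(
    das_data, lowcut=2.0, highcut=100.0, sampling_rate=1000.0, order=4
)

denoised = filters.afk_filter(
    band_limited, window_size=32, overlap=15, exponent=0.8, normalize_power=False
)
```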
5 changes: 4 additions & 1 deletion lightguide/__init__.py
@@ -1 +1,4 @@
from . import lightguide as orafk_filter # noqa
import pkg_resources


__version__ = pkg_resources.get_distribution("lightguide").version
34 changes: 4 additions & 30 deletions lightguide/afk_filter.py → lightguide/afk_filter_python.py
@@ -5,40 +5,14 @@

import numpy as np
import numpy.typing as npt
from scipy.signal import butter, lfilter


data = np.load(Path(__file__).parent / "data" / "data-DAS-gfz2020wswf.npy")


def butter_bandpass_filter(
data: npt.NDArray,
lowcut: float,
highcut: float,
sampling_rate: float,
order: int = 4,
) -> npt.NDArray:
"""Butterworth bandpass filter for 2D time-spatial array.
Args:
data (npt.NDArray): Input data as [time, space]
lowcut (float): Low-cut frequency in [Hz].
highcut (float): High-cut frequency in [Hz].
sampling_rate (float): Sampling rate along the time dimension.
order (int, optional): Order of the filter. Defaults to 4.
Returns:
npt.NDArray: _description_
"""
coeff_b, coeff_a = butter(
order, (lowcut, highcut), btype="bandpass", fs=sampling_rate
)
y = lfilter(coeff_b, coeff_a, data, axis=0)
return y


@lru_cache
def triangular_taper(size: int, plateau: int) -> npt.NDArray:
def triangular_taper_python(size: int, plateau: int) -> npt.NDArray:

if plateau > size:
raise ValueError("Plateau cannot be larger than size.")
if size % 2 or plateau % 2:
@@ -52,7 +26,7 @@ def triangular_taper(size: int, plateau: int) -> npt.NDArray:
return window * window[:, np.newaxis]


def afk_filter(
def afk_filter_python(
data: npt.NDArray,
window_size: int = 32,
overlap: int = 14,
@@ -74,7 +48,7 @@
raise ValueError("Padding does not match desired data shape")

filtered_data = np.zeros_like(data)
taper = triangular_taper(window_size, window_non_overlap)
taper = triangular_taper_python(window_size, window_non_overlap)

for iwin_x in range(nwin_x):
px_x = iwin_x * window_stride
2 changes: 2 additions & 0 deletions lightguide/filters.py
@@ -0,0 +1,2 @@
from .lightguide import * # noqa
from .afk_filter_python import * # noqa
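
`lightguide/filters.py` is a two-line re-export shim: it pulls in the compiled Rust extension (`lightguide.lightguide`) alongside the renamed pure-Python implementation. A hedged sketch of exercising both entry points, assuming the Python fallback accepts the same keyword arguments as the Rust `afk_filter` used in the README:

```python
# Sketch only: run both AFK implementations exposed through lightguide.filters
# and check that they agree on the example data set.
import numpy as np

from lightguide.filters import afk_filter, afk_filter_python
from lightguide.utils import ExampleData, download_numpy

das_data = download_numpy(ExampleData.VSPData)

rust_result = afk_filter(
    das_data, window_size=32, overlap=15, exponent=0.8, normalize_power=False
)
python_result = afk_filter_python(
    das_data, window_size=32, overlap=15, exponent=0.8, normalize_power=False
)

print("max abs difference:", np.max(np.abs(rust_result - python_result)))
```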
95 changes: 92 additions & 3 deletions lightguide/utils.py
@@ -1,11 +1,29 @@
from __future__ import annotations
from enum import Enum
from tempfile import SpooledTemporaryFile

import time
import logging
from functools import wraps
from pathlib import Path
from typing import Any, Callable

from pyrocko.trace import Trace
import requests

import numpy as np
import numpy.typing as npt
from pyrocko.trace import Trace
from scipy.signal import butter, lfilter


# create console handler
ch = logging.StreamHandler()
formatter = logging.Formatter("\x1b[80D\x1b[1A\x1b[K%(message)s")
ch.setFormatter(formatter)


class ExampleData:
VSPData = "https://data.pyrocko.org/testing/lightguide/das-data.npy"
EQData = "https://data.pyrocko.org/testing/lightguide/data-DAS-gfz2020wswf.npy"


class AttrDict(dict):
@@ -15,11 +33,25 @@ def __init__(self, *args, **kwargs) -> None:


def traces_to_numpy_and_meta(traces: list[Trace]) -> tuple[npt.NDArray, AttrDict]:
"""Geneare a numpy 2-D array from a list of traces
Args:
traces (list[Trace]): List of input traces
Raises:
ValueError: Raised when the traces have different lengths / start times.
Returns:
tuple[npt.NDArray, AttrDict]: The waveform data and meta information as dict.
"""
if not traces:
raise ValueError("No traces given")
ntraces = len(traces)
nsamples = set(tr.ydata.size for tr in traces)

if len(nsamples) != 1:
raise ValueError("Traces nsamples differ")
raise ValueError("Traces number of samples differ.")

nsamples = nsamples.pop()

data = np.zeros((ntraces, nsamples))
@@ -31,7 +63,64 @@ def traces_to_numpy_and_meta(traces: list[Trace]) -> tuple[npt.NDArray, AttrDict
return data, AttrDict(meta)


def butter_bandpass_filter(
data: npt.NDArray,
lowcut: float,
highcut: float,
sampling_rate: float,
order: int = 4,
) -> npt.NDArray:
"""Butterworth bandpass filter for 2D time-spatial array.
Args:
data (npt.NDArray): Input data as [time, space]
lowcut (float): Low-cut frequency in [Hz].
highcut (float): High-cut frequency in [Hz].
sampling_rate (float): Sampling rate along the time dimension.
order (int, optional): Order of the filter. Defaults to 4.
Returns:
npt.NDArray: Filtered wavefield.
"""
coeff_b, coeff_a = butter(
order, (lowcut, highcut), btype="bandpass", fs=sampling_rate
)
y = lfilter(coeff_b, coeff_a, data, axis=0)
return y


def download_http(url: str, target: Path | SpooledTemporaryFile):
req = requests.get(url)
req.raise_for_status()
total_size = int(req.headers.get("Content-Length", 0))
nbytes = 0

if isinstance(target, Path):
writer = target.write_bytes
elif isinstance(target, SpooledTemporaryFile):
writer = target.write
else:
raise TypeError("Bad target for download")

for data in req.iter_content(chunk_size=4096):
nbytes += len(data)
print(f"\u001b[2KDownloading {url}: {nbytes}/{total_size} bytes", end="\r")

writer(data)
print(f"\u001b[2KDownloaded {url}")


def download_numpy(url: str) -> np.ndarray:
file = SpooledTemporaryFile()
download_http(url, target=file)
file.flush()
file.seek(0)
return np.load(file)


def timeit(func: Callable) -> Callable:
"""A helper decorator to time function execution."""

@wraps(func)
def wrapper(*args, **kwargs) -> Any:
t = time.time()
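
The docstring above describes `traces_to_numpy_and_meta` as turning a list of equally sampled Pyrocko traces into a 2-D array plus shared metadata. A hedged usage sketch follows; the file path is a placeholder and the loaded file is assumed to contain one trace per DAS channel.

```python
# Sketch only: load DAS traces with Pyrocko, convert them to a
# [channel, samples] array and de-noise the section.
# "das-section.mseed" is a placeholder path, not a file from this repository.
from pyrocko import io

from lightguide import filters
from lightguide.utils import traces_to_numpy_and_meta

traces = io.load("das-section.mseed")  # list of pyrocko.trace.Trace objects

data, meta = traces_to_numpy_and_meta(traces)  # meta holds shared header fields
print("array shape:", data.shape)

filtered = filters.afk_filter(
    data, window_size=32, overlap=15, exponent=0.8, normalize_power=False
)
```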
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -13,7 +13,7 @@ authors = [
urls = {GitHub = "https://github.com/pyrocko/lightguide"}
keywords = ["distributed acoustic sensing", "DAS", "seismology", "earthquake modelling"]
license = {file = "LICENSE"}
dependencies = ["pyrocko>=2022.4.28", "numpy>=1.20.0"]
dependencies = ["pyrocko>=2022.4.28", "numpy>=1.20.0", "requests>=2.20.0"]
readme = "README.md"
classifiers = [
"Programming Language :: Rust",
3 changes: 1 addition & 2 deletions setup.py
@@ -24,13 +24,12 @@
],
rust_extensions=[
RustExtension(
"lightguide.orafk_filter",
"lightguide.afk_filter",
path="Cargo.toml",
binding=Binding.PyO3,
debug=False,
)
],
package_data={"lightguide": ["data/*.npy"]},
packages=["lightguide"],
python_requires=">=3.8",
)
File renamed without changes.