This documentation is not for the latest stable Salvus version.

This tutorial is presented as Python code running inside a Jupyter Notebook, the recommended way to use Salvus. To run it yourself you can copy/type each individual cell or directly download the full notebook, including all required files.

This notebook teaches how to compute gradients with respect to a mesh's material parameters using only SalvusFlow and not SalvusProject. SalvusProject internally handles many of the complexities but the approach presented here grants a bit more flexibility.

# Variable used in the notebook to determine which site
# is used to run the simulations.
import os
SALVUS_FLOW_SITE_NAME = os.environ.get("SITE_NAME", "local")
%matplotlib inline

import pathlib

import numpy as np
import matplotlib.pyplot as plt
import xarray as xr

import salvus.namespace as sn

## Step 1: Generate target and initial model

This example is purely synthetic, so we'll generate a target model with a Gaussian blob perturbation in $v_s$ at the center of the domain. This will be used to generate the "observed data". The initial model (used to compute the synthetics in this tutorial) is a homogeneous model without the blob.

# Use xarray to define the blob in vs.
def get_target_model():
    x = np.linspace(-500, 500, 200)
    y = np.linspace(-500, 500, 200)
    xx, yy = np.meshgrid(x, y, indexing="ij")

    # Simple Gaussian blob around the center of the domain.
    d = (xx ** 2 + yy ** 2) ** 0.5
    vs = np.exp(-0.5 * (d / 100.0) ** 2) * 250.0 + 1000.0

    ds = xr.Dataset(
        data_vars={"vs": (["x", "y"], vs)},
        coords={"x": x, "y": y},
    )

    return ds

target_model = get_target_model()
target_model.vs.plot()
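As a quick sanity check on the formula above: the perturbation peaks at 1250 m/s at the domain center and decays back to the 1000 m/s background away from the blob. A minimal numpy check of the blob expression from `get_target_model()`:

```python
import numpy as np

# The blob formula from get_target_model():
# vs = exp(-0.5 * (d / 100)^2) * 250 + 1000,
# with d the distance from the domain center.
def vs_at(d):
    return np.exp(-0.5 * (d / 100.0) ** 2) * 250.0 + 1000.0

print(vs_at(0.0))    # 1250.0 at the center
print(vs_at(500.0))  # essentially the 1000 m/s background
```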

Create a mesh - this is directly used as the initial model.

# Build mesh for the initial model.
initial_model_mesh = sn.simple_mesh.CartesianHomogeneousIsotropicElastic2D(
    vp=1500.0,
    vs=1000.0,
    rho=2000.0,
    x_max=float(target_model.x.data.ptp()),
    y_max=float(target_model.y.data.ptp()),
    max_frequency=10.0,
    tensor_order=1,
).create_mesh()

# Shift.
initial_model_mesh.points[:, 0] += target_model.x.data.min()
initial_model_mesh.points[:, 1] += target_model.y.data.min()
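The mesh generator builds the mesh on $[0, x_{max}] \times [0, y_{max}]$, so shifting all points by the model's minimum coordinates aligns the mesh with the $[-500, 500]$ frame of the xarray model. A quick numpy check of that bookkeeping:

```python
import numpy as np

x = np.linspace(-500.0, 500.0, 200)
extent = float(np.ptp(x))              # 1000.0, used as x_max above
mesh_x = np.linspace(0.0, extent, 5)   # stand-in for the mesh coordinates
mesh_x += x.min()                      # the shift applied above

print(mesh_x.min(), mesh_x.max())  # -500.0 500.0
```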

Now interpolate the blob onto it, thus creating the target mesh. Have a look at the perturbation in $v_s$.

target_model_mesh = initial_model_mesh.copy()

target_model_mesh = sn.toolbox.interpolate_cartesian(
    data=target_model, mesh=target_model_mesh, extrapolate=True
)
target_model_mesh


## Step 2: Setup the simulation with a single source and a ring of receivers

# Explosive source in one corner of the mesh.
src = sn.simple_config.source.cartesian.MomentTensorPoint2D(
    mxx=10000.0,
    myy=10000.0,
    mxy=0.0,
    x=300.0,
    y=300.0,
    source_time_function=sn.simple_config.stf.Ricker(center_frequency=5.0),
)

# A centered ring of receivers.
# (The original call is truncated here; radius and count are
# illustrative values.)
recs = sn.simple_config.receiver.cartesian.collections.RingPoint2D(
    x=0.0, y=0.0, radius=200.0, count=100, fields=["displacement"]
)

# Create two simulation objects.
# (1) The target will serve as observed data.
w_target = sn.simple_config.simulation.Waveform(
    mesh=target_model_mesh, sources=src, receivers=recs
)
w_target.physics.wave_equation.end_time_in_seconds = 1.0
w_target

# (2) The initial model will be used to compute synthetics.
# `recs` is the receiver ring defined above.
w_initial = sn.simple_config.simulation.Waveform(
    mesh=initial_model_mesh, sources=src, receivers=recs
)
w_initial.physics.wave_equation.end_time_in_seconds = 1.0

# Must store the checkpoints for the subsequent adjoint run.
w_initial.output.volume_data.format = "hdf5"
w_initial.output.volume_data.filename = "output.h5"
w_initial.output.volume_data.sampling_interval_in_time_steps = (
    "auto-for-checkpointing"
)

## Step 3: Run Simulations

For the target/observed model and the initial/current model.

# Create the "observed data".
j_target = sn.api.run(
    input_file=w_target,
    site_name=SALVUS_FLOW_SITE_NAME,
    output_folder="output_target",
    overwrite=True,
)

# Forward run for the synthetics.
j_initial = sn.api.run(
    input_file=w_initial,
    site_name=SALVUS_FLOW_SITE_NAME,
    output_folder="output_initial",
    overwrite=True,
    # Don't delete the remote files as they are needed
    # for the subsequent adjoint run.
    delete_remote_files=False,
)
Job job_2007051732030582_cdda2f34e0 running on local with 4 rank(s).
Site information:
* Salvus version: 0.11.12
* Floating point size: 32


* Downloaded 341.1 KB of results to output_target.
* Total run time: 4.14 seconds.
* Pure simulation time: 3.72 seconds.
Job job_2007051732039265_034bc6b964 running on local with 4 rank(s).
Site information:
* Salvus version: 0.11.12
* Floating point size: 32


* Downloaded 341.4 KB of results to output_initial.
* Total run time: 6.07 seconds.
* Pure simulation time: 5.74 seconds.

# Load the data from both runs, and visualize.
#
# Note that this is a vectorial receiver and thus the component
# has to be specified.
target_event = j_target.get_as_event()
initial_event = j_initial.get_as_event()

initial_event.plot(receiver_field="displacement", component="X")

## Step 4: Data Processing

Data often has to be processed before a meaningful misfit measurement can be made. You are free to perform any operation on the observed data - gradient computation/inversion does not care, as it only sees the final result and operates on that. Synthetic data, on the other hand, has to be processed carefully, because structural gradients are partial derivatives of the misfit with respect to the model parameters. As the misfit is computed on the (potentially processed) synthetic data, that processing has to be accounted for in the adjoint sources.

Salvus can do this fully automatically for temporal weights, i.e. a multiplication of each data sample with a given factor. If such a weight is specified, Salvus takes care of applying it correctly to the synthetics as well as to the final adjoint source.
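Conceptually, a temporal weight is nothing more than a per-sample factor multiplied into a trace. A minimal sketch with plain numpy (illustrative only, not the Salvus API):

```python
import numpy as np

t = np.linspace(0.0, 1.0, 101)
data = np.sin(2.0 * np.pi * 5.0 * t)  # some synthetic trace
weight = np.where(t < 0.5, 1.0, 0.0)  # mute everything after 0.5 s

# Sample-wise multiplication - this is all a temporal weight does.
weighted = data * weight
```

Because the weight enters multiplicatively, the same factor can be propagated to the adjoint source via the chain rule, which is exactly the bookkeeping Salvus automates.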

The temporal weights enable classical processing steps like muting/damping of phases and windowing. At the SalvusFlow level it is specified as a function that is passed to the EventData objects.

import typing

import numpy as np
import obspy

from salvus.toolbox import utils

def compute_temporal_weights(
    st: obspy.Stream,
    sources: typing.List[sn.simple_config.source._Base],
) -> typing.Dict[str, typing.List[np.ndarray]]:
    """
    Function computing temporal weights.

    This example function will taper to zero after the first arrival
    which is just computed geometrically in this case.

    Args:
        st: The input data, one trace per component. Don't
            modify it!
        sources: All sources for this event. Just one source
            in many cases.

    Returns:
        A dictionary with a list of weight sets per component.
    """
    # This example only has a single source per event.
    # Does not yet account for the source time function.
    # (The original call is truncated here; as the receivers form a
    # ring around the origin, the source-to-origin distance serves
    # as the geometric estimate.)
    distance = np.linalg.norm(np.array(sources[0].location))

    first_arrival = distance / 1000.0 - 0.1  # Account for STF
    taper_length = 0.1

    weights = {}
    for tr in st:
        component = tr.stats.channel[-1]
        times = tr.times() + tr.stats.starttime.timestamp

        # Only a single window for each trace.
        weights[component] = [
            {
                "values": utils.compute_window(
                    t=times,
                    window_center=first_arrival + 0.05,
                    window_width=0.2,
                    ramp_width=0.1,
                )
            }
        ]

    return weights
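For intuition, `utils.compute_window` produces a boxcar of width `window_width` centered at `window_center`, with smooth ramps of width `ramp_width` on each side. A hedged numpy approximation of that shape (illustrative only, not the Salvus implementation):

```python
import numpy as np

def approx_window(t, window_center, window_width, ramp_width):
    """Boxcar with cosine ramps; an illustrative stand-in for
    salvus.toolbox.utils.compute_window."""
    w = np.zeros_like(t)
    half = window_width / 2.0
    lo, hi = window_center - half, window_center + half
    # Flat top of the window.
    w[(t >= lo) & (t <= hi)] = 1.0
    # Cosine ramp up on the left side.
    left = (t >= lo - ramp_width) & (t < lo)
    w[left] = 0.5 * (1.0 + np.cos(np.pi * (lo - t[left]) / ramp_width))
    # Cosine ramp down on the right side.
    right = (t > hi) & (t <= hi + ramp_width)
    w[right] = 0.5 * (1.0 + np.cos(np.pi * (t[right] - hi) / ramp_width))
    return w

t = np.linspace(0.0, 1.0, 1001)
w = approx_window(t, window_center=0.4, window_width=0.2, ramp_width=0.1)
```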

# Can be hooked up to the event that will then process
# the data on the fly.
initial_event.register_temporal_weights_function(compute_temporal_weights)
target_event.register_temporal_weights_function(compute_temporal_weights)

# Now the processing is always applied for any subsequent action.
target_event.plot(receiver_field="displacement", component="X")