This documentation is not for the latest stable Salvus version.

This tutorial is presented as Python code running inside a Jupyter Notebook, the recommended way to use Salvus. To run it yourself you can copy/type each individual cell or directly download the full notebook, including all required files.

Manual Gradient Computation

This notebook teaches how to compute gradients with respect to a mesh's material parameters using only SalvusFlow and not SalvusProject. SalvusProject internally handles many of the complexities but the approach presented here grants a bit more flexibility.

# Variable used in the notebook to determine which site
# is used to run the simulations.
import os
SALVUS_FLOW_SITE_NAME = os.environ.get("SITE_NAME", "local")
%matplotlib inline

import pathlib

import numpy as np
import matplotlib.pyplot as plt
import xarray as xr

import salvus.namespace as sn

Step 1: Generate target and initial model

This example is purely synthetic so we'll generate a target model with a Gaussian blob perturbation in vs in the center of the domain. This will be used to generate the "observed data" in this case. The initial model (used to compute the synthetics for the purpose of this tutorial) is a homogeneous model without the blob.

# Use xarray to define the blob in vs.
def get_target_model():
    x = np.linspace(-500, 500, 200)
    y = np.linspace(-500, 500, 200)
    xx, yy = np.meshgrid(x, y, indexing="ij")

    # Simple gaussian blob around the center of the domain.
    d = (xx ** 2 + yy ** 2) ** 0.5
    vs = np.exp(-0.5 * (d / 100.0) ** 2) * 250.0 + 1000.0

    ds = xr.Dataset(
        data_vars={"vs": (["x", "y"], vs)}, coords={"x": x, "y": y}
    )

    return ds

target_model = get_target_model()
# Visualize the blob (the plotting call in the original cell was
# truncated; `.plot()` on the DataArray reproduces the QuadMesh output).
target_model.vs.plot()
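Before meshing, it can be worth sanity-checking the amplitude of the perturbation. This small standalone sketch just re-evaluates the blob formula from the cell above at a few distances from the center:

```python
import numpy as np

# Re-evaluate the blob formula from `get_target_model` at a few
# distances from the domain center: background vs is 1000 m/s with a
# +250 m/s Gaussian of ~100 m characteristic width.
d = np.array([0.0, 100.0, 500.0])
vs = np.exp(-0.5 * (d / 100.0) ** 2) * 250.0 + 1000.0

print(vs)  # peak of 1250 m/s at the center, decaying toward 1000 m/s
```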

Create a mesh - this is directly used as the initial model.

# Build mesh for the initial model. The parameter values below are
# illustrative assumptions (the original cell was truncated); vs matches
# the background of the target model.
initial_model_mesh = sn.simple_mesh.CartesianHomogeneousIsotropicElastic2D(
    vp=1500.0,
    vs=1000.0,
    rho=2000.0,
    x_max=1000.0,
    y_max=1000.0,
    max_frequency=10.0,
).create_mesh()

# Shift so the domain is centered around the origin, matching the
# [-500, 500] extent of the target model.
initial_model_mesh.points[:, 0] += -500.0
initial_model_mesh.points[:, 1] += -500.0

Now interpolate the blob onto it, thus creating the target mesh. Have a look at the perturbation in vs.

target_model_mesh = initial_model_mesh.copy()

target_model_mesh = sn.toolbox.interpolate_cartesian(
    data=target_model, mesh=target_model_mesh, extrapolate=True
)
target_model_mesh

This is also the starting point for setting up a project for the inversion. We can directly create the project from the model defined above. We will call this model the true_model.

# Explosive source in one corner of the mesh. The exact location and
# moment tensor values are assumptions; the original cell was truncated.
src = sn.simple_config.source.cartesian.MomentTensorPoint2D(
    x=-400.0, y=-400.0, mxx=1.0, myy=1.0, mxy=0.0
)

# A centered ring of receivers.
recs = sn.simple_config.receiver.cartesian.collections.RingPoint2D(
    x=0, y=0, radius=200.0, count=50, fields=["displacement"]
)

# Create two simulation objects.
# (1) The target will serve as observed data.
w_target = sn.simple_config.simulation.Waveform(
    mesh=target_model_mesh, sources=src, receivers=recs
)
w_target.physics.wave_equation.end_time_in_seconds = 1.0

# (2) The initial model will be used to compute synthetics.
w_initial = sn.simple_config.simulation.Waveform(
    mesh=initial_model_mesh, sources=src, receivers=recs
)
w_initial.physics.wave_equation.end_time_in_seconds = 1.0

# Must store the checkpoints for the subsequent adjoint run.
w_initial.output.volume_data.format = "hdf5"
w_initial.output.volume_data.filename = "output.h5"
w_initial.output.volume_data.fields = ["adjoint-checkpoint"]
w_initial.output.volume_data.sampling_interval_in_time_steps = (
    "auto-for-checkpointing"
)

Now run the simulations: once for the target/observed model and once for the initial/current model.

# Create the "observed data". The run calls are reconstructed here (the
# original cells were truncated); output folders and rank count follow
# the job logs below.
j_target = sn.api.run(
    input_file=w_target,
    site_name=SALVUS_FLOW_SITE_NAME,
    output_folder="output_target",
    overwrite=True,
    ranks=4,
)

# Forward run for the synthetics.
j_initial = sn.api.run(
    input_file=w_initial,
    site_name=SALVUS_FLOW_SITE_NAME,
    output_folder="output_initial",
    overwrite=True,
    ranks=4,
    # Don't delete the remote files as they are needed
    # for the adjoint run!
    delete_remote_files=False,
)
Job `job_2007161447435956_9e11a461b6` running on `local` with 4 rank(s).
Site information:
  * Salvus version: 0.11.13
  * Floating point size: 32
* Downloaded 341.1 KB of results to `output_target`.
* Total run time: 2.02 seconds.
* Pure simulation time: 1.47 seconds.
Job `job_2007161447324380_22acf2e5a1` running on `local` with 4 rank(s).
Site information:
  * Salvus version: 0.11.13
  * Floating point size: 32
  -> Current Task: Time loop complete
* Downloaded 341.4 KB of results to `output_initial`.
* Total run time: 5.16 seconds.
* Pure simulation time: 4.97 seconds.
# Load the data from both runs, and visualize.
# Note that this is a vectorial receiver and thus the component
# has to be specified.
target_event = j_target.get_as_event()
initial_event = j_initial.get_as_event()

target_event.plot(receiver_field="displacement", component="X")
initial_event.plot(receiver_field="displacement", component="X")

Data oftentimes has to be processed to make a meaningful misfit measurement. You are free to perform any operation on the observed data - gradient computation/inversion does not care, as it only sees the final result and operates on that. Synthetic data, on the other hand, has to be processed carefully, because structural gradients are partial derivatives of the misfit with respect to the model parameters. As the misfit is computed on the potentially processed synthetic data, that processing has to be accounted for in the adjoint sources.

Salvus can do this fully automatically for temporal weights, e.g. a multiplication of each data sample with a given factor. If such a weight is specified Salvus will take care to apply it correctly to the synthetics as well as the final adjoint source.
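In essence, a temporal weight multiplies each sample of the synthetics before the misfit is evaluated, and by the chain rule the same weight then also appears in the adjoint source. A minimal NumPy sketch of this bookkeeping for an L2 misfit (plain arrays only, not Salvus objects):

```python
import numpy as np

# Toy time axis, "observed" and "synthetic" data (illustrative only).
t = np.linspace(0.0, 1.0, 101)
observed = np.sin(2.0 * np.pi * 5.0 * t)
synthetic = np.sin(2.0 * np.pi * 5.0 * t + 0.1)

# A boxcar weight with linear ramps, non-zero roughly for 0.1 <= t <= 0.7.
w = np.clip((t - 0.1) / 0.1, 0.0, 1.0) * np.clip((0.7 - t) / 0.1, 0.0, 1.0)

# The misfit is measured on the *weighted* data ...
residual = w * synthetic - w * observed
misfit = 0.5 * np.sum(residual ** 2)

# ... so the chain rule puts the weight onto the adjoint source as well:
# d misfit / d synthetic = w * (w * synthetic - w * observed).
adjoint_source = w * residual
```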

The temporal weights enable classical processing steps like muting/damping of phases and windowing. At the SalvusFlow level it is specified as a function that is passed to the EventData objects.

import typing

import numpy as np
import obspy

from salvus.project.tools.data_selection import compute_window

def compute_temporal_weights(
    st: obspy.Stream,
    receiver: sn.simple_config.receiver._Base,
    sources: typing.List[sn.simple_config.source._Base],
) -> typing.Dict[str, typing.List[np.ndarray]]:
    """
    Function computing temporal weights.

    This example function will taper to zero after the first arrival
    which is just computed geometrically in this case.

    Args:
        st: The input data, one trace per component. Don't
            modify it!
        receiver: The receiver for the data.
        sources: All sources for this event. Just one source
            in many cases.

    Returns:
        A dictionary with a list of weight sets per component.
    """
    # This project only has a single source per event.
    # Does not yet account for the source time function.
    distance = np.linalg.norm(
        np.array(receiver.location) - np.array(sources[0].location)
    )

    first_arrival = distance / 1000.0 - 0.1  # Account for STF
    taper_length = 0.1

    weights = {}
    for tr in st:
        component = tr.stats.channel[-1]
        times = tr.times() + tr.stats.starttime.timestamp

        # Only a single window for each trace. The exact arguments of
        # `compute_window` are assumed here; the original cell was
        # truncated.
        weights[component] = [
            {
                "values": compute_window(
                    t=times,
                    window_center=first_arrival + 0.05,
                    window_width=2.0 * taper_length,
                    ramp_width=taper_length,
                )
            }
        ]

    return weights
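Conceptually, each weight set is just a smooth window around the first arrival. A standalone stand-in for `compute_window` (names and linear ramps are illustrative assumptions, not the Salvus implementation) looks like this:

```python
import numpy as np

def linear_ramp_window(times, window_center, half_width, ramp):
    """Window that is 1 on [center - half_width, center + half_width]
    and ramps linearly to 0 over `ramp` seconds on either side."""
    left = np.clip((times - (window_center - half_width - ramp)) / ramp, 0.0, 1.0)
    right = np.clip(((window_center + half_width + ramp) - times) / ramp, 0.0, 1.0)
    return left * right

times = np.linspace(0.0, 1.0, 201)
# Flat between 0.3 s and 0.4 s, zero outside [0.2 s, 0.5 s].
w = linear_ramp_window(times, window_center=0.35, half_width=0.05, ramp=0.1)
```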

# Can be hooked up to the events, which will then process the data on
# the fly. (Calls reconstructed; the original cell was truncated.)
initial_event.register_temporal_weights_function(compute_temporal_weights)
target_event.register_temporal_weights_function(compute_temporal_weights)

# Now the processing is always applied for any subsequent action.
initial_event.plot(receiver_field="displacement", component="X")
target_event.plot(receiver_field="displacement", component="X")