import os
import typing
import numpy as np
import xarray as xr
import salvus.namespace as sn
from salvus.project.tools.processing.seismology import (
get_bandpass_filter_processing_function,
)
n_ranks = 2
resolved_period = 100.0
salvus_flow_site_name = os.environ.get("SITE_NAME", "local")
d = sn.domain.dim2.CircularDomain(radius_in_meter=6371e3)
# !rm -rf proj # Uncomment to restart from scratch.
p = sn.Project.from_domain(path="proj", domain=d, load_if_exists=True)
d.plot()
We use the SideSet... variant of a source here. Accurately locating sources and receivers with respect to deformed mesh surfaces is, in general, a nonlinear problem. The SideSet... specification instructs Salvus to solve an optimization problem before any simulations begin, so that points are accurately located relative to a reference surface. Here the "r1" specification implies that the surface of the Earth is our coordinate reference.

src = [
sn.simple_config.source.seismology.SideSetMomentTensorPoint2D(
mrr=1e21,
mpp=1e21,
mrp=-1e21,
longitude=45.0,
depth_in_m=10e3,
side_set_name="r1",
)
]
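For contrast, a source can also be specified directly in Cartesian coordinates, in which case no surface-relative optimization is required. The sketch below is purely illustrative: the coordinate conversion is a rough assumption, and the moment-tensor components would in practice need rotating into the Cartesian frame.

# Illustrative sketch only: specify the source directly in Cartesian
# coordinates instead of relative to the "r1" side set. The coordinates and
# tensor components below are assumptions, not part of the tutorial setup.
x = (6371e3 - 10e3) * np.cos(np.deg2rad(45.0))
y = (6371e3 - 10e3) * np.sin(np.deg2rad(45.0))
src_cartesian = sn.simple_config.source.cartesian.MomentTensorPoint2D(
    mxx=1e21, myy=1e21, mxy=-1e21, x=x, y=y
)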
The same SideSet... prefix applies here, and a depth of 0 means that the receivers will be placed exactly at the surface.

lons = np.linspace(175.0, 185.0, 101)
recs = [
sn.simple_config.receiver.seismology.SideSetPoint2D(
longitude=l,
depth_in_m=0.0,
side_set_name="r1",
fields=["velocity"],
station_code=f"{_i:06d}",
)
for _i, l in enumerate(lons)
]
Sources and receivers are then combined into an EventCollection. Adding this to our project means that the data will be saved to disk, so we don't need to worry about accidentally overwriting objects we've previously created. Additionally, it allows us to shut down the notebook and come back later to pick up our work.

p.add_to_project(sn.EventCollection.from_sources(sources=src, receivers=recs))
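To confirm what has been registered, the project can be queried for its events; a small sketch, assuming p.events.list() is available in your Salvus version.

# List the event names now stored in the project (method name assumed).
print(p.events.list())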
"event_0000"
, "event_0001"
, and so on.p.domain.plot(events=p.events.get("event_0000"))
The xarray package eases this process even further. xarray natively supports reading models saved as CF-compliant NetCDF files, including many of those stored in the IRIS Earth Model Catalogue. Here we'll create a model from scratch, which parameterizes a Gaussian heterogeneity in units of percent perturbation. First, let's initialize the xarray data structure. The model here extends 1000 km from the surface and spans longitudes 170 to 190. We'll perturb "VP" in this case, and initialize the perturbation array to 0.

lons = np.linspace(170, 190, 101)
deps = np.linspace(0, 1000e3, 101)
ll, dd = np.meshgrid(lons, deps, indexing="ij")
raw_model = xr.Dataset(
coords={"longitude": lons, "depth": deps},
data_vars={
"VP": (["longitude", "depth"], np.zeros_like(ll), {"units": "%"})
},
attrs={
"geospatial_lon_units": "degrees_east",
"geospatial_vertical_units": "m",
},
)
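If you already have a model stored as a CF-compliant NetCDF file (for example one downloaded from the IRIS Earth Model Catalogue), a dataset of the same form could instead be read directly with xarray. A minimal sketch; the file name is a placeholder.

# Hypothetical alternative: read an existing CF-compliant NetCDF model with
# xarray instead of building it by hand. "my_model.nc" is a placeholder name.
if os.path.exists("my_model.nc"):
    external_model = xr.open_dataset("my_model.nc")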
def gaussian_2d(
m: xr.DataArray,
a: float,
c: typing.Tuple[float, float],
s: typing.Tuple[float, float],
) -> xr.DataArray:
"""
Return a 2-D gaussian.
:param m: The xarray model object. Requires longitude and
depth as coordinates.
:param a: Amplitude of the Gaussian.
:param c: Center of the Gaussian (c_lon, c_dep).
:param s: Sigma of the gaussian (lon, dep).
"""
dep = m.depth
lon = m.longitude
return a * np.exp(
-(
np.abs(lon - c[0]) ** 2 / (2 * s[0] ** 2)
+ (np.abs(dep - c[1]) ** 2 / (2 * s[1] ** 2))
)
)
raw_model["VP"] += gaussian_2d(
raw_model.VP, 20, (ll.mean(), dd.mean()), (2.5, 100e3)
)
raw_model.VP.T.plot(figsize=(10, 5))
"MantleModel"
. This will transform the regularly-gridded xarray
dataset into Salvus' internal representation, ready to be used in a simulation.sm = sn.model.volume.seismology.MantleModel(name="2d_model", data=raw_model)
p.add_to_project(sm)
We now define two SimulationConfigurations: one through a 1-D PREM model, and one with our Gaussian blob interpolated as a relative perturbation. The primary difference between each SimulationConfiguration below is the model we'll use. In the first, we pass None for our volume model. In the second, we pass the model we defined above, or "2d_model". The short loop below then defines two distinct simulations, one with and one without the perturbation, and adds them to our project with the names "1d" and "2d", respectively. For a detailed overview of all the settings below, check out this video tutorial.

for name, model in zip(["1d", "2d"], [None, "2d_model"]):
p.add_to_project(
sn.SimulationConfiguration(
name=name,
tensor_order=2,
max_depth_in_meters=6371e3,
elements_per_wavelength=1.0,
min_period_in_seconds=resolved_period,
model_configuration=sn.ModelConfiguration(
background_model=sn.model.background.one_dimensional.BuiltIn(
name="prem_iso_one_crust"
),
volume_models=model,
),
event_configuration=sn.EventConfiguration(
wavelet=sn.simple_config.stf.GaussianRate(
half_duration_in_seconds=2 * resolved_period
),
waveform_simulation_configuration=sn.WaveformSimulationConfiguration(
end_time_in_seconds=2000.0
),
),
)
)
"2d"
configuration -- look out for the source and receiver positions, as well as the perturbation we defined in our model. And caution -- the in-notebook mesh visualization is appropriate for small to medium size meshes. If you are working with high frequencies, we recommend you visualize the mesh in Paraview instead.if resolved_period > 10:
p.viz.nb.simulation_setup("2d")._repr_html_()
[2024-03-15 09:20:36,344] INFO: Creating mesh. Hang on.
Interpolating model: 2d_model
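If the mesh is too large to inspect comfortably in the notebook, it can be written to disk and opened in ParaView instead. A sketch, assuming p.simulations.get_mesh and the mesh's write_h5 method are available in your Salvus version.

# Sketch (API availability assumed): write the simulation mesh to an HDF5
# file that can be inspected in ParaView rather than in the notebook.
mesh_2d = p.simulations.get_mesh("2d")
mesh_2d.write_h5("mesh_2d.h5")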
"1d"
and "2d"
) sequentially. SalvusFlow will handle all the necessary data transfer too and from the (local or remote) site, and will return when the data is ready for analysis.for name in ["1d", "2d"]:
p.simulations.launch(
events="event_0000",
ranks_per_job=n_ranks,
simulation_configuration=name,
site_name=salvus_flow_site_name,
)
p.simulations.query(block=True)
[2024-03-15 09:20:40,106] INFO: Creating mesh. Hang on.
[2024-03-15 09:20:41,499] INFO: Submitting job ... Uploading 1 files... 🚀 Submitted job_2403150920665386_e48502737a@local
[2024-03-15 09:20:48,827] INFO: Submitting job ... Uploading 1 files... 🚀 Submitted job_2403150920836817_34c3f0f457@local
for sim in ["1d", "2d"]:
p.add_to_project(
sn.processing.seismology.SeismologyProcessingConfiguration(
name=f"{sim}",
data_source_name=sim,
processing_function=get_bandpass_filter_processing_function(
min_frequency_in_hertz=1.0 / (4 * resolved_period),
max_frequency_in_hertz=1.0 / 100.0,
highpass_corners=4,
lowpass_corners=4,
),
)
)
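As a reminder, the corner frequencies above are just the reciprocals of the corner periods we want to keep; a small illustrative conversion (the periods below are assumptions matching the filter above).

# Illustrative only: corner periods (in seconds) and the corresponding corner
# frequencies passed to the bandpass filter above.
t_long, t_short = 4 * resolved_period, 100.0
print("passband:", 1.0 / t_long, "Hz to", 1.0 / t_short, "Hz")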
"PROCESSED_DATA"
prefix to data gathering functions. Raw data can also be accessed by passing the name of the simulation configuration itself. Below we can compare our data in both simulated cases.p.viz.nb.waveforms(
data=["PROCESSED_DATA:1d", "PROCESSED_DATA:2d"],
receiver_field="velocity",
)
event_data_1d = p.waveforms.get("PROCESSED_DATA:1d", "event_0000")
event_data_2d = p.waveforms.get("PROCESSED_DATA:2d", "event_0000")
event_1d = event_data_1d.events[0]
rec_1d, data_1d = event_1d.receivers[0], event_1d.data
stream_1d = data_1d.get_receiver_data(rec_1d, "velocity", ["Z", "E"])
event_2d = event_data_2d.events[0]
rec_2d, data_2d = event_2d.receivers[0], event_2d.data
stream_2d = data_2d.get_receiver_data(rec_2d, "velocity", ["Z", "E"])
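For a direct trace-by-trace look at the effect of the heterogeneity, the two synthetics can also be overlaid manually; a minimal sketch, assuming the returned objects behave like ObsPy streams exposing .times() and .data.

# Overlay the first component of the 1-D and 2-D synthetics at this receiver.
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(10, 4))
ax.plot(stream_1d[0].times(), stream_1d[0].data, label="1d")
ax.plot(stream_2d[0].times(), stream_2d[0].data, label="2d")
ax.set_xlabel("Time [s]")
ax.set_ylabel("Velocity")
ax.legend()
plt.show()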
assert rec_1d.fields == ["velocity"]
assert stream_1d[0].data.max() > 4
assert stream_1d[1].data.max() > 3
assert rec_2d.fields == ["velocity"]
assert stream_2d[0].data.max() > 4
assert stream_2d[1].data.max() > 3
np.testing.assert_equal(
np.any(np.not_equal(stream_1d[0].data, stream_2d[0].data)), True
)
np.testing.assert_equal(
np.any(np.not_equal(stream_1d[1].data, stream_2d[1].data)), True
)
np.testing.assert_equal(
np.any(np.not_equal(stream_1d[0].data, stream_1d[1].data)), True
)
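Beyond asserting that the traces merely differ, a simple relative measure gives a feel for how strongly the Gaussian perturbation affects this receiver; a quick NumPy-only sketch.

# Relative L2 difference between the 1-D and 2-D synthetics, per component.
for component, (tr_1d, tr_2d) in enumerate(zip(stream_1d, stream_2d)):
    rel_diff = np.linalg.norm(tr_1d.data - tr_2d.data) / np.linalg.norm(tr_1d.data)
    print(f"component {component}: relative L2 difference = {rel_diff:.3f}")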