import os
import pathlib
from frequency_band import FrequencyBand
PROJECT_DIR = "project_dir_central_europe"
SIMULATION_TIME_IN_SECONDS = 500.0
RANKS_PER_JOB = 4
SALVUS_FLOW_SITE_NAME = os.environ.get("SITE_NAME", "local")
fband_file = pathlib.Path("./frequency_band_70_120.pkl")
fband = FrequencyBand.load(fband_file)
fband
FrequencyBand(min_frequency_in_hertz=0.008333333333333333, max_frequency_in_hertz=0.014285714285714285)
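These corner frequencies correspond to a 70-120 s period band, matching the pickle's file name. A quick sanity check in plain Python, independent of the FrequencyBand class:

```python
# Period is the reciprocal of frequency, and note the swap: the
# *shortest* period comes from the *highest* frequency, and vice versa.
min_frequency_in_hertz = 1.0 / 120.0  # ~0.00833 Hz
max_frequency_in_hertz = 1.0 / 70.0   # ~0.01429 Hz

min_period_in_seconds = 1.0 / max_frequency_in_hertz  # ~70 s
max_period_in_seconds = 1.0 / min_frequency_in_hertz  # ~120 s

print(round(min_period_in_seconds, 6), round(max_period_in_seconds, 6))
```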
from salvus import namespace as sn
p = sn.Project(path=PROJECT_DIR)
The model file lives in the project's data subdirectory. Before using the model later on for a simulation, we'll first have to add it to our project as below.

p.add_to_project(
sn.model.volume.seismology.GenericModel(
name="s362_ani", data="data/s362ani.nc", parameters=["VSV", "VSH"]
)
)
Note that we've used the GenericModel class here -- in contrast to the possible MantleModel or CrustalModel options. This is, again, due to the relatively limited resources present on the training cluster. While GenericModel will interpolate the provided model without any consideration of radial discontinuities, the other two classes will ensure that interpolation only happens in the specified regions. In the standard use case we can use multiple 3-D models in a simulation, within and across target regions.

Surface topography can also be applied on a UtmDomain, but that is beyond the scope of this tutorial. We can register this topography model with the project as below.

p.add_to_project(
sn.topography.spherical.SurfaceTopography(
name="surface_topography", data="data/topography.nc"
)
)
p.add_to_project(
sn.bathymetry.spherical.OceanLoad(
name="ocean_bathymetry", data="data/bathymetry.nc"
)
)
Now it's time to tie everything together in a SimulationConfiguration. It's a bit verbose, so let's go through the settings one by one! First, let's start with some fundamental ones.

p.add_to_project(
sn.SimulationConfiguration(
name="my_first_simulation",
max_depth_in_meters=1750e3,
elements_per_wavelength=1.25,
min_period_in_seconds=fband.min_period_in_seconds,
model_configuration=sn.ModelConfiguration(
background_model="prem_ani_one_crust", volume_models=["s362_ani"]
),
topography_configuration=sn.TopographyConfiguration(
"surface_topography"
),
bathymetry_configuration=sn.BathymetryConfiguration(
"ocean_bathymetry"
),
event_configuration=sn.EventConfiguration(
wavelet=sn.simple_config.stf.GaussianRate(
half_duration_in_seconds=2 * fband.min_period_in_seconds
),
waveform_simulation_configuration=sn.WaveformSimulationConfiguration(
end_time_in_seconds=SIMULATION_TIME_IN_SECONDS,
attenuation=True,
),
),
absorbing_boundaries=sn.AbsorbingBoundaryParameters(
reference_velocity=3700.0,
number_of_wavelengths=0.0,
reference_frequency=fband.max_frequency_in_hertz,
),
)
)
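The half_duration_in_seconds of 2 * fband.min_period_in_seconds (140 s here) controls the width of the source wavelet. As a rough illustration only -- this is a generic unit-area Gaussian pulse, not Salvus's exact GaussianRate parameterization -- the shape can be sketched as:

```python
import math

# Generic Gaussian pulse. The sigma-from-half-duration mapping below
# (sigma = half_duration / 3.5) is an assumption for illustration only,
# not Salvus's internal convention.
half_duration_in_seconds = 2 * 70.0  # 2 * fband.min_period_in_seconds

def gaussian_pulse(t: float, half_duration: float) -> float:
    sigma = half_duration / 3.5
    return math.exp(-0.5 * (t / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# The pulse peaks at t = 0 and has decayed to well under 1% of the peak
# one half-duration away, so it is effectively band-limited in time.
peak = gaussian_pulse(0.0, half_duration_in_seconds)
tail = gaussian_pulse(half_duration_in_seconds, half_duration_in_seconds)
assert tail < 0.01 * peak
```

A longer half-duration shifts the wavelet's energy toward lower frequencies, which is why it is tied to the band's minimum period here.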
p.viz.nb.simulation_setup("my_first_simulation", events=p.events.list())
[2024-11-15 13:55:36,222] INFO: Creating mesh. Hang on.
Interpolating model: s362_ani.
<salvus.flow.simple_config.simulation.waveform.Waveform object at 0x7183093a8f10>
Where the simulation runs is controlled by the site_name parameter. If the site is remote, i.e. a supercomputing cluster, SalvusFlow will handle all the uploading and downloading of input files for you, as well as monitor the queue for job completion. You'll be informed when everything is done and when the data is available for analysis.

Also note the extra_output_configuration parameter. This allows us to easily request specialized outputs for one-off simulations; in this case we'll use it to output snapshots of the wavefield propagating along the mesh's surface. We'll use this in the final cells to open and visualize the data in Paraview.

p.simulations.launch(
events="event_CRETE_GREECE_Mag_6.15_2015-04-16-18-07",
simulation_configuration="my_first_simulation",
site_name=SALVUS_FLOW_SITE_NAME,
ranks_per_job=RANKS_PER_JOB,
extra_output_configuration={
"surface_data": {
"side_sets": ["r1"],
"sampling_interval_in_time_steps": 100,
"fields": ["velocity"],
}
},
)
[2024-11-15 13:55:39,958] INFO: Submitting job ... Uploading 1 files... 🚀 Submitted job_2411151355152677_5b77b11df1@local
1
We can check on the status of our jobs with the query command as below. Who will win? The block=True parameter informs Salvus that we don't want to continue until our simulations are finished, as opposed to a more asynchronous mode of operation.

p.simulations.query(block=True, ping_interval_in_seconds=5)
True
Whenever we call launch and/or query, we're invoking a call through SalvusProject to SalvusFlow. Before any computations happen, SalvusProject checks whether or not the relevant input files exist for a given simulation configuration, and creates them if they do not. The input files are then shipped over to the compute server (which may be local), and the job is run with SalvusCompute. Finally, everything is copied back to the host machine. All of this is done lazily, and if SalvusProject detects that this exact same configuration has been run before, it will simply return the already-computed data rather than launching another simulation. We'll see the benefits of this in the next few notebooks.

print(
"Rsync (or scp) me!",
p.simulations.get_simulation_output_directory(
"my_first_simulation",
event="event_CRETE_GREECE_Mag_6.15_2015-04-16-18-07",
).absolute(),
)
Rsync (or scp) me! /tmp/tmpd_8tn8y4/project_dir_central_europe/EVENTS/event_CRETE_GREECE_Mag_6.15_2015-04-16-18-07/WAVEFORM_DATA/INTERNAL/34/6a/803cc38f496a
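The lazy, run-once behavior described above can be pictured as content-addressed caching: identical configurations map to the same key, so a repeated launch returns the stored result instead of recomputing. The sketch below is an illustration of the idea only, not Salvus internals:

```python
import hashlib
import json

# Illustration only: deduplicate runs by hashing a canonical form of the
# configuration. SalvusProject's actual bookkeeping is more involved.
def config_key(config: dict) -> str:
    canonical = json.dumps(config, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

cache: dict = {}

def run_or_reuse(config: dict) -> str:
    key = config_key(config)
    if key not in cache:
        cache[key] = f"output-for-{key[:8]}"  # stand-in for running the job
    return cache[key]

cfg = {"name": "my_first_simulation", "elements_per_wavelength": 1.25}
first = run_or_reuse(cfg)
second = run_or_reuse(dict(cfg))  # identical config: no new "run"
assert first == second and len(cache) == 1
```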