NHLIB Integration: Implement Disaggregation Calculator

Bug #1021642 reported by Damiano Monelli
Affects: OpenQuake (deprecated)
Status: Fix Released
Importance: High
Assigned to: Lars Butler
Milestone: 0.9.0

Bug Description

Now that the disaggregation calculator has been implemented in NHLIB (see bug: https://bugs.launchpad.net/openquake/+bug/1007379), we can implement the corresponding calculator in OpenQuake.

The configuration file for a `disaggregation' calculation should look something like the following:

[general]

calculation_mode = disaggregation
random_seed = 23

[geometry]

region = 6.5 45.8, 6.5 46.5, 8.5 46.5, 8.5 45.8
region_grid_spacing = 20.0

or
sites = ......

[logic_tree]

number_of_logic_tree_samples = 2

[erf]

rupture_mesh_spacing = 1
width_of_mfd_bin = 0.3
area_source_discretization = 10

[site_params]

reference_vs30_type = measured
reference_vs30_value = 760.0
reference_depth_to_2pt5km_per_sec = 5.0
reference_depth_to_1pt0km_per_sec = 100.0

or

site_model = ...

[calculation]

source_model_logic_tree_file = source_model_logic_tree.xml
gsim_logic_tree_file = gmpe_logic_tree.xml
investigation_time = 50.0
intensity_measure_types_and_levels = {"PGA": [0.005, 0.007, 0.0098, 0.0137, 0.0192, 0.0269, 0.0376, 0.0527, 0.0738, 0.103, 0.145, 0.203, 0.284, 0.397, 0.556], "SA(0.025)": [0.005, 0.007, 0.0098, 0.0137, 0.0192, 0.0269, 0.0376, 0.0527, 0.0738, 0.103, 0.145, 0.203, 0.284, 0.397, 0.556, 0.778, 1.09, 1.52, 2.13]}
truncation_level = 3
maximum_distance = 200.0
poes = 0.02 0.1 # these are the probabilities of exceedance for which disaggregation histograms are computed
# NOTE: Per discussion between Lars, Anton, and Marco P., the following parameters...
mag_bin_width = 0.3
distance_bin_width = 10.0
coordinate_bin_width = 0.02 # in decimal degrees
num_epsilon_bins = 4
# ... have replaced these parameters:
min_mag, max_mag, num_mags = 5.0, 6.5, 6
min_dist, max_dist, num_dists = 0.0, 100.0, 11
min_lat, max_lat, num_lats = -0.1, 0.1, 11
min_lon, max_lon, num_lons = -0.1, 0.1, 11
min_epsilon, max_epsilon, num_epsilon = -3.0, 3.0, 4
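
As an illustration of how the new bin-width parameters relate to actual disaggregation bins, here is a minimal Python sketch, assuming the minimum and maximum values are inferred from the source model and sites at run time (make_bin_edges and the example magnitude range are hypothetical, not part of the OpenQuake API):

import numpy

def make_bin_edges(min_value, max_value, bin_width):
    # Round the limits outward so the edges fully cover [min_value, max_value].
    lower = numpy.floor(min_value / bin_width) * bin_width
    upper = numpy.ceil(max_value / bin_width) * bin_width
    n_bins = int(round((upper - lower) / bin_width))
    return numpy.linspace(lower, upper, n_bins + 1)

# e.g. magnitude bin edges for a source model spanning roughly M 5.0-6.5,
# with mag_bin_width = 0.3 as in the configuration above
mag_bin_edges = make_bin_edges(5.0, 6.5, 0.3)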

----------
Calculator Workflow
----------

The workflow for the disaggregation calculator is as follows:

1) Given the source model and GSIM logic trees, n (= number_of_logic_tree_samples) logic tree paths are sampled, and the corresponding source models and GSIM dictionaries (gsims = {'tectonic region type 1': gsim1, 'tectonic region type 2': gsim2, ...}) are computed.

2) For each pair (source_model, gsims), the following calculations are done:

      2a) compute hazard curves for all the sites of interest

      2b) extract from all hazard curves the ground motion values corresponding to the `poes' defined in the configuration file (see the interpolation sketch after this list)

      2c) compute disaggregation histograms from those ground motion values, for all sites of interest

3) For each logic tree sample, save disaggregation histograms, for all sites, for all intensity measure types, for all poes.

4) Provide an option to serialize the disaggregation histograms (and associated data, e.g. intensity measure type, period, probability of exceedance, time span, ground motion value, bin coordinates) to an XML file.
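
Step 2b amounts to inverting each hazard curve at the requested probabilities of exceedance. A minimal sketch of that interpolation, assuming a hazard curve is given as an array of increasing IMLs with the corresponding (decreasing) PoEs; gmv_for_poe is a hypothetical helper, not the actual OpenQuake function:

import numpy

def gmv_for_poe(imls, curve_poes, target_poe):
    # numpy.interp needs increasing x values, so reverse the (decreasing) curve;
    # interpolating in log-IML space is a common convention for hazard curves.
    log_imls = numpy.log(numpy.asarray(imls)[::-1])
    rev_poes = numpy.asarray(curve_poes)[::-1]
    return numpy.exp(numpy.interp(target_poe, rev_poes, log_imls))

# For each logic tree sample, site, IMT and PoE in `poes`, the resulting
# ground motion value is the one the disaggregation histogram is computed for.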

The calculator should report the status of the calculations to the user (calculations done vs. calculations requested).

Changed in openquake:
assignee: nobody → Lars Butler (lars-butler)
status: New → Confirmed
importance: Undecided → Medium
importance: Medium → High
Lars Butler (lars-butler) wrote:

Comments from discussions with Dr. Monelli and Dr. Danciu:

Results:
- one XML result file per site per imt; there is no reason to group by sites
  - that means one uiapi.output record per set of matrices for a given site/imt
- it would be nice to have a `sites_file` parameter defining a CSV file containing all of the site definitions; having 20 or 30 sites defined on one line is error prone, tedious, and hard to read (see the sketch after this list)
- revised result XML prototype: http://pastebin.com/QdxtMTwF
  - note that the binary/XML hybrid has been dropped
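
A minimal sketch of reading such a sites file, assuming a plain CSV with one "lon,lat" pair per line (the parameter name and layout are proposals at this stage, not an agreed format):

import csv

def read_sites_csv(path):
    # Return a list of (lon, lat) tuples, skipping blank lines.
    sites = []
    with open(path) as fh:
        for row in csv.reader(fh):
            if not row:
                continue
            sites.append((float(row[0]), float(row[1])))
    return sites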

Calculation:
- Phase 1: Compute hazard curves, using the Classical approach, parallelizing over sources
- Phase 2: When all curves are computed, compute the disaggregation matrices, parallelizing over sites; we cannot parallelize over sources in this case (see the sketch below)
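
Schematically, the two phases could be organized as follows; this is only an outline using a generic process pool, not the actual OpenQuake task distribution, and the work-unit functions are placeholders:

from concurrent.futures import ProcessPoolExecutor

def curves_for_source(source):
    ...  # Phase 1 work unit: hazard-curve contributions of a single source

def disaggregate_site(site):
    ...  # Phase 2 work unit: disaggregation matrices for a single site

def run(sources, sites):
    with ProcessPoolExecutor() as pool:
        # Phase 1: hazard curves, parallelized over sources
        curve_parts = list(pool.map(curves_for_source, sources))
        # ... aggregate curve_parts into per-site hazard curves ...
        # Phase 2: disaggregation, parallelized over sites; each task needs
        # the full source model, so it cannot be split by source
        return list(pool.map(disaggregate_site, sites))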

Changed in openquake:
status: Confirmed → In Progress
Changed in openquake:
milestone: none → 0.9.0
Changed in openquake:
status: In Progress → Fix Committed
Changed in openquake:
status: Fix Committed → Fix Released