Check if a hazard calculation is too big before running it
Bug #1358621 reported by Michele Simionato
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
OpenQuake Engine | Fix Released | High | Michele Simionato | 1.0.1
Bug Description
This is important for OATS-like services. There should be no limits for calculations on our cluster.
The idea is to introduce two parameters in openquake.cfg:
# maximum weight of the sources; 0 means no limit
# for a laptop, a good number is 200,000
max_input_weight = 0
# maximum size of the output in some units; 0 means no limit
# for a laptop, a good number is 2,000,000
max_output_weight = 0
If the limits are set, a user trying to run a computation that exceeds them (because the input is too large or because the expected output is too large) will get an error right after the pre_execute phase, before the real computation starts.
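A minimal sketch of the kind of check this implies is below. The exception names and the function signature are illustrative assumptions, not the actual oq-engine API, and the two limits are assumed to have already been read from openquake.cfg:

```python
# Illustrative sketch only; names and signatures are assumptions,
# not the actual oq-engine implementation.

class InputWeightLimit(Exception):
    """Raised when the source model is heavier than max_input_weight."""

class OutputWeightLimit(Exception):
    """Raised when the expected output is larger than max_output_weight."""

def check_limits(input_weight, output_weight,
                 max_input_weight=0, max_output_weight=0):
    """Abort right after pre_execute if a nonzero limit is exceeded."""
    if max_input_weight and input_weight > max_input_weight:
        raise InputWeightLimit(
            'Input weight %s exceeds max_input_weight=%s'
            % (input_weight, max_input_weight))
    if max_output_weight and output_weight > max_output_weight:
        raise OutputWeightLimit(
            'Output weight %s exceeds max_output_weight=%s'
            % (output_weight, max_output_weight))
```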
Changed in oq-engine:
importance: Undecided → High
status: New → In Progress
assignee: nobody → Michele Simionato (michele-simionato)
milestone: none → 1.0.1

Changed in oq-engine:
status: In Progress → Fix Committed

Changed in oq-engine:
status: Fix Committed → Fix Released
With the proposed implementation the input weight is given by the number of ruptures generated by the sources; for point sources, however, a corrective factor given by the parameter `point_source_weight` (currently 1/40) is applied.
The output weight is a pure number proportional to the size of the expected output of the calculator. For the classical and disaggregation calculators it is given by n_sites * n_realizations * n_levels; for the event based calculator it is given by n_sites * n_realizations.
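As a rough sketch of these formulas (the source objects, their attribute names, and the calculator labels are hypothetical stand-ins, not the engine's actual classes):

```python
# Sketch of the weight formulas described above; the source objects and
# calculator labels are simplified stand-ins, not oq-engine internals.
POINT_SOURCE_WEIGHT = 1 / 40.  # corrective factor for point sources

def input_weight(sources):
    """Sum of the rupture counts, with point sources scaled down."""
    total = 0
    for src in sources:
        weight = src.num_ruptures
        if src.kind == 'point':  # hypothetical attribute
            weight *= POINT_SOURCE_WEIGHT
        total += weight
    return total

def output_weight(n_sites, n_realizations, n_levels, calculator):
    """Pure number proportional to the expected output size."""
    if calculator in ('classical', 'disaggregation'):
        return n_sites * n_realizations * n_levels
    elif calculator == 'event_based':
        return n_sites * n_realizations
    raise ValueError('Unknown calculator: %s' % calculator)
```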