We need to be able to assess the size of an OpenQuake job prior to running it
Bug #797604 reported by Muharem Hrnjadovic
This bug affects 1 person
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
OpenQuake (deprecated) | Won't Fix | Medium | Unassigned |
Bug Description
This is needed to enforce quotas and/or to decide whether virtual machines should be spun up in the cloud to perform the calculation.

We could have some sort of quick analysis phase that yields assessment data points (e.g. the number of grid points, the number of assets) which are then fed into an assessment algorithm.
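The quick-analysis idea above could be sketched roughly as follows. This is purely illustrative: the names (`JobAssessment`, `needs_cloud_workers`), the size metric, and the threshold are all assumptions for the sake of the sketch, not part of the OpenQuake code base.

```python
# Hypothetical sketch of the proposed quick-analysis phase: collect cheap
# data points about a job and combine them into a rough size score that a
# quota check or scaling decision could consume. All names and numbers here
# are illustrative assumptions, not actual OpenQuake APIs.
from dataclasses import dataclass


@dataclass
class JobAssessment:
    """Data points gathered during a cheap pre-run analysis pass."""
    grid_points: int
    assets: int

    def score(self) -> int:
        # Naive size metric: assume work grows with grid points times assets.
        return self.grid_points * self.assets


def needs_cloud_workers(assessment: JobAssessment,
                        threshold: int = 10_000_000) -> bool:
    """Decide whether extra VMs should be spun up for this job."""
    return assessment.score() > threshold


small = JobAssessment(grid_points=1_000, assets=500)
print(needs_cloud_workers(small))  # prints False: small job runs locally
```

In practice the assessment algorithm would need many more inputs (hazard/risk calculation type, number of logic-tree samples, etc.), which is exactly why the comment below argues for gathering historical job data first.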
Changed in openquake:
status: New → Confirmed
importance: Undecided → Medium
tags: added: performance quotas

Changed in openquake:
milestone: none → 0.4.3

Changed in openquake:
status: Confirmed → In Progress
status: In Progress → Confirmed

Changed in openquake:
assignee: nobody → Lars Butler (lars-butler)

Changed in openquake:
status: Confirmed → In Progress

Changed in openquake:
status: In Progress → Confirmed

Changed in openquake:
assignee: Lars Butler (lars-butler) → nobody

Changed in openquake:
milestone: 0.4.3 → 0.4.4

Changed in openquake:
milestone: 0.4.4 → 0.4.6

Changed in openquake:
milestone: 0.4.6 → 0.5.0

Changed in openquake:
status: Confirmed → Won't Fix
At present, estimating job run time upfront (given all of the various parameters) is rather difficult. I recommend that we delay work on this feature until we have more historical job information from which to build better estimates.