Releases: gem/oq-engine
OpenQuake Engine 2.8.0
[Michele Simionato (@micheles)]
- `iml_disagg` and `poes_disagg` cannot coexist in the job.ini file
- Added a check on `conditional_loss_poes` in the event_based_risk calculator:
now it requires `asset_loss_table` to be set
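For context, the two mutually exclusive options mentioned above live in the job.ini; a minimal illustrative fragment (the values shown are made up, not from this release) might look like:

```ini
[disaggregation]
# give either an intensity level per IMT...
iml_disagg = {'PGA': 0.1}
# ...or the probabilities of exceedance, never both:
# poes_disagg = 0.1 0.02
```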
[Anirudh Rao (@raoanirudh)]
- Sorted taxonomies before comparison in the BCR calculator
[Michele Simionato (@micheles)]
- Optimized the disaggregation calculation by performing the PMF extraction
only once at the end of the calculation and not in the workers
- Added an `oq zip` command
- Avoided running a useless classical calculation if `iml_disagg` is given
[Valerio Poggi (@klunk386)]
- Implemented subclasses for ZhaoEtAl2006Asc and AtkinsonBoore2006 to account
for the distance filter used by SGS in their PSHA model for Saudi Arabia.
The distance threshold is hard-coded to 5 km in this implementation.
[Michele Simionato (@micheles)]
- Added a warning if the aggregated probability of exceedance (poe_agg) in
a disaggregation output is very dissimilar from poes_disagg
- Removed the flag `split_sources`
- Optimized the operation `arrange_data_in_bins` in the disaggregation
calculator and reduced the data transfer by the number of tectonic
region types
- Improved the sending of the sources to the workers, especially for the
MultiPointSources
- Better error message if the user sets a wrong site_id in the sites.csv file
- Now the distance and lon/lat bins for disaggregation are built directly
from the integration distance
- Used uniform bins for disaggregation (before they were potentially
different across realizations / source models)
- Improved the error message if the user forgets both sites and sites.csv
in a calculation starting from predetermined GMFs
- Improved the error message if the user specifies a non-existing file in
the job.ini
- Changed the ordering of the TRT bins in disaggregation: now they are
ordered lexicographically
- Added more validity checks on the job.ini file for disaggregation
- Changed the .hdf5 format generated by `oq extract -1 hazard/rlzs`; you can
still produce the old format by using `oq extract -1 qgis-hazard/rlzs`
- Optimized the disaggregation calculator by instantiating
`scipy.stats.truncnorm` only once per task and not once per rupture
- Optimized the disaggregation calculator when `iml_disagg` is specified,
by caching duplicated results
- Made sure that `oq dbserver stop` also stops the zmq workers if the zmq
distribution is enabled
- Added a check when disaggregating for a PoE too big for the source model
- If `iml_disagg` is given, forbid `intensity_measure_types_and_levels`
- Fixed the disaggregation outputs when `iml_disagg` is given: the PoE has
been removed from the name of the output and the correct PoE is in the XML file
- Fixed `oq export loss_curves/rlzs` for event_based_risk/case_master
- Removed the obsolete parameter `loss_curve_resolution`
- Fixed a Python 3 unicode error with `oq engine --run job.zip`
- Added a command `oq abort <calc_id>`
- Stored the avg_losses in classical risk in the same way as in
event_based_risk and made them exportable with the same format
- Removed the outputs losses_by_tag from the event based risk calculators
and changed the default for the avg_losses flag to True
- WebUI: now every job runs in its own process and has name oq-job-
- Refactored the WebUI tests to use the DbServer and django.test.Client
- Added an experimental feature `optimize_same_id_sources`
- Fixed a bug in gmf_ebrisk when there are zero losses and added more
validity checks on the CSV file
- The parameter `number_of_ground_motion_fields` is back to being optional in
scenario calculators reading the GMFs from a file, since it can be inferred
- Removed the deprecated risk outputs dmg_by_tag, dmg_total,
losses_by_tag, losses_total
- Deprecated the .geojson exporters for hazard curves and maps
- We now keep the realization weights in case of logic tree sampling (before
they were rescaled to 1 / R and considered all equal)
- Optimized sampling for extra-large logic trees
- Added a check on missing `source_model_logic_tree`
- Fix to the gmf_ebrisk calculator: the realization index in the event loss
table was incorrect and too many rows were saved
- Added a way to restrict the source model logic tree by setting a sm_lt_path
variable in the job.ini (experimental)
- Fixed the precedence order when reading openquake.cfg
OpenQuake Engine 2.7.0
[Michele Simionato (@micheles)]
- Fixed the risk exporters for tags containing non-ASCII characters
[Valerio Poggi (@klunk386)]
- Implemented the Pankow and Pechmann 2004 GMPE (not verified)
[Graeme Weatherill (@g-weatherill)]
- Added Derras et al. (2014) GMPE
- Implemented the Zhao et al. (2016) GMPE for active shallow crustal,
subduction interface and subduction slab events
- Adds 'rvolc' (volcanic path distance) to the distance context
[Michele Simionato (@micheles)]
- The outputs loss_curves-rlzs and loss_curves-stats are now visible again
as engine outputs (before they were hidden)
- Added a debug command `oq plot_assets` and fixed `oq plot_sites`
- Added a flag `--multipoint` to the command `oq upgrade_nrml`
- Deprecated several outputs: hcurves-rlzs, agg_loss_table, losses_total,
dmg_by_tag, dmg_total, losses_by_tag, losses_by_tag-rlzs
- Extended the command `oq extract job_id` to check the database
- Optimized the scenario damage calculator by vectorizing the calculation
of the damage states
- Extended the FragilityFunctions to accept sequences/arrays of intensity
levels, as requested by Hyeuk Ryu
[Daniele Viganò (@daniviga)]
- Added support for groups in the WebUI/API server
[Michele Simionato (@micheles)]
- Added an experimental distribution mode of kind "zmq"
- Implemented an API `/extract/agglosses/loss_type?tagname1=tagvalue1&...`
with the ability to select all tagvalues (`*`) for a given tagname
- Deprecated reading hazard curves from CSV, since it was an experimental
feature and nobody is using it
- Changed the exporter of the event loss table to export all realizations
into a single file
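As a sketch of how such an aggregation query string could be assembled client-side (the host, port, calculation id and the `taxonomy` tag name below are illustrative assumptions, not part of the release notes):

```python
from urllib.parse import urlencode

# hypothetical engine server URL and calculation id, for illustration only
base = "http://localhost:8800/v1/calc/42/extract/agglosses/structural"
# '*' selects all the values of the (hypothetical) 'taxonomy' tag
query = urlencode({"taxonomy": "*"})
url = base + "?" + query
print(url)
```

Note that `urlencode` percent-escapes the `*` wildcard, which is fine since it is decoded server-side.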
[Graeme Weatherill (@g-weatherill)]
- Adds the Bindi et al. (2017) GMPEs for Joyner-Boore and Hypocentral
Distance
[Michele Simionato (@micheles)]
- Made it mandatory to specify the sites for all calculators reading the
GMFs from a CSV file
- Tested also the case of calculators requiring a shared_dir
- Improved the error checking when parsing vulnerability functions with PMF
[Daniele Viganò (@daniviga)]
- Fixed a bug in the `oq reset` command, which was not stopping
the DbServer properly
[Michele Simionato (@micheles)]
- Implemented an API `/extract/aggcurves/loss_type?tagname1=tagvalue1&...`
- Implemented an API `/extract/aggdamages/loss_type?tagname1=tagvalue1&...`
- Every time a new calculation starts, we check if there is a newer version
of the engine available on GitHub
- Changed the search logic for the configuration file `openquake.cfg`
- Implemented an API `/extract/agglosses/loss_type?tagname1=tagvalue1&...`
- Fixed several bugs in the gmf_ebrisk calculator, in particular in presence
of asset correlation and missing values on some sites
- Fixed a bug with logging configured at WARN level instead of INFO level
if rtree was missing (affecting only `oq run`)
- Changed the ScenarioDamage demo to use two GSIMs
- Added a node `<tagNames>` in the exposure
- Added a web API to extract the attributes of a datastore object
- Fixed `oq to_shapefile` and `oq from_shapefile` to work with NRML 0.5
(except MultiPointSources)
- Added information about the loss units to the `agg_curve` outputs
- `oq extract hazard/rlzs` now extracts one realization at a time
- The rupture filtering is now applied during disaggregation
- Changed the /extract web API to return a compressed .npz file
- Fixed a bug with multi-realization disaggregation, celery and no
shared_dir: now the calculation does not hang anymore
OpenQuake Engine 2.6.0
[Michele Simionato (@micheles)]
- Fixed the GMF .npz export when the GMFs are extracted from a file
- Stored the number of nonzero losses per asset and realization in
event_based_risk calculations with asset_loss_table=True
[Daniele Viganò (@daniviga)]
- Fixed 'openquake' user creation in RPM when SELinux is in enforcing mode
- Changed the behaviour during RPM upgrades:
the old openquake.cfg configuration file is left untouched and the new one
installed as openquake.cfg.rpmnew
[Michele Simionato (@micheles)]
- Added a check on `number_of_ground_motion_fields` when the GMFs are
extracted from a NRML file
- Added a command `oq extract` able to extract hazard outputs into HDF5 files
- Fixed a bug when reading GMFs from a NRML file: the hazard sites were
read from the exposure (incorrectly) and not from the GMFs
- Fixed a bug in MultiMFDs of kind `arbitraryMFD`
[Valerio Poggi (@klunk386)]
- Implemented the Atkinson (2010) GMPE as subclass `Atkinson2010Hawaii`
of `BooreAtkinson2008`
[Michele Simionato (@micheles)]
- Used the new loss curves algorithm for the asset loss curves and loss maps
- Added a generic `extract` functionality to the web API
- Fixed a bug when computing the rjb distances with multidimensional meshes
- Changed the GMF CSV exporter to export the sites too; unified it with the
event based one
[Daniele Viganò (@daniviga)]
- Changed the 'CTRL-C' behaviour to make sure that all children
processes are killed when a calculation is interrupted
[Michele Simionato (@micheles)]
- Fixed a bug in the statistical loss curves exporter for classical_risk
- Replaced the agg_curve outputs with losses by return period outputs
- Turned the DbServer into a multi-threaded server
- Used zmq in the DbServer
- Fixed correct_complex_sources.py
- Fixed `oq export hcurves-rlzs -e hdf5`
- Changed the source weighting algorithm: now it is proportional to
the number of affected sites
- Added a command `oq show dupl_sources` and enhanced `oq info job.ini`
to display information about the duplicated sources
- Added a flag `split_sources` in the job.ini (default False)
- Updated the demos to the format NRML 0.5
[Valerio Poggi (@klunk386)]
- Implemented the Munson and Thurber 1997 (Volcanic) GMPE
[Graeme Weatherill (@g-weatherill)]
- Adapts CoeffsTable to be instantiated with dictionaries as well as strings
[Daniele Viganò (@daniviga)]
- Extended the 'oq reset' command to work on multi user installations
[Michele Simionato (@micheles)]
- Fixed a bug: if there are multiple realizations and no hazard stats,
it is an error to set hazard_maps=true or uniform_hazard_spectra=true
- Implemented aggregation by asset tag in the risk calculators
- Fixed a small bug in the HMTK (in `get_depth_pmf`)
- Extended the demo LogicTreeCase1ClassicalPSHA to two IMTs and points
- Added a documentation page oq-commands.md
- Removed the automatic gunzip functionality and added an automatic
checksum functionality plus an `oq checksum` command
- Made the demo LogicTreeCase2ClassicalPSHA faster
- Fixed the export by realization of the hazard outputs
- Changed the generation of loss_maps in event based risk, without the option
`--hc`: now it is done in parallel, except when reading the loss ratios
- Renamed `--version-db` to `--db-version`, to avoid
confusion between `oq --version` and `oq engine -version`
- Fixed a bug in the exported outputs: a calculation cannot export the results
of its parent
- Extended the `sz` field in the rupture surface to 2 bytes, making it
possible to use a smaller mesh spacing
- Changed the ordering of the fields in the loss curves and loss maps
generated by the event based risk calculator; now the insured fields
are at the end, whereas before they were intermixed with each loss type
- Changed the format of the array `all_loss_ratios/indices`
- The size in bytes of the GMFs was saved incorrectly
- Added an exporter gmf_scenario/rup-XXX working also for event based
- First version of the calculator gmf_ebrisk
- Implemented risk statistics for the classical_damage calculator
- Added a .csv importer for the ground motion fields
- Implemented risk statistics for the classical_bcr calculator
[Armando Scarpati (@hascar)]
- Show to the user the error message when deleting a calculation
in the WebUI fails
[Michele Simionato (@micheles)]
- Better error message when running a risk file in absence of hazard
calculation
- Changed the sampling logic in event based calculators
- Imported GMFs from an external file into the datastore
[Daniele Viganò (@daniviga)]
- Added the 'celery-status' script in 'utils' to check the
task distribution in a multi-node celery setup
[Michele Simionato (@micheles)]
- Removed an excessive check from the WebUI: now if an output exists,
it can be downloaded even if the calculation was not successful
[Armando Scarpati (@hascar)]
- Visualized the calculation_mode in the WebUI
[Michele Simionato (@micheles)]
- Made the upgrade_manager transactional again
- Changed the storage of the GMFs; as a consequence the exported .csv
has a different format
[Daniele Viganò (@daniviga)]
- Fixed a bug introduced by a change in Django 1.10 that was causing
the HTTP requests log to be caught by our logging system and
then saved in the DbServer
- Updated requirements to allow installation of Django 1.11 (LTS)
[Michele Simionato (@micheles)]
- Added two commands `oq dump` and `oq restore`
- Added a check on the number of intensity measure types when
generating uniform hazard spectra (must be > 1)
OpenQuake Engine 2.5.0
[Armando Scarpati (@hascar)]
- Added a confirmation dialog when trying to remove a calculation via the
WebUI
[Michele Simionato (@micheles)]
- Hazard maps were not exposed to the engine in event based calculations
- Fixed the check on the DbServer instance: it was failing in presence
of symbolic links
- Optimized MultiMFD objects for the case of homogeneous parameters
- Added an .npz exporter for the scenario_damage output `dmg_by_asset`
- Removed the pickled CompositeSourceModel from the datastore
- Improved the error message when the rupture mesh spacing is too small
- Unified the versions of baselib, hazardlib and engine
- Raised a clear error if the user does not set the `calculation_mode`
- Made it possible to pass the hdf5 full path to the DataStore class
- Made it possible to use CELERY_RESULT_BACKEND != 'rpc://'
[Michele Simionato (@micheles), Daniele Viganò (@daniviga)]
- Merged the `oq-hazardlib` repository into `oq-engine`.
The `python-oq-hazardlib` package is now provided by `python-oq-engine`
[Michele Simionato (@micheles)]
- Added CSV exports for the agg_curve outputs
- Fixed a bug in `oq export hcurves-rlzs --exports hdf5`
- Restored the parameter sites_per_tile with a default of 20,000, i.e.
tiling starts automatically if there are more than 20,000 sites
- Better error message for invalid exposures
- Removed the deprecated XML outputs of the risk calculators
- Added an endpoint `v1/calc/XXX/oqparam` to extract the calculation
parameters as a JSON dictionary
- Fixed the excessive logic tree reduction in event based calculators
- Improved the command `oq db`
- Fixed an encoding bug when logging a filename with a non-ASCII character
- Fixed a bug when exporting a GMF with `ruptureId=0`
- Added a parameter `disagg_outputs` to specify the kind of disaggregation
outputs to export
- Raised an early error if the consequence model is missing some taxonomies
- Restored the tiling functionality in the classical calculator; to enable
it, set `num_tiles` in the job.ini file
- If there are no statistical hazard curves to compute, do not transfer
anything
- Fixed a small bug in `oq plot` and added a test
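The `disagg_outputs` parameter mentioned above is set in the job.ini; a minimal illustrative fragment (the output kinds shown are assumptions for the example) might look like:

```ini
[disaggregation]
# export only the magnitude-distance and TRT disaggregation matrices
disagg_outputs = Mag_Dist TRT
```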
[Daniele Viganò (@daniviga)]
- Added `collectstatic` and `createsuperuser` subcommands to the
`oq webui` command
- Added a `local_settings.py.pam` template to use PAM as the authentication
provider for API and WebUI
- Now the command `oq webui start` tries to open a browser tab
with the WebUI loaded
OpenQuake Engine 2.4.0
[Michele Simionato (@micheles)]
- Now the command `oq export loss_curves/rlz-XXX` works both for the
classical_risk calculator and the event_based_risk calculator
[Daniele Viganò (@daniviga)]
- Remove the default 30-day-old view limit in the WebUI calculation list
[Michele Simionato (@micheles)]
- Fixed a broken import affecting the command `oq upgrade_nrml`
- Made it possible to specify multiple file names
in the source_model_logic_tree file
- Reduced the data transfer in the object `RlzsAssoc` and improved the
postprocessing of hazard curves when the option `--hc` is given
- Changed the `ruptures.xml` exporter to export unique ruptures
- Fixed a bug when downloading the outputs from the WebUI on Windows
- Made `oq info --report` fast again by removing the rupture fine filtering
- Improved the readability of the CSV export `dmg_total`
- Removed the column `eid` from the CSV export `ruptures`; also
renamed the field `serial` to `rup_id` and reordered the fields
- Changed the event loss table exporter: now it exports an additional
column with the `rup_id`
- Changed the scenario npz export to export also the GMFs outside the maximum
distance
- Fixed the scenario npz export when there is a single event
- Replaced the event tags with numeric event IDs
- The mean hazard curves are now generated by default
- Improved the help message of the command `oq purge`
- Added a `@reader` decorator to mark tasks reading directly from the
file system
- Removed the .txt exporter for the GMFs, used internally in the tests
- Fixed a bug with relative costs which affected master for a long time,
but not the release 2.3. The insured losses were wrong in that case.
- Added an .hdf5 exporter for the asset loss table
- Loss maps and aggregate losses are computed in parallel or sequentially
depending on whether the calculation is a postprocessing calculation or not
- Deprecated the XML risk exporters
- Removed the .ext5 file
- Restored the parameter `asset_loss_table` in the event based calculators
- Added a full .hdf5 exporter for `hcurves-rlzs`
- Removed the `individual_curves` flag: now by default only the statistical
hazard outputs are exported
- Saved a lot of memory in the computation of the hazard curves and stats
- Renamed the parameter `all_losses` to `asset_loss_table`
- Added an experimental version of the event based risk calculator which
is able to use GMFs imported from an external file
- Added a `max_curve` functionality to compute the upper limit of the
hazard curves amongst realizations
- Raised an error if the user specifies `quantile_loss_curves`
or `conditional_loss_poes` in a classical_damage calculation
- Added a CSV exporter for the benefit-cost-ratio calculator
- The classical_risk calculator now reads directly the probability maps,
not the hazard curves
- Turned the loss curves into on-demand outputs
for the event based risk calculator
- The loss ratios are now stored in the datastore and not in an
external .ext5 file
- The engine outputs are now streamed by the WebUI
- Used a temporary export directory in the tests, to avoid conflicts
in multiuser situations
- Added an .npz exporter for the loss maps
- Raised an error early when using a complex logic tree in scenario
calculations
- Changed the CSV exporter for the loss curves: now it exports all the
curves for a given site for the classical_risk calculator
- Fixed the save_ruptures procedure when there are more than 256
surfaces in the MultiSurface
- Renamed the `csq_` outputs of the scenario_damage to `losses_`
- Changed the way scenario_damage results are stored internally to be more
consistent with the other calculators
- Removed the GSIM from the exported file name of the risk outputs
- New CSV exporter for GMFs generated by the event based calculator
- The event IDs are now unique and a constraint on the maximum
number of source groups (65,536) has been added
- Added an output `losses_by_event` to the scenario_risk calculator
- Changed the output `ruptures.csv` to avoid duplications
- Added an output `losses_by_taxon` to the scenario_risk calculator
- Fixed a performance bug in `get_gmfs`: now the scenario risk and damage
calculators are orders of magnitude faster for big arrays
- Added an export test for the event loss table in the case of multiple TRTs
- Removed the experimental `rup_data` output
- Added an .npz export for the output `losses_by_asset`
- Exported the scenario_risk aggregate losses in a nicer format
[Daniele Viganò (@daniviga)]
- The 'oq webui' command now works on a multi-user installation
- Split the RPM packages into python-oq-engine (single node) and
python-oq-engine-master/python-oq-engine-worker (multi-node)
[Paolo Tormene (@ptormene)]
- The 'Continue' button in the Web UI is now available also for risk
calculations
[Michele Simionato (@micheles)]
- Fixed a Python 3 bug in the WebUI when continuing a calculation: the
hazard_calculation_id was passed as a string and not as an integer
- Changed the rupture storage to use variable-length arrays, with a speedup
of two orders of magnitude
- Avoided storing the rupture events twice
- Optimized the serialization of ruptures on HDF5 by using a `sids` output
- Changed the Web UI button from "Run Risk" to "Continue"
- The `avg` field in the loss curves is computed as the integral of the curve
again, and it is not extracted from the avg_losses output anymore
- Made the `fullreport` exportable
- Fixed the `rup_data` export, since the boundary field was broken
- Restored the output `losses_by_taxon` in the event_based_risk calculator
- Fixed the event based UCERF calculator so that average losses can
be stored
[Daniele Viganò (@daniviga)]
- Added a check to verify that an 'oq' client is talking to the
right DbServer instance
- Introduced an optional argument for the 'oq dbserver' command line
to be able to override its default interface binding behaviour
[Michele Simionato (@micheles)]
- Optimized the event based calculators by reducing the number of calls
to the GmfComputer and by using larger arrays
- Added a check on missing vulnerability functions for some loss type
for some taxonomy
- Now we save the GMFs on the .ext5 file, not the datastore
- Fixed a bug in event_based_risk: it was impossible to use vulnerability
functions with the "PM" distribution
- Fixed a bug in event_based_risk: the ebrisk calculator is required as
precalculator of event_based_risk, not others
- Fixed a bug in scenario_risk: the output `all_losses-rlzs` was aggregated
incorrectly
- Now the ucerf_risk calculators transfer only the events, not the ruptures,
thus reducing the data transfer by several orders of magnitude
- Added a view `get_available_gsims` to the WebUI and fixed the API docs
- Introduced a configuration parameter `max_site_model_distance` with default
of 5 km
- Implemented sampling in the UCERF event based hazard calculator
[Daniele Viganò (@daniviga)]
- Use threads instead of processes in DbServer because SQLite3
isn't fork-safe on macOS Sierra
[Michele Simionato (@micheles)]
- Fixed a TypeError when deleting a calculation from the WebUI
- Extended the command `oq to_hdf5` to manage source model files too
- Improved significantly the performance of the event based calculator
when computing the GMFs and not the hazard curves
- Stored information about the mean ground motion in the datastore
- Saved the rupture mesh with 32 bit floats instead of 64 bit floats
- Raised the limit on the event IDs from 2^16 to 2^32 per task
- Fixed classical_risk: there was an error when computing the statistics
in the case of multiple assets of the same taxonomy on the same site
- Changed the UCERF event based calculators to parallelize by SES
- Fixed a site model bug: when the sites are extracted from the site model
there is no need to perform geospatial queries to get the parameters
- Added a command `oq normalize` to produce good `sites.csv` files
- Introduced a `ses_seed` parameter to specify the seed used to generate
the stochastic event sets; `random_seed` is used for the sampling only
- Changed the `build_rcurves` procedure to read the loss ratios directly from
the workers
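The split between the two seeds described above can be sketched in a job.ini fragment like this (the numeric values are purely illustrative):

```ini
[general]
# seed used for the logic tree sampling
random_seed = 42
# seed used to generate the stochastic event sets
ses_seed = 24
```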
OpenQuake Engine 2.3.0
[Michele Simionato (@micheles)]
- `oq info --report` now filters the ruptures and reports the correct
number of effective ruptures even for classical calculators
- Stripped the TRT information from the event loss table CSV export
and optimized its performance
- Fixed a bug when storing the GMPE logic tree file in the datastore
- Added a command `oq run_tiles` (experimental)
- Fixed the event based calculator so that it can run UCERF ruptures
- Fixed a bug in the scenario_risk calculator in case of multiple assets
of the same taxonomy on the same site with no insurance losses
- Now the event IDs are generated in the workers in the event based calculator
and there is a limit of 65536 tasks with 65536 ruptures each
- Changed the UCERF classical calculators to compute one branch at a time
- Fixed the header `occupants:float32` in the CSV risk exports involving
occupants
- Fixed the name of the zipped files downloaded by the Web UI: there
was a spurious dot
- Fixed the UCERF classical calculator in the case of sampling
- Reduced the size of the event tags in the event based calculators, thus
saving GB of disk space in UCERF calculations
- Fixed the name of the files downloaded by the Web UI: they must not
contain slashes
- Now deleting a calculation from the Web UI really deletes it; before
it was only hiding it
[Daniele Viganò (@daniviga)]
- Moved the OpenQuake Engine manual sources inside doc/manual
[Michele Simionato (@micheles)]
- Introduced an experimental classical time dependent UCERF calculator
- Added a dynamic output for source group information
- Changed the UCERF rupture calculator to fully store the ruptures
- Fixed a bug in `combine_maps`: realizations with zero probability were
discarded, thus breaking the computation of the statistics
- Added a command `oq reset` to reset the database and the datastores
- Reduced the data transfer back and disk space occupation for UCERF
event based risk calculations
- Tasks meant to be used with a shared directory are now marked with a
boolean attribute `.shared_dir_on`
- Added a warning when running event based risk calculations with sampling
- Made sure that the openquake.cfg file is read only once
[Daniele Viganò (@daniviga)]
- Moved the openquake.cfg config file inside the python package
under openquake/engine/openquake.cfg - Removed support to OQ_LOCAL_CFG_PATH and OQ_SITE_CFG_PATH vars;
only the OQ_CONFIG_FILE enviroment variable is read
[Michele Simionato (@micheles)]
- If there is a single realization, do not compute the statistics
- Changed the separator from comma to tab for the output `ruptures`
- If there are no conditional_loss_poes, the engine does not try to
export the loss maps anymore
- Fixed `oq engine --make-html-report` when using Python 3
- Fixed a bug when running `oq info job.ini` with NRML 0.5 source models
OpenQuake Engine 2.2.0
[Michele Simionato (@micheles)]
- Fixed an HDF5 bug by not using a `vstr` array for the asset references
- Fixed a wrong error message generated by `oq purge`
- Added information about the rupture in the event loss table exports
- Fixed a bug and added a test calculation with nonparametric sources
- Fixed the classical UCERF calculator when there is more than one branch
- Added .npz exporter for gmf_data for event based calculations
[Daniele Viganò (@daniviga)]
- Port WebUI/API server to Django 1.9 and 1.10
- Add dependencies to setup.py
- Update Copyright to 2017
[Michele Simionato (@micheles)]
- Increased the splitting of ComplexFaultSources
- Added a way to reuse the CompositeSourceModel from a previous computation
- Turned the loss maps into dynamically generated outputs
- Extended the source model writer to serialize the attributes
src_interdep, rup_interdep, srcs_weights
- Fixed a bug when exporting the uniform hazard spectra in presence of
IMTs other than spectral acceleration
- Fixed a bug when computing the loss maps in presence of insurance,
temporarily introduced in master
- Made the datastore for event based risk calculations much lighter
by computing the statistical outputs at export time
- Now it is possible to post-process event based risk outputs with the
`--hc` option
- Added a command `oq to_hdf5` to convert .npz files into .hdf5 files
- Moved commonlib.parallel into baselib
- Merged the experimental calculator ebrisk into event_based_risk and
used correctly the random_seed for generating the GMFs (not the master_seed)
- Added a flag `ignore_covs` to ignore the coefficients of variation
- Changed the GMF scenario exporter to avoid generating composite arrays with
a large number of fields
- Now exporting in .npz format rather than HDF5
- Introduced a `shared_dir` parameter in openquake.cfg
- Fixed a serialization bug for planar surfaces
- Removed the flag `asset_loss_table`: the loss ratios are
saved if and only if the `loss_ratios` dictionary is non-empty
- Added a CSV exporter for the GMFs in the event based calculator
- Added a CSV exporter for the rup_data output
- Added a CSV exporter for the disaggregation output
- Stored the disaggregation matrices directly (no pickle)
- Turned the CompositeRiskModel into a HDF5-serializable object
- Fixed all doctests for Python 3
[Daniele Viganò (@daniviga)]
- Removed the 'oq-engine' wrapper (command already deprecated)
[Michele Simionato (@micheles)]
- Assigned a year label to each seismic event in the event based calculator
- Now the ebrisk calculator supports the case of asset_correlation=1 too
- Made it possible to export the losses generated by a specific event
- Lowered the limit on the length of source IDs to 60 chars
- Fixed excessive strictness when validating `consequenceFunction.id`
- Added a `ucerf_rupture` calculator able to store seismic events and
rupture data and reduced the data transfer
[Daniele Viganò (@daniviga)]
- MANIFEST now includes all files, with any extension, located in the
tests folders. It is now possible to run tests from an installation
made with packages
[Michele Simionato (@micheles)]
- Improved error message when the user gives a source model file instead of
a source model logic tree file
- Fixed the management of negative calculation IDs
- Relaxed the tolerance so that the tests pass on Mac OS X
- Implemented csv exporter for the ruptures
- Optimized the epsilon generation in the ebrisk calculator for
asset_correlation=0
- Improved the performance of the scenario risk calculators
- Now by default we do not save the ruptures anymore
- Fixed a memory leak recently introduced in parallel.py
- Simplified classical_risk (the numbers can be slightly different now)
- Serialized the ruptures in the HDF5 properly (no pickle)
- Introduced a parameter `iml_disagg` in the disaggregation calculator
- Fixed `oq reduce` to preserve the NRML version
- Fixed a bug when splitting the fault sources by magnitude
OpenQuake Engine 2.1.1
[Michele Simionato (@micheles)]
- Fixed a bug when splitting the fault sources by magnitude
OpenQuake Engine 2.1.0
[Michele Simionato (@micheles)]
- There is now a flag `save_ruptures` that can be turned off on demand;
by default the ruptures are always saved in the event based calculators
- Optimized the memory consumption when using a ProcessPoolExecutor (i.e.
fork before reading the source model) by means of a `wakeup` task
- Reduced the splitting of the fault sources
- Added a view `task_slowest` displaying info about the slowest task
(only for classical calculations for the moment)
- concurrent_tasks=0 disables the concurrency
- Optimized the saving time of the GMFs
- Changed the default number of concurrent tasks and increased the
relative weight of point sources and area sources
- Fixed the UCERF event loss table export and added a test for it
- Optimized the computation of the event loss table
- Introduced two new calculators ucerf_risk and ucerf_risk_fast
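For instance, concurrency can be disabled for debugging by setting the parameter in the job.ini (an illustrative fragment, assuming the job.ini is used rather than openquake.cfg):

```ini
[general]
# run everything in the master process, useful when debugging
concurrent_tasks = 0
```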
[Paolo Tormene (@ptormene)]
- Added to the engine server the possibility to log in and out
programmatically by means of HTTP POST requests
[Michele Simionato (@micheles)]
- Optimized the memory consumption of the event based risk calculators
- Extended the `oq show` command to work in a multi-user environment
- Improved the test coverage of the exports in the WebUI
- Removed the SourceManager: now the sources are filtered in the workers
and we do not split in tiles anymore
- Made the full datastore downloadable from the WebUI
- Added a command "oq db" to send commands to the engine database
(for internal usage)
- By default the WebUI now displays only the last 100 calculations
- Added more validity checks to the disaggregation parameters; split the
sources even in the disaggregation phase
- Added an optimized event based calculator computing the total losses by
taxonomy and nothing else
- Filtered the sources up front when there are few sites (<= 10)
- Reduced the number of tasks generated when filter_sources is False
- Saved engine_version and hazardlib_version as attributes of the datastore
- Avoided saving the ruptures when ground_motion_fields is True
- Finalized the HDF5 export for hazard curves, hazard maps and uniform
hazard spectra
- Restored a weight of 1 for each rupture in the event based calculator
- Removed the MultiHazardCurveXMLWriter
- Improved the saving of the ruptures in event based calculations
- Reduced the data transfer due to the
rlzs_by_gsim
parameter - Added an HDF5 export for scenario GMFs
- If
filter_sources
if false, the light sources are not filtered, but the
heavy sources are always filtered - Now the dbserver can be stopped correctly with CTRL-C
- Parallelized the splitting of heavy sources
- Changed the event loss table exporter: now a single file per realization
is exported, containing all the loss types
- Removed the dependency on the Django ORM
- Now the WebUI restarts the ProcessPoolExecutor at the end of each job,
to conserve resources - Optimized the computation of hazard curves and statistics, especially
for the memory consumption - Reduced the data transfer due to the
rlzs_assoc
andoqparam
objects - Fixed a bug in the disaggregation calculator when a source group has
been filtered away by the maximum distance criterium - Fixed an encoding error in the reports when the description contains a
non-ASCII character - Changed the distribution framework: celery is supported in a way more
consistent with the other approaches; moreover, ipyparallel is supported
- Hazard maps are now a fake output, dynamically generated at export time
- Made the number of produced tasks proportional to the number of tiles
- Raised an error for event_based_risk producing no GMFs
- Added a view for the slow sources
- Transmitted the attributes of a SourceGroup to the underlying sources
- Fixed the names of exported files for hazard maps in .geojson format
- Added a header with metadata to the exported hazard curves and maps
- Avoid storing filtered-away probability maps, thus fixing a bug
- Restored the precalculation consistency check that was disabled during the
transition to engine 2.0
- Fixed a bug with `oq engine --delete-calculation`
- Hazard curves/maps/uniform spectra can now be recomputed
- Restored the early check on missing taxonomies
- Raised an early error if a user forgets the `rupture_mesh_spacing`
parameter
- Fixed a bug while deleting jobs from the db in Ubuntu 12.04
- Ported the shapefile converter from the nrml_converters
- Added source model information in the file `realizations.csv`
- `oq engine --run job.ini --exports csv` now also exports the realizations
- Introduced the format NRML 0.5 for source models
- Added a check on the version in case of export errors
- Extended `oq purge` to remove calculations from the database too
- Fixed `--make-html-report`: the view task_info was not registered
- Stored several strings as HDF5-variable-length strings
- Fixed an export bug for the hazard curves in .geojson format
- Removed the array cost_types from the datastore
- Taxonomies with chars not in the range a-z0-9 were incorrectly rejected
- Improved the XML parsing utilities in speed, memory, portability and
ease of use
- Forbade the reuse of the exposure because it was fragile and error prone
- Fixed a bug with the `realizations` array, which in hazard calculations
was empty in the datastore
OpenQuake Engine 2.0.1
[Michele Simionato (@micheles)]
- Fixed a bug for tectonic region types filtered away