
Commit ca483b5

update documentation link and prepare release tag
1 parent 0c209f9, commit ca483b5

3 files changed: README.md, docs/index.md, pyproject.toml (+3, -7 lines)


README.md

Lines changed: 1 addition & 3 deletions
@@ -45,7 +45,7 @@ pip install scoringrules
 
 ## Documentation
 
-Learn more about `scoringrules` in its official documentation at https://frazane.github.io/scoringrules/.
+Learn more about `scoringrules` in its official documentation at https://scoringrules.readthedocs.io/en/latest/.
 
 
 ## Quick example

@@ -77,5 +77,3 @@ grateful for fruitful discussions with the authors.
 - The quality of this library has also benefited a lot from discussions with (and contributions from)
 Nick Loveday and Tennessee Leeuwenburg, whose python library [`scores`](https://github.com/nci/scores)
 similarly provides a comprehensive collection of forecast evaluation methods.
-- The implementation of the ensemble-based metrics as jit-compiled numpy generalized `ufuncs`
-was first proposed in [`properscoring`](https://github.com/properscoring/properscoring), released under Apache License, Version 2.0.
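For context, the `## Quick example` heading touched by this hunk refers to the README's usage demo. A minimal sketch of that kind of usage is shown below; it is not the README's exact snippet, and it assumes a `crps_ensemble(observations, forecasts)` call signature, so check the documentation linked above for the exact API.

```python
# Minimal usage sketch (assumed API, not the README's exact example):
# score 100 ensemble forecasts of 21 members each against observations.
import numpy as np
import scoringrules as sr

obs = np.random.randn(100)                     # observed values
fct = obs[:, None] + np.random.randn(100, 21)  # ensemble forecasts, shape (100, 21)

crps = sr.crps_ensemble(obs, fct)              # assumed signature: (observations, forecasts)
print(crps.shape, float(crps.mean()))          # one CRPS value per case
```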

docs/index.md

Lines changed: 1 addition & 3 deletions
@@ -65,7 +65,7 @@ pip install scoringrules
 
 ## Documentation
 
-Learn more about `scoringrules` in its official documentation at https://frazane.github.io/scoringrules/.
+Learn more about `scoringrules` in its official documentation at https://scoringrules.readthedocs.io/en/latest/.
 
 
 ## Quick example

@@ -97,5 +97,3 @@ grateful for fruitful discussions with the authors.
 - The quality of this library has also benefited a lot from discussions with (and contributions from)
 Nick Loveday and Tennessee Leeuwenburg, whose python library [`scores`](https://github.com/nci/scores)
 similarly provides a comprehensive collection of forecast evaluation methods.
-- The implementation of the ensemble-based metrics as jit-compiled numpy generalized `ufuncs`
-was first proposed in [`properscoring`](https://github.com/properscoring/properscoring), released under Apache License, Version 2.0.

pyproject.toml

Lines changed: 1 addition & 1 deletion
@@ -4,7 +4,7 @@ build-backend = "hatchling.build"
 
 [project]
 name = "scoringrules"
-version = "0.7.1"
+version = "0.8.0"
 description = "Scoring rules for probabilistic forecast evaluation."
 readme = "README.md"
 requires-python = ">=3.10"
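The version bump from 0.7.1 to 0.8.0 is the "prepare release tag" part of this commit. Once a release built from this commit is installed, the bumped version can be verified from Python; a small check, assuming the package has been installed from that release:

```python
# Verify that the installed scoringrules matches the bumped pyproject.toml version
# (assumes a release built from this commit has been installed, e.g. via pip).
from importlib.metadata import version

assert version("scoringrules") == "0.8.0"
```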
