Commit 47080a5

Anton Björklund (Aggrathon) authored and committed
Add new citation
1 parent 32a9b70 commit 47080a5

File tree: 2 files changed, +36 / -11 lines

CITATIONS.bib

Lines changed: 22 additions & 0 deletions
@@ -0,0 +1,22 @@
+@article{bjorklund2022robust,
+  title = {Robust regression via error tolerance},
+  author = {Bj{\"o}rklund, Anton and Henelius, Andreas and Oikarinen, Emilia and Kallonen, Kimmo and Puolam{\"a}ki, Kai},
+  year = {2022},
+  month = jan,
+  journal = {Data Mining and Knowledge Discovery},
+  issn = {1384-5810, 1573-756X},
+  doi = {10.1007/s10618-022-00819-2}
+}
+
+@inproceedings{bjorklund2019sparse,
+  title = {Sparse Robust Regression for Explaining Classifiers},
+  booktitle = {Discovery Science},
+  author = {Bj{\"o}rklund, Anton and Henelius, Andreas and Oikarinen, Emilia and Kallonen, Kimmo and Puolam{\"a}ki, Kai},
+  year = {2019},
+  series = {Lecture Notes in Computer Science},
+  volume = {11828},
+  pages = {351--366},
+  publisher = {Springer International Publishing},
+  doi = {10.1007/978-3-030-33778-0_27},
+  isbn = {978-3-030-33777-3 978-3-030-33778-0}
+}

README.md

Lines changed: 14 additions & 11 deletions
@@ -1,17 +1,20 @@
-![Example of Robust Regression](examples/pyslise_banner.png)
+![PySLISE Banner Image](examples/pyslise_banner.png)
 # SLISE - Sparse Linear Subset Explanations
 
-Python implementation of the SLISE algorithm. The SLISE algorithm can be used for
-both robust regression and to explain outcomes from black box models.
-For more details see [the paper](https://rdcu.be/bVbda), alternatively for a more informal
-overview see [the presentation](https://github.com/edahelsinki/slise/raw/master/vignettes/presentation.pdf),
-or [the poster](https://github.com/edahelsinki/slise/raw/master/vignettes/poster.pdf).
+Python implementation of the SLISE algorithm. The SLISE algorithm can be used for both robust regression and to explain outcomes from black box models.
+For more details see [the original paper](https://rdcu.be/bVbda) or the [robust regression paper](https://rdcu.be/cFRHD).
+Alternatively for a more informal overview see [the presentation](https://github.com/edahelsinki/slise/raw/master/vignettes/presentation.pdf), or [the poster](https://github.com/edahelsinki/slise/raw/master/vignettes/poster.pdf).
 
-> **Björklund A., Henelius A., Oikarinen E., Kallonen K., Puolamäki K.**
-> *Sparse Robust Regression for Explaining Classifiers.*
+> *Björklund A., Henelius A., Oikarinen E., Kallonen K., Puolamäki K.* (2019)
+> **Sparse Robust Regression for Explaining Classifiers.**
 > Discovery Science (DS 2019).
 > Lecture Notes in Computer Science, vol 11828, Springer.
-> https://doi.org/10.1007/978-3-030-33778-0_27
+> https://doi.org/10.1007/978-3-030-33778-0_27
+
+> *Björklund A., Henelius A., Oikarinen E., Kallonen K., Puolamäki K.* (2022).
+> **Robust regression via error tolerance.**
+> Data Mining and Knowledge Discovery.
+> https://doi.org/10.1007/s10618-022-00819-2
 
 ## The idea
 
@@ -47,12 +50,12 @@ Here are two quick examples of SLISE in action. For more detailed examples, with
 > SLISE is a robust regression algorithm, which means that it is able to handle outliers. This is in contrast to, e.g., ordinary least-squares regression, which gives skewed results when outliers are present.
 
 
-> ![Example of Robust Regression](examples/ex2.png)
+> ![Example of Explanation](examples/ex2.png)
 > SLISE can also be used to explain outcomes from black box models by locally approximating the complex models with a simpler linear model.
 
 ## Dependencies
 
-This implementation is requires Python 3 and the following packages:
+This implementation requires Python 3 and the following packages:
 
 - matplotlib
 - numba
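
The README excerpt above contrasts SLISE's robust regression with ordinary least squares on data containing outliers. The following is a minimal NumPy sketch of the error-tolerance idea only; the `tolerance_fit` helper, the synthetic data, and all parameter values are illustrative assumptions, not the actual SLISE optimiser or the `slise` package API.

```python
# Conceptual sketch: fit a linear model to the points whose residuals stay
# within an error tolerance, shrinking the tolerance gradually.
# This only mimics the spirit of error-tolerance robust regression;
# the SLISE papers describe a sparse, subset-based objective and optimiser.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a clean linear trend plus a block of shifted outliers.
n = 100
x = rng.uniform(-2, 2, n)
y = 1.5 * x + 0.3 + rng.normal(0, 0.1, n)
y[:20] += 5.0  # outliers that skew ordinary least squares

X = np.column_stack([x, np.ones(n)])  # design matrix with intercept column

# Ordinary least squares on all points (pulled towards the outliers).
ols_coef, *_ = np.linalg.lstsq(X, y, rcond=None)


def tolerance_fit(X, y, epsilon=0.5, shrink=0.9):
    """Refit on the points within the current tolerance while shrinking it."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    tol = np.abs(y - X @ coef).max()
    while tol > epsilon:
        tol = max(tol * shrink, epsilon)
        mask = np.abs(y - X @ coef) <= tol
        if not mask.any():  # safety guard for this toy implementation
            break
        coef, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    return coef, np.abs(y - X @ coef) <= epsilon


robust_coef, subset = tolerance_fit(X, y, epsilon=0.5)

print("OLS coefficients:   ", ols_coef)     # skewed by the outliers
print("Robust coefficients:", robust_coef)  # close to the true (1.5, 0.3)
print("Points within tolerance:", int(subset.sum()), "of", n)
```

The algorithm described in the papers cited above additionally handles sparsity and uses a more principled optimisation scheme; for real use, refer to the package itself.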

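The second example in the diff describes explaining a black-box outcome by locally approximating the complex model with a simpler linear model. Below is a generic local-surrogate sketch of that idea in plain NumPy; the `black_box` function, the neighbourhood sampling, and the plain least-squares fit are hypothetical illustrations, not how the SLISE package builds its explanations.

```python
# Generic local linear surrogate: sample points around one instance, query
# the black-box model, and fit a linear model to those local predictions.
# The linear coefficients then serve as a local explanation.
import numpy as np

rng = np.random.default_rng(1)


def black_box(X):
    # Stand-in for any opaque model: a nonlinear function of two features.
    return np.tanh(2.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2


instance = np.array([0.2, -0.4])  # the prediction we want to explain

# Sample a neighbourhood around the instance and query the black box.
neighbourhood = instance + rng.normal(0.0, 0.1, size=(500, 2))
targets = black_box(neighbourhood)

# Fit a simple linear model (with intercept) to the local behaviour.
A = np.column_stack([neighbourhood, np.ones(len(neighbourhood))])
coef, *_ = np.linalg.lstsq(A, targets, rcond=None)

print("Local coefficients:", coef[:2])  # feature influence near the instance
print("Intercept:", coef[2])
print("Black box prediction:", black_box(instance[None, :])[0])
print("Surrogate prediction: ", float(instance @ coef[:2] + coef[2]))
```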
0 commit comments
