-* [XGBoost R Package Online Documentation](http://xgboost.readthedocs.org/en/latest/R-package/index.html)
+* [XGBoost R Package Online Documentation](https://xgboost.readthedocs.org/en/stable/R-package/index.html)
   - Check this out for detailed documents, examples and tutorials.

 Installation
@@ -19,13 +19,7 @@ We are [on CRAN](https://cran.r-project.org/web/packages/xgboost/index.html) now
 install.packages('xgboost')
 ```

-For more detailed installation instructions, please see [here](http://xgboost.readthedocs.org/en/latest/build.html#r-package-installation).
-
-Examples
---------
-
-* Please visit [walk through example](demo).
-* See also the [example scripts](../demo/kaggle-higgs) for Kaggle Higgs Challenge, including [speedtest script](../demo/kaggle-higgs/speedtest.R) on this dataset and the one related to [Otto challenge](../demo/kaggle-otto), including a [RMarkdown documentation](../demo/kaggle-otto/understandingXGBoostModel.Rmd).
+For more detailed installation instructions, please see [here](https://xgboost.readthedocs.io/en/stable/install.html).
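As a quick sanity check after installing from CRAN, the package can be loaded and its version inspected (a minimal sketch; the version printed depends on what is installed):

```r
# Load the package and print the installed version as a quick installation check
library(xgboost)
packageVersion("xgboost")
```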
R-package/vignettes/xgboost_introduction.Rmd (+9 -6)
@@ -12,7 +12,10 @@ output:
     toc_float: true
 ---

-# Introduction
+XGBoost for R introduction
+==========================
+
+## Introduction

 **XGBoost** is an optimized distributed gradient boosting library designed to be highly **efficient**, **flexible** and **portable**. It implements machine learning algorithms under the [Gradient Boosting](https://en.wikipedia.org/wiki/Gradient_boosting) framework. XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems beyond billions of examples.
@@ -22,7 +25,7 @@ For more details about XGBoost's features and usage, see the [online documentati
 This short vignette outlines the basic usage of the R interface for XGBoost, assuming the reader has some familiarity with the underlying concepts behind statistical modeling with gradient-boosted decision trees.

-# Building a predictive model
+## Building a predictive model

 At its core, XGBoost consists of a C++ library which offers bindings for different programming languages, including R. The R package for XGBoost provides an idiomatic interface similar to those of other statistical modeling packages, using an x/y design, as well as a lower-level interface that interacts more directly with the underlying core library and which is similar to those of other language bindings like Python, plus various helpers to interact with its model objects, such as by plotting their feature importances or converting them to other formats.
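To make the x/y design concrete, below is a minimal sketch fitting a regression model on a built-in dataset; the dataset and settings are illustrative choices rather than part of the vignette:

```r
library(xgboost)

# Predictors (x) as a numeric matrix and a numeric response (y)
y <- mtcars$mpg
x <- as.matrix(mtcars[, -1])

# Fit a small boosted-tree ensemble through the x/y interface
model <- xgboost(x, y, nthreads = 1, nrounds = 20)

# Predictions via the usual S3 `predict` generic
head(predict(model, x))
```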
@@ -62,7 +65,7 @@ model_abserr <- xgboost(x, y, objective = "reg:absoluteerror", nthreads = 1, nro
 _Note: the objective must match the type of the "y" response variable - for example, classification objectives for discrete choices require "factor" types, while regression models for real-valued data require "numeric" types._

-# Model parameters
+## Model parameters

 XGBoost models allow a large degree of control over how they are built. By their nature, gradient-boosted decision tree ensembles are able to capture very complex patterns between features in the data and a response variable, which also means they can suffer from overfitting if not controlled appropriately.
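Tying these two points together, the sketch below fits a classifier with a factor response and a few complexity-limiting parameters; the parameter names and values are illustrative examples, not recommendations from the vignette:

```r
library(xgboost)

# A "factor" response indicates a classification problem
y <- iris$Species               # three-level factor
x <- as.matrix(iris[, 1:4])     # numeric predictors

# Illustrative parameters that restrict tree depth and shrink each boosting step,
# both common levers for controlling overfitting
model_cls <- xgboost(
  x, y,
  nthreads = 1,
  nrounds = 50,
  max_depth = 3,
  learning_rate = 0.1
)
```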
@@ -105,7 +108,7 @@ xgboost(
 )
 ```

-# Examining model objects
+## Examining model objects

 XGBoost model objects for the most part consist of a pointer to a C++ object where most of the information is held and which is interfaced through the utility functions and methods in the package, but they also contain some R attributes that can be retrieved (and new ones added) through `attributes()`:
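As a small illustration, the stock attributes of a fitted model can be listed and a user-defined attribute added with base R (`"fitted_on"` below is an arbitrary name chosen for the example):

```r
# List the R attributes stored on a fitted model object such as `model` above
attributes(model)

# Attach an arbitrary user-defined attribute and read it back
attr(model, "fitted_on") <- "example data"
attr(model, "fitted_on")
```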
@@ -131,7 +134,7 @@ xgb.importance(model)
 xgb.model.dt.tree(model)
 ```

-# Other features
+## Other features

 XGBoost supports many additional features on top of its traditional gradient-boosting framework, including, among others:
@@ -143,7 +146,7 @@ XGBoost supports many additional features on top of its traditional gradient-boo
 See the [online documentation](https://xgboost.readthedocs.io/en/stable/index.html) - particularly the [tutorials section](https://xgboost.readthedocs.io/en/stable/tutorials/index.html) - for a glimpse of the further functionality that XGBoost offers.

-# The low-level interface
+## The low-level interface

 In addition to the `xgboost(x, y, ...)` function, XGBoost also provides a lower-level interface for creating model objects through the function `xgb.train()`, which resembles the `xgb.train` functions in other language bindings of XGBoost.
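A minimal sketch of that lower-level route with `xgb.DMatrix()` and `xgb.train()` follows; the parameter list shown is an illustrative choice:

```r
library(xgboost)

# The low-level interface consumes xgb.DMatrix objects instead of raw x / y inputs
dtrain <- xgb.DMatrix(data = as.matrix(mtcars[, -1]), label = mtcars$mpg)

# Training parameters are supplied as a list, mirroring other language bindings
booster <- xgb.train(
  params = list(objective = "reg:squarederror", max_depth = 3, eta = 0.1),
  data = dtrain,
  nrounds = 20
)

head(predict(booster, dtrain))
```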
doc/R-package/index.rst (+16 -3)
@@ -9,19 +9,32 @@ XGBoost R Package
 You have found the XGBoost R Package!

+.. toctree::
+   :maxdepth: 2
+   :titlesonly:
+
 ***********
 Get Started
 ***********
+
 * Check out the :doc:`Installation Guide </install>` for instructions on installing xgboost, and the :doc:`Tutorials </tutorials/index>` for examples of how to use XGBoost for various tasks.
-* Read the `API documentation <https://cran.r-project.org/web/packages/xgboost/xgboost.pdf>`_.
+* Read the latest `API documentation <../r_docs/R-package/docs/reference/index.html>`__. This might refer to a newer version than the one on CRAN.
+* Read the `CRAN documentation <https://cran.r-project.org/web/packages/xgboost/xgboost.pdf>`_.
+
+*********
+Vignettes
+*********
+
+.. toctree::
+
+   xgboost_introduction
+   xgboostfromJSON

 ************
 Other topics
 ************

 .. toctree::
-   :maxdepth: 2
-   :titlesonly:

    Migrating code from previous XGBoost versions <migration_guide>