Commit 1a8eb0b

Add example sections in docs
1 parent 87e6732 commit 1a8eb0b

2 files changed: 21 additions & 1 deletion


rampy/ml_exploration.py

Lines changed: 17 additions & 0 deletions
@@ -4,6 +4,8 @@
 class mlexplorer:
     """use machine learning algorithms from scikit learn to explore spectroscopic datasets
 
+    Performs automatic scaling and train/test split before NMF or PCA fit.
+
     Attributes
     ----------
     x : {array-like, sparse matrix}, shape = (n_samples, n_features)
@@ -32,6 +34,21 @@ class mlexplorer:
 
     Results for machine learning algorithms can vary from run to run. A way to solve that is to fix the random_state.
 
+    Example
+    -------
+
+    Given an array X of n samples by m frequencies, and Y an array of n x 1 concentrations
+
+    >>> explo = rampy.mlexplorer(X) # X is an array of signals built by mixing two partial components
+    >>> explo.algorithm = 'NMF' # using Non-Negative Matrix factorization
+    >>> explo.nb_compo = 2 # number of components to use
+    >>> explo.test_size = 0.3 # size of test set
+    >>> explo.scaler = "MinMax" # scaler
+    >>> explo.fit() # fitting!
+    >>> W = explo.model.transform(explo.X_train_sc) # getting the mixture array
+    >>> H = explo.X_scaler.inverse_transform(explo.model.components_) # components in the original space
+    >>> plt.plot(X,H.T) # plot the two components
+
     """
 
     def __init__(self,x,**kwargs):
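The new docstring example decomposes the signal array X into a mixture array W and a components array H. The mlexplorer class presumably delegates this to scikit-learn's NMF; as a dependency-free sketch of what that factorization actually computes, here is the classic multiplicative-update NMF algorithm in plain NumPy. All variable names and sizes below are illustrative, not rampy's.

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed random_state, as the docstring advises

# Build X by mixing two non-negative partial components, as in the example.
n_samples, n_freq = 20, 50
H_true = np.abs(rng.normal(size=(2, n_freq)))     # two "partial spectra"
W_true = np.abs(rng.normal(size=(n_samples, 2)))  # mixing proportions
X = W_true @ H_true                               # observed signals

# NMF via Lee and Seung multiplicative updates: X ~ W @ H, all entries >= 0.
nb_compo = 2
W = np.abs(rng.normal(size=(n_samples, nb_compo)))
H = np.abs(rng.normal(size=(nb_compo, n_freq)))
eps = 1e-9  # avoids division by zero in the update ratios
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + eps)
    W *= (X @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print("relative reconstruction error:", err)
```

Since X was built from exactly two non-negative components, a two-component NMF recovers it almost perfectly; the rows of H play the role of `explo.model.components_` in the docstring example.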

rampy/ml_regressor.py

Lines changed: 4 additions & 1 deletion
@@ -121,7 +121,10 @@ class mlregressor:
     Given an array X of n samples by m frequencies, and Y an array of n x 1 concentrations
 
     >>> model = rampy.mlregressor(X,y)
-    >>>
+    >>> model.algorithm("SVM")
+    >>> model.user_kernel = 'poly'
+    >>> model.fit()
+    >>> y_new = model.predict(X_new)
 
     Remarks
     -------
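The mlregressor example fits a support vector machine with a polynomial kernel (presumably scikit-learn's SVR under the hood). As a self-contained stand-in that shows the same kernel-trick idea without rampy or scikit-learn, the sketch below does kernel ridge regression with a polynomial kernel; X, y and X_new are synthetic stand-ins for the spectra and concentrations, and `poly_kernel` is a hypothetical helper, not a rampy function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n samples x m frequencies, concentrations y depending
# quadratically on the signals (synthetic, for illustration only).
n, m = 30, 10
X = rng.normal(size=(n, m))
y = (X[:, 0] + X[:, 1]) ** 2

def poly_kernel(A, B, degree=2, coef0=1.0):
    """Polynomial kernel K(a, b) = (a . b + coef0)**degree."""
    return (A @ B.T + coef0) ** degree

# Kernel ridge regression: solve (K + lam*I) alpha = y in the dual.
lam = 1e-6                                      # small regularization strength
K = poly_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(n), y)

# Predict concentrations for new spectra, as model.predict(X_new) would.
X_new = rng.normal(size=(5, m))
y_new = poly_kernel(X_new, X) @ alpha
```

Because the target is quadratic in the signals, a degree-2 polynomial kernel can represent it exactly, which is the same reason one would set `user_kernel = 'poly'` for such data in the docstring example.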
