
Commit 5f24620

warmstart doc
1 parent 4d969ee commit 5f24620

File tree

- doc/source/tutorials.rst
- doc/source/tutorials/warmstart.rst
- to-do.md

3 files changed: +102 -8

doc/source/tutorials.rst

Lines changed: 5 additions & 7 deletions
@@ -22,9 +22,6 @@ See `Manual ReHLine Formulation`_ documentation for more details and examples on

Moreover, the following specific classes of formulations can be directly solved by `ReHLine`.

-- **Empirical Risk Minimization** (ERM) with various loss functions, see `ReHLine: Empirical Risk Minimization`_.
-- **Matrix Factorization** (MF) with with various loss functions, see `ReHLine: Matrix Factorization`_.
-
List of Tutorials
=================

@@ -40,14 +37,14 @@ List of Tutorials
  - | `ReHLine <./autoapi/rehline/index.html#rehline.ReHLine>`_
  - | ReHLine minimization with manual parameter settings.

-* - `ReHLine: Ridge Composite Quantile Regression <./examples/CQR.ipynb>`_
-  - | `CQR_Ridge <./autoapi/rehline/index.html#rehline.CQR_Ridge>`_
-  - | Composite Quantile Regression (CQR) with a ridge penalty.
-
* - `ReHLine: Empirical Risk Minimization <./tutorials/ReHLine_ERM.rst>`_
  - | `plqERM_Ridge <./autoapi/rehline/index.html#rehline.plqERM_Ridge>`_
  - | Empirical Risk Minimization (ERM) with a piecewise linear-quadratic (PLQ) objective with a ridge penalty.

+* - `ReHLine: Ridge Composite Quantile Regression <./examples/CQR.ipynb>`_
+  - | `CQR_Ridge <./autoapi/rehline/index.html#rehline.CQR_Ridge>`_
+  - | Composite Quantile Regression (CQR) with a ridge penalty.
+
* - `ReHLine: Matrix Factorization <./tutorials/ReHLine_MF.rst>`_
  - | `plqMF_Ridge <./autoapi/rehline/index.html#rehline.plqERM_Ridge>`_
  - | Matrix Factorization (MF) with a piecewise linear-quadratic (PLQ) objective with a ridge penalty.
@@ -60,4 +57,5 @@ List of Tutorials
   ./tutorials/ReHLine_ERM
   ./tutorials/loss
   ./tutorials/constraint
+  ./tutorials/warmstart

doc/source/tutorials/warmstart.rst

Lines changed: 94 additions & 0 deletions
@@ -0,0 +1,94 @@
Warm-Starting with ReHLine
==========================

This tutorial explains how to use warm-starting in ReHLine, a Python solver for regularized empirical risk minimization with piecewise linear-quadratic losses (such as the hinge loss), to speed up the solution of sequences of related optimization problems.

Introduction
------------

Warm-starting is a technique that accelerates the convergence of optimization algorithms by initializing them with the solution from a previous run. It is particularly beneficial when you have a sequence of related problems to solve, such as refitting the same model over a grid of regularization parameters.

Setup
-----

Before you begin, make sure the required packages are installed: the `rehline` library itself and `numpy` for numerical operations. Install them with pip if you haven't already:

.. code-block:: bash

   pip install rehline numpy

Simulating the Dataset
----------------------

We first create a random dataset for binary classification:

.. code-block:: python

   import numpy as np

   n, d, C = 1000, 3, 0.5
   X = np.random.randn(n, d)
   beta0 = np.random.randn(d)
   y = np.sign(X.dot(beta0) + np.random.randn(n))

- **n** is the number of samples.
- **d** is the number of features.
- **C** is the regularization parameter used in the later fits.
- **X** is the feature matrix.
- **y** is the vector of labels, generated as the sign of a linear combination of the features plus Gaussian noise.
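
Because `X` and `y` are drawn at random, you may want to fix NumPy's seed before generating them so that cold-start and warm-start runs can be compared on identical data. A minimal sketch (the seed value is an arbitrary choice):

.. code-block:: python

   import numpy as np

   np.random.seed(42)  # arbitrary seed; fixes X and y across runs

   n, d, C = 1000, 3, 0.5
   X = np.random.randn(n, d)
   beta0 = np.random.randn(d)
   y = np.sign(X.dot(beta0) + np.random.randn(n))

   print(X.shape, y.shape)   # (1000, 3) (1000,)
   print((y == 1).mean())    # rough class balance, close to 0.5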

Using ReHLine Solver
--------------------

`ReHLine_solver` is first called with a cold start, and the same problem is then re-solved with a warm start:

.. code-block:: python

   from rehline._base import ReHLine_solver

   U = -(C*y).reshape(1, -1)
   V = (C*np.ones(n)).reshape(1, -1)
   res = ReHLine_solver(X, U, V)                          # cold start
   res_ws = ReHLine_solver(X, U, V, Lambda=res.Lambda)    # warm start

- **Cold Start**: the solver starts from scratch without any prior information.
- **Warm Start**: the solver uses the solution from the cold start (`res.Lambda`) as the initial point for the next run.
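
To see the benefit, you can time the two calls. A minimal sketch using only the standard library (the speed-up you observe depends on the problem size and on the solver's stopping criterion):

.. code-block:: python

   import time

   t0 = time.perf_counter()
   res = ReHLine_solver(X, U, V)                          # cold start
   t_cold = time.perf_counter() - t0

   t0 = time.perf_counter()
   res_ws = ReHLine_solver(X, U, V, Lambda=res.Lambda)    # warm start from the previous dual solution
   t_warm = time.perf_counter() - t0

   print(f"cold: {t_cold:.4f}s  warm: {t_warm:.4f}s")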

Using ReHLine Class
-------------------

The `ReHLine` class supports the same mechanism through its `warm_start` attribute:

.. code-block:: python

   from rehline import ReHLine

   clf = ReHLine(verbose=1)
   clf.C = C
   clf.U = -y.reshape(1, -1)
   clf.V = np.ones(n).reshape(1, -1)
   clf.fit(X)          # cold start

   clf.C = 2*C
   clf.warm_start = 1
   clf.fit(X)          # warm start

- **Cold Start**: the model is fitted from scratch with the initial settings.
- **Warm Start**: the model is refitted with a different regularization parameter (`2*C`), using the previous solution as the starting point.
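
The same pattern extends to a whole path of regularization values: fit once from scratch, then keep `warm_start` switched on while `C` changes. A minimal sketch, with an arbitrary illustrative grid of `C` values and simple wall-clock timing:

.. code-block:: python

   import time

   import numpy as np
   from rehline import ReHLine

   clf = ReHLine()
   clf.U = -y.reshape(1, -1)
   clf.V = np.ones(n).reshape(1, -1)

   for C_val in [0.1, 0.5, 1.0, 2.0]:    # illustrative grid of regularization values
       clf.C = C_val
       t0 = time.perf_counter()
       clf.fit(X)                         # first pass is a cold start
       print(f"C={C_val}: {time.perf_counter() - t0:.4f}s")
       clf.warm_start = 1                 # later fits reuse the previous solution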

Using plqERM_Ridge
------------------

Finally, the `plqERM_Ridge` class supports warm-starting in the same way:

.. code-block:: python

   from rehline import plqERM_Ridge

   clf = plqERM_Ridge(loss={'name': 'svm'}, C=C, verbose=1)
   clf.fit(X=X, y=y)    # cold start

   clf.C = 2*C
   clf.warm_start = 1
   clf.fit(X=X, y=y)    # warm start
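
Warm-started and cold-started fits of the same problem should arrive at essentially the same solution. One way to check this is to refit at `2*C` from scratch and compare the learned coefficients; a minimal sketch that assumes the fitted estimator exposes its linear coefficients as a scikit-learn-style `coef_` attribute (adjust if the actual attribute differs):

.. code-block:: python

   import numpy as np

   coef_warm = clf.coef_.copy()   # assumed attribute holding the fitted coefficients

   clf_cold = plqERM_Ridge(loss={'name': 'svm'}, C=2*C)
   clf_cold.fit(X=X, y=y)         # cold start at the same regularization level

   print(np.max(np.abs(coef_warm - clf_cold.coef_)))   # should be small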

to-do.md

Lines changed: 3 additions & 1 deletion
@@ -1,7 +1,8 @@
# To-do list

## src
-- [ ] warmstarting GridSearchCV
+- [x] warmstarting
+- [ ] GridSearchCV

## Class
- [ ] sklearn Classifier and Regressor Estimator
@@ -10,6 +11,7 @@
## Loss
- [ ] MAE
- [ ] TV
+- [ ] Hinge_squared

## Constraint
- [ ] box constraints
