Commit 5bbabb7

docs: user: Add information about Vizier in AutoTuner

Signed-off-by: Eryk Szpotanski <[email protected]>
1 parent 20aba1a commit 5bbabb7

File tree

1 file changed: +68 −18 lines
docs/user/InstructionsForAutoTuner.md

Lines changed: 68 additions & 18 deletions
@@ -5,20 +5,29 @@ AutoTuner provides a generic interface where users can define parameter configur
This enables AutoTuner to easily support various tools and flows. AutoTuner also utilizes [METRICS2.1](https://github.com/ieee-ceda-datc/datc-rdf-Metrics4ML) to capture PPA
of individual search trials. With the abundant features of METRICS2.1, users can explore various reward functions that steer the flow autotuning to different PPA goals.

-AutoTuner provides two main functionalities as follows.
-* Automatic hyperparameter tuning framework for OpenROAD-flow-script (ORFS)
-* Parametric sweeping experiments for ORFS
+AutoTuner provides three main functionalities as follows.
+* [Ray] Automatic hyperparameter tuning framework for OpenROAD-flow-script (ORFS)
+* [Ray] Parametric sweeping experiments for ORFS
+* [Vizier] Multi-objective optimization of ORFS parameters


AutoTuner contains top-level Python scripts for ORFS, each of which implements a different search algorithm. Currently supported search algorithms are as follows.
-* Random/Grid Search
-* Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
-* Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
-* Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/))
-* Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.org/))
-* Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
+* Ray (Single-objective optimization)
+  * Random/Grid Search
+  * Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
+  * Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
+  * Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/docs/bayesopt.html))
+  * Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html))
+  * Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
+* Vizier (Multi-objective optimization)
+  * Random/Grid/Shuffled Search
+  * Quasi Random Search ([quasi-random](https://developers.google.com/machine-learning/guides/deep-learning-tuning-playbook/quasi-random-search))
+  * Gaussian Process Bandit ([GP-Bandit](https://acsweb.ucsd.edu/~shshekha/GPBandits.html))
+  * Non-dominated Sorting Genetic Algorithm II ([NSGA-II](https://ieeexplore.ieee.org/document/996017))

-User-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) of three objectives to set the direction of tuning are written in the script. Each coefficient is expressed as a global variable at the `get_ppa` function in `PPAImprov` class in the script (`coeff_perform`, `coeff_power`, `coeff_area`). Efforts to optimize each of the objectives are proportional to the specified coefficients.
+For Ray algorithms, the optimized function can be adjusted with user-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) for the three objectives to set the direction of tuning. They are defined in the [distributed.py script](../../tools/AutoTuner/src/autotuner/distributed.py) in the `get_ppa` method of the `PPAImprov` class. Efforts to optimize each of the objectives are proportional to the specified coefficients (see the sketch below).
+
+Using Vizier algorithms, users can choose which metrics should be optimized with the `--use-metrics` argument.


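To make the weighting concrete, here is a minimal sketch of such a reward, assuming a simple weighted sum over per-objective improvements. The coefficient values below are placeholders, not the ones shipped in `distributed.py`; the real implementation lives in `PPAImprov.get_ppa`:

```python
# Minimal sketch of the weighted PPA reward described above -- not the
# exact code from distributed.py; coefficient values are placeholders.
coeff_perform, coeff_power, coeff_area = 10000, 100, 100


def ppa_score(improv_perform, improv_power, improv_area):
    """Combine per-objective improvements (e.g. percent gain over a
    reference run) so that tuning effort is proportional to each
    user-defined coefficient."""
    return (
        coeff_perform * improv_perform
        + coeff_power * improv_power
        + coeff_area * improv_area
    )
```
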
## Setting up AutoTuner
@@ -28,8 +37,10 @@ that works in Python 3.8 for installation and configuration of AutoTuner,
as shown below:

```shell
-# Install prerequisites
+# Install prerequisites for both Ray Tune and Vizier
./tools/AutoTuner/installer.sh
+# Or install prerequisites for only `ray` or `vizier`
+./tools/AutoTuner/installer.sh vizier

# Start virtual environment
./tools/AutoTuner/setup.sh
@@ -50,7 +61,8 @@ Alternatively, here is a minimal example to get started:
        1.0,
        3.7439
      ],
-     "step": 0
+     "step": 0,
+     "scale": "log"
    },
    "CORE_MARGIN": {
      "type": "int",
@@ -67,6 +79,7 @@ Alternatively, here is a minimal example to get started:
* `"type"`: Parameter type ("float" or "int") for sweeping/tuning
* `"minmax"`: Min-to-max range for sweeping/tuning. The unit follows the default value of each technology std cell library.
* `"step"`: Parameter step within the minmax range. Step 0 for type "float" means continuous step for sweeping/tuning. Step 0 for type "int" means the constant parameter.
+* `"scale"`: Vizier-specific parameter that sets the [scaling type](https://oss-vizier.readthedocs.io/en/latest/guides/user/search_spaces.html#scaling); allowed values: `linear`, `log` and `rlog` (see the sketch below).
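
For illustration, a complete minimal search space combining all four keys could look like the sketch below. The `CLK_PERIOD` key name and the value ranges are assumptions for the example (the hunk above truncates the actual key); `CORE_MARGIN` uses `"step": 0` on an `"int"` to pin a constant:

```json
{
  "CLK_PERIOD": {
    "type": "float",
    "minmax": [1.0, 3.7439],
    "step": 0,
    "scale": "log"
  },
  "CORE_MARGIN": {
    "type": "int",
    "minmax": [2, 2],
    "step": 0
  }
}
```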

## Tunable / sweepable parameters

@@ -118,13 +131,21 @@ The order of the parameters matters. Arguments `--design`, `--platform` and
`--config` are always required and should precede *mode*.
```

+The `autotuner.vizier` module integrates the OpenROAD flow into the Vizier optimizer.
+It is used for multi-objective optimization, with additional features that improve the chance of finding valid parameters.
+Moreover, various algorithms are available for tuning parameters.
+
+Each mode relies on a user-specified search space defined by a `.json` file;
+all modes use the same syntax and format,
+though some features may not be available for sweeping.
+
```{note}
The following commands should be run from `./tools/AutoTuner`.
```

#### Tune only

-* AutoTuner: `python3 -m autotuner.distributed tune -h`
+* Ray-based AutoTuner: `python3 -m autotuner.distributed tune -h`

Example:

@@ -145,19 +166,37 @@ python3 -m autotuner.distributed --design gcd --platform sky130hd \
  sweep
```

+#### Multi-objective optimization
+
+* Vizier-based AutoTuner: `python3 -m autotuner.vizier -h`
+
+Example:
+
+```shell
+python3 -m autotuner.vizier --design gcd --platform sky130hd \
+  --config ../../flow/designs/sky130hd/gcd/autotuner.json
+```

### Google Cloud Platform (GCP) distribution with Ray

GCP Setup Tutorial coming soon.


-### List of input arguments
+### List of common input arguments
| Argument | Description |
|-------------------------------|-------------------------------------------------------------------------------------------------------|
| `--design` | Name of the design for Autotuning. |
| `--platform` | Name of the platform for Autotuning. |
| `--config` | Configuration file that sets which knobs to use for Autotuning. |
| `--experiment` | Experiment name. This parameter is used to prefix the FLOW_VARIANT and to set the Ray log destination. |
+| `--algorithm` | Search algorithm to use for Autotuning. |
+| `--openroad_threads` | Max number of threads usable. |
+| `--to-stage` | The last stage to be built during optimization. |
+| `-v` or `--verbose` | Verbosity level [0: only Ray status; 1: adds stderr; 2: adds stdout]. |
+
+### List of Ray-specific input arguments
+| Argument | Description |
+|-------------------------------|-------------------------------------------------------------------------------------------------------|
| `--resume` | Resume previous run. |
| `--git_clean` | Clean binaries and build files. **WARNING**: may lose previous data. |
| `--git_clone` | Force new git clone. **WARNING**: may lose previous data. |
@@ -176,12 +215,22 @@ GCP Setup Tutorial coming soon.
| `--perturbation` | Perturbation interval for PopulationBasedTraining. |
| `--seed` | Random seed. |
| `--jobs` | Max number of concurrent jobs. |
-| `--openroad_threads` | Max number of threads usable. |
| `--server` | The address of the Ray server to connect to. |
| `--port` | The port of the Ray server to connect to. |
-| `-v` or `--verbose` | Verbosity Level. [0: Only ray status, 1: print stderr, 2: print stdout on top of what is in level 0 and 1. ] |
-| | |
-### GUI
+
+### List of Vizier-specific input arguments
+| Argument | Description |
+|-------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| `--orfs` | Path to the OpenROAD-flow-scripts repository. |
+| `--results` | Path where the JSON file with results will be saved. |
+| `-a` or `--algorithm` | Algorithm for the optimization engine; one of GAUSSIAN_PROCESS_BANDIT, RANDOM_SEARCH, QUASI_RANDOM_SEARCH, GRID_SEARCH, SHUFFLED_GRID_SEARCH, NSGA2. |
+| `-m` or `--use-metrics` | Metrics to optimize; a list of worst_slack, clk_period-worst_slack, total_power, core_util, final_util, design_area, core_area, die_area, last_successful_stage. |
+| `-i` or `--iterations` | Max iteration count for the optimization engine. |
+| `-s` or `--suggestions` | Suggestion count per iteration of the optimization engine. |
+| `-w` or `--workers` | Number of parallel workers. |
+| `--use-existing-server` | Address of the running Vizier server. |
+
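To make these flags concrete, a hypothetical multi-objective run might look like the sketch below. The algorithm choice, the metric list (assumed to be space-separated), the iteration/suggestion/worker counts, and the results path are all illustrative assumptions, not recommended settings:

```shell
# Hypothetical Vizier invocation; all flag values are illustrative.
python3 -m autotuner.vizier --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --algorithm NSGA2 \
    --use-metrics total_power design_area \
    --iterations 20 --suggestions 5 --workers 2 \
    --results results.json
```
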
+### GUI for optimizations with Ray Tune

Progress is displayed in the terminal where you run the command, and when all runs are finished, the results are displayed.
You can find the "Best config found" entry on the screen.
You could find the "Best config found" on the screen.
@@ -207,6 +256,7 @@ Assuming the virtual environment is setup at `./tools/AutoTuner/autotuner_env`:
207256
./tools/AutoTuner/setup.sh
208257
python3 ./tools/AutoTuner/test/smoke_test_sweep.py
209258
python3 ./tools/AutoTuner/test/smoke_test_tune.py
259+
python3 ./tools/AutoTuner/test/smoke_test_vizier.py
210260
python3 ./tools/AutoTuner/test/smoke_test_sample_iteration.py
211261
```
212262
