Commit 5f72b6f

docs: user: Add information about Vizier in AutoTuner

Signed-off-by: Eryk Szpotanski <[email protected]>

1 parent e218cdd commit 5f72b6f

docs/user/InstructionsForAutoTuner.md

Lines changed: 70 additions & 21 deletions
@@ -5,20 +5,29 @@ AutoTuner provides a generic interface where users can define parameter configur
This enables AutoTuner to easily support various tools and flows. AutoTuner also utilizes [METRICS2.1](https://github.com/ieee-ceda-datc/datc-rdf-Metrics4ML) to capture PPA
of individual search trials. With the abundant features of METRICS2.1, users can explore various reward functions that steer the flow autotuning to different PPA goals.

-AutoTuner provides two main functionalities as follows.
-* Automatic hyperparameter tuning framework for OpenROAD-flow-script (ORFS)
-* Parametric sweeping experiments for ORFS
+AutoTuner provides three main functionalities as follows.
+* [Ray] Automatic hyperparameter tuning framework for OpenROAD-flow-scripts (ORFS)
+* [Ray] Parametric sweeping experiments for ORFS
+* [Vizier] Multi-objective optimization of ORFS parameters

AutoTuner contains top-level Python scripts for ORFS, each of which implements a different search algorithm. The currently supported search algorithms are as follows.
-* Random/Grid Search
-* Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
-* Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
-* Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/))
-* Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.org/))
-* Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
+* Ray (Single-objective optimization)
+  * Random/Grid Search
+  * Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
+  * Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
+  * Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/docs/bayesopt.html))
+  * Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html))
+  * Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
+* Vizier (Multi-objective optimization)
+  * Random/Grid/Shuffled Search
+  * Quasi-Random Search ([quasi-random](https://developers.google.com/machine-learning/guides/deep-learning-tuning-playbook/quasi-random-search))
+  * Gaussian Process Bandit ([GP-Bandit](https://acsweb.ucsd.edu/~shshekha/GPBandits.html))
+  * Non-dominated Sorting Genetic Algorithm II ([NSGA-II](https://ieeexplore.ieee.org/document/996017))

-User-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) of three objectives to set the direction of tuning are written in the script. Each coefficient is expressed as a global variable at the `get_ppa` function in `PPAImprov` class in the script (`coeff_perform`, `coeff_power`, `coeff_area`). Efforts to optimize each of the objectives are proportional to the specified coefficients.
+For Ray algorithms, the optimized function can be adjusted with user-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) for the three objectives to set the direction of tuning. They are defined in the [distributed.py script](../../tools/AutoTuner/src/autotuner/distributed.py), in the `get_ppa` method of the `PPAImprov` class. Efforts to optimize each of the objectives are proportional to the specified coefficients.
+
+When using Vizier algorithms, users can choose which metrics should be optimized with the `--use-metrics` argument.
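
To make the role of the coefficients concrete, here is a minimal sketch of a coefficient-weighted objective; the exact normalization and combination live in the `get_ppa` method of `PPAImprov` in `distributed.py`, and the metric arguments and weights below are illustrative assumptions only:

```python
# Illustrative sketch only: a score in which each objective's influence
# is proportional to its coefficient. The real get_ppa() in distributed.py
# may normalize and combine the metrics differently.
coeff_perform, coeff_power, coeff_area = 1.0, 1.0, 1.0  # example weights


def weighted_ppa(perform: float, power: float, area: float) -> float:
    """Combine three (assumed pre-normalized) objective terms into one score."""
    return coeff_perform * perform + coeff_power * power + coeff_area * area
```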

## Setting up AutoTuner
@@ -28,8 +37,10 @@ that works in Python3.8 for installation and configuration of AutoTuner,
as shown below:

```shell
-# Install prerequisites
+# Install prerequisites for both Ray Tune and Vizier
./tools/AutoTuner/installer.sh
+# Or install prerequisites for only one backend, `ray` or `vizier`:
+./tools/AutoTuner/installer.sh vizier

# Start virtual environment
./tools/AutoTuner/setup.sh
@@ -50,7 +61,8 @@ Alternatively, here is a minimal example to get started:
        1.0,
        3.7439
      ],
-     "step": 0
+     "step": 0,
+     "scale": "log"
    },
    "CORE_MARGIN": {
      "type": "int",
@@ -67,6 +79,7 @@ Alternatively, here is a minimal example to get started:
* `"type"`: Parameter type ("float" or "int") for sweeping/tuning.
* `"minmax"`: Min-to-max range for sweeping/tuning. The unit follows the default value of each technology's std cell library.
* `"step"`: Parameter step within the minmax range. Step 0 for type "float" means a continuous step for sweeping/tuning. Step 0 for type "int" means a constant parameter.
+* `"scale"`: Vizier-specific parameter setting the [scaling type](https://oss-vizier.readthedocs.io/en/latest/guides/user/search_spaces.html#scaling); allowed values: `linear`, `log` and `rlog`.

## Tunable / sweepable parameters

@@ -118,13 +131,21 @@ The order of the parameters matter. Arguments `--design`, `--platform` and
`--config` are always required and should precede *mode*.
```

+The `autotuner.vizier` module integrates the OpenROAD flow into the Vizier optimizer.
+It is used for multi-objective optimization, with additional features that improve the chance of finding valid parameters.
+Moreover, various algorithms are available for tuning parameters.
+
+Each mode relies on a user-specified search space that is
+defined by a `.json` file; the modes use the same syntax and format,
+though some features may not be available for sweeping.
+
```{note}
The following commands should be run from `./tools/AutoTuner`.
```

#### Tune only

-* AutoTuner: `python3 -m autotuner.distributed tune -h`
+* Ray-based AutoTuner: `python3 -m autotuner.distributed tune -h`

Example:

@@ -145,19 +166,39 @@ python3 -m autotuner.distributed --design gcd --platform sky130hd \
  sweep
```

+#### Multi-objective optimization
+
+* Vizier-based AutoTuner: `python3 -m autotuner.vizier -h`
+
+Example:
+
+```shell
+python3 -m autotuner.vizier --design gcd --platform sky130hd \
+    --config ../../flow/designs/sky130hd/gcd/autotuner.json
+```

### Google Cloud Platform (GCP) distribution with Ray

GCP Setup Tutorial coming soon.

-### List of input arguments
+### List of common input arguments
| Argument | Description | Default |
|-------------------------------|-------------------------------------------------------------------------------------------------------|---------|
| `--design` | Name of the design for Autotuning. ||
| `--platform` | Name of the platform for Autotuning. ||
| `--config` | Configuration file that sets which knobs to use for Autotuning. ||
| `--experiment` | Experiment name. This parameter is used to prefix the FLOW_VARIANT and to set the Ray log destination. | test |
+| `--samples` | Number of samples for tuning. | 10 |
+| `--jobs` | Max number of concurrent jobs. | # of CPUs / 2 |
+| `--openroad_threads` | Max number of threads usable. | 16 |
+| `--timeout` | Time limit (in hours) for each trial run. | No limit |
+| `-v` or `--verbose` | Verbosity level. [0: only Ray status, 1: also print stderr, 2: also print stdout.] | 0 |
+| | ||
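
As an illustration of combining the common arguments above into a single run (a sketch; the flag values are arbitrary and the config path mirrors the earlier examples):

```shell
python3 -m autotuner.distributed --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --experiment my_run --samples 20 --jobs 4 --timeout 2 \
    tune
```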
+
+### Input arguments specific to Ray
+| Argument | Description | Default |
+|-------------------------------|-------------------------------------------------------------------------------------------------------|---------|
| `--git_clean` | Clean binaries and build files. **WARNING**: may lose previous data. ||
| `--git_clone` | Force new git clone. **WARNING**: may lose previous data. ||
| `--git_clone_args` | Additional git clone arguments. ||
@@ -166,16 +207,11 @@ GCP Setup Tutorial coming soon.
| `--git_orfs_branch` | OpenROAD-flow-scripts branch to use. ||
| `--git_url` | OpenROAD-flow-scripts repo URL to use. | [ORFS GitHub repo](https://github.com/The-OpenROAD-Project/OpenROAD-flow-scripts) |
| `--build_args` | Additional arguments given to ./build_openroad.sh ||
-| `--samples` | Number of samples for tuning. | 10 |
-| `--jobs` | Max number of concurrent jobs. | # of CPUs / 2 |
-| `--openroad_threads` | Max number of threads usable. | 16 |
| `--server` | The address of the Ray server to connect to. ||
| `--port` | The port of the Ray server to connect to. | 10001 |
-| `--timeout` | Time limit (in hours) for each trial run. | No limit |
-| `-v` or `--verbose` | Verbosity Level. [0: Only ray status, 1: print stderr, 2: print stdout on top of what is in level 0 and 1. ] | 0 |
| | ||
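
For example, to attach a tuning run to an already-running Ray cluster (a sketch; the server address is a placeholder):

```shell
python3 -m autotuner.distributed --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --server 192.0.2.10 --port 10001 \
    tune
```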

-#### Input arguments specific to tune mode
+#### Input arguments specific to Ray tune mode
The following input arguments are applicable for tune mode only.

| Argument | Description | Default |
@@ -190,7 +226,19 @@ The following input arguments are applicable for tune mode only.
| `--resume` | Resume previous run. ||
| | ||

-### GUI
+### Input arguments specific to Vizier
+| Argument | Description | Default |
+|-------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------|---------|
+| `--orfs` | Path to the OpenROAD-flow-scripts repository. ||
+| `--results` | Path where the JSON file with results will be saved. ||
+| `-a` or `--algorithm` | Algorithm for the optimization engine, one of GAUSSIAN_PROCESS_BANDIT, RANDOM_SEARCH, QUASI_RANDOM_SEARCH, GRID_SEARCH, SHUFFLED_GRID_SEARCH, NSGA2. | NSGA2 |
+| `-m` or `--use-metrics` | Metrics to optimize, a list drawn from: worst_slack, clk_period-worst_slack, total_power, core_util, final_util, design_area, core_area, die_area, last_successful_stage. | all available metrics |
+| `-i` or `--iterations` | Max iteration count for the optimization engine. | 2 |
+| `-s` or `--suggestions` | Suggestion count per iteration of the optimization engine. | 5 |
+| `--use-existing-server` | Address of a running Vizier server. ||
+| | ||
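
Putting the Vizier-specific arguments together (a sketch; it assumes `--use-metrics` accepts a space-separated list, and the repository and output paths are placeholders):

```shell
python3 -m autotuner.vizier --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --orfs ~/OpenROAD-flow-scripts --results results.json \
    --algorithm NSGA2 --use-metrics total_power design_area \
    --iterations 4 --suggestions 10
```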
+
+### GUI for optimizations with Ray Tune

Progress is displayed in the terminal where you run AutoTuner, and when all runs are finished, the results are displayed.
You can find the "Best config found" on the screen.
@@ -216,6 +264,7 @@ Assuming the virtual environment is setup at `./tools/AutoTuner/autotuner_env`:
./tools/AutoTuner/setup.sh
python3 ./tools/AutoTuner/test/smoke_test_sweep.py
python3 ./tools/AutoTuner/test/smoke_test_tune.py
+python3 ./tools/AutoTuner/test/smoke_test_vizier.py
python3 ./tools/AutoTuner/test/smoke_test_sample_iteration.py
```
