Commit d62adf3

docs: user: Extend AutoTuner docs with Vizier information
Signed-off-by: Eryk Szpotanski <[email protected]>
1 parent: 6709fcd


docs/user/InstructionsForAutoTuner.md

Lines changed: 75 additions & 29 deletions
@@ -5,25 +5,34 @@ AutoTuner provides a generic interface where users can define parameter configur
 This enables AutoTuner to easily support various tools and flows. AutoTuner also utilizes [METRICS2.1](https://github.com/ieee-ceda-datc/datc-rdf-Metrics4ML) to capture PPA
 of individual search trials. With the abundant features of METRICS2.1, users can explore various reward functions that steer the flow autotuning to different PPA goals.

-AutoTuner provides two main functionalities as follows.
-* Automatic hyperparameter tuning framework for OpenROAD-flow-script (ORFS)
-* Parametric sweeping experiments for ORFS
+AutoTuner provides three main functionalities as follows.
+* [Ray] Automatic hyperparameter tuning framework for OpenROAD-flow-script (ORFS)
+* [Ray] Parametric sweeping experiments for ORFS
+* [Vizier] Multi-objective optimization of ORFS parameters


 AutoTuner contains top-level Python scripts for ORFS, each of which implements a different search algorithm. Currently supported search algorithms are as follows.
-* Random/Grid Search
-* Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
-* Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
-* Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/))
-* Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.org/))
-* Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
+* Ray (Single-objective optimization)
+  * Random/Grid Search
+  * Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
+  * Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
+  * Bayesian + Multi-Armed Bandit ([AxSearch](https://ax.dev/docs/bayesopt.html))
+  * Tree Parzen Estimator + Covariance Matrix Adaptation Evolution Strategy ([Optuna](https://optuna.readthedocs.io/en/stable/reference/samplers/generated/optuna.samplers.TPESampler.html))
+  * Evolutionary Algorithm ([Nevergrad](https://github.com/facebookresearch/nevergrad))
+* Vizier (Multi-objective optimization)
+  * Random/Grid/Shuffled Search
+  * Quasi Random Search ([quasi-random](https://developers.google.com/machine-learning/guides/deep-learning-tuning-playbook/quasi-random-search))
+  * Gaussian Process Bandit ([GP-Bandit](https://acsweb.ucsd.edu/~shshekha/GPBandits.html))
+  * Non-dominated Sorting Genetic Algorithm II ([NSGA-II](https://ieeexplore.ieee.org/document/996017))

-User-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) of three objectives to set the direction of tuning are written in the script. Each coefficient is expressed as a global variable at the `get_ppa` function in `PPAImprov` class in the script (`coeff_perform`, `coeff_power`, `coeff_area`). Efforts to optimize each of the objectives are proportional to the specified coefficients.
+For Ray algorithms, the optimized function can be adjusted with user-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) for the three objectives to set the direction of tuning. They are defined in the [distributed.py script](../../tools/AutoTuner/src/autotuner/distributed.py) in the `get_ppa` method of the `PPAImprov` class. Efforts to optimize each of the objectives are proportional to the specified coefficients.
+
+Using Vizier algorithms, users can choose which metrics should be optimized with the `--use-metrics` argument.
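As a rough illustration of the coefficient weighting (a minimal sketch, not the actual `distributed.py` code; the weights and improvement numbers below are hypothetical):

```python
# Hypothetical sketch of a coefficient-weighted PPA objective; the real
# logic lives in PPAImprov.get_ppa in distributed.py.
coeff_perform, coeff_power, coeff_area = 10000, 100, 100  # assumed weights

def weighted_ppa(perform: float, power: float, area: float) -> float:
    """Blend per-objective improvement scores; a larger coefficient makes
    the tuner spend proportionally more effort on that objective."""
    total = coeff_perform + coeff_power + coeff_area
    return (coeff_perform * perform + coeff_power * power + coeff_area * area) / total

# Example: 2% performance, 5% power, and 1% area improvement.
print(weighted_ppa(0.02, 0.05, 0.01))
```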


 ## Setting up AutoTuner

-We have provided two convenience scripts, `./install.sh` and `./setup.sh`
+We have provided two convenience scripts, `./installer.sh` and `./setup.sh`
 that work in Python 3.8 for installation and configuration of AutoTuner,
 as shown below:

@@ -32,8 +41,10 @@ Make sure you run the following commands in `./tools/AutoTuner/src/autotuner`.
 ```

 ```shell
-# Install prerequisites
-./tools/AutoTuner/install.sh
+# Install prerequisites for both Ray Tune and Vizier
+./tools/AutoTuner/installer.sh
+# Or install prerequisites for `ray` or `vizier`
+./tools/AutoTuner/installer.sh vizier

 # Start virtual environment
 ./tools/AutoTuner/setup.sh
@@ -54,7 +65,8 @@ Alternatively, here is a minimal example to get started:
         1.0,
         3.7439
       ],
-      "step": 0
+      "step": 0,
+      "scale": "log"
     },
     "CORE_MARGIN": {
       "type": "int",
@@ -71,6 +83,7 @@ Alternatively, here is a minimal example to get started:
 * `"type"`: Parameter type ("float" or "int") for sweeping/tuning
 * `"minmax"`: Min-to-max range for sweeping/tuning. The unit follows the default value of each technology std cell library.
 * `"step"`: Parameter step within the minmax range. Step 0 for type "float" means continuous step for sweeping/tuning. Step 0 for type "int" means the constant parameter.
+* `"scale"`: Vizier-specific parameter that sets the [scaling type](https://oss-vizier.readthedocs.io/en/latest/guides/user/search_spaces.html#scaling); allowed values: `linear`, `log` and `rlog`. A complete entry combining these fields is sketched below.

 ## Tunable / sweepable parameters

@@ -104,7 +117,7 @@ For Global Routing parameters that are set on `fastroute.tcl` you can use:

 ### General Information

-The `distributed.py` script uses Ray's job scheduling and management to
+The `autotuner.distributed` module uses Ray's job scheduling and management to
 fully utilize available hardware resources from a single server
 configuration, on-premises or over the cloud with multiple CPUs.
 The two modes of operation: `sweep`, where every possible parameter
@@ -114,51 +127,76 @@ hyperparameters using one of the algorithms listed above. The `sweep`
 mode is useful when we want to isolate or test a single or very few
 parameters. On the other hand, `tune` is more suitable for finding
 the best combination of a complex and large number of flow
-parameters. Both modes rely on user-specified search space that is
-defined by a `.json` file, they use the same syntax and format,
-though some features may not be available for sweeping.
+parameters.

 ```{note}
 The order of the parameters matters. Arguments `--design`, `--platform` and
 `--config` are always required and should precede <mode>.
 ```

+The `autotuner.vizier` module integrates the OpenROAD flow with the Vizier optimizer.
+It is used for multi-objective optimization, with additional features that improve the chance of finding valid parameters.
+Moreover, various algorithms are available for tuning parameters.
+
+Each mode relies on a user-specified search space that is
+defined by a `.json` file; they use the same syntax and format,
+though some features may not be available for sweeping.
+
 #### Tune only

-* AutoTuner: `python3 distributed.py tune -h`
+* Ray-based AutoTuner: `python3 -m autotuner.distributed tune -h`

 Example:

 ```shell
-python3 distributed.py --design gcd --platform sky130hd \
+python3 -m autotuner.distributed --design gcd --platform sky130hd \
     --config ../../../../flow/designs/sky130hd/gcd/autotuner.json \
     tune --samples 5
 ```
 #### Sweep only

-* Parameter sweeping: `python3 distributed.py sweep -h`
+* Parameter sweeping: `python3 -m autotuner.distributed sweep -h`

 Example:

 ```shell
-python3 distributed.py --design gcd --platform sky130hd \
+python3 -m autotuner.distributed --design gcd --platform sky130hd \
     --config distributed-sweep-example.json \
     sweep
 ```

+#### Multi-objective optimization
+
+* Vizier-based AutoTuner: `python3 -m autotuner.vizier -h`
+
+Example:
+
+```shell
+python3 -m autotuner.vizier --design gcd --platform sky130hd \
+    --config ../../flow/designs/sky130hd/gcd/autotuner.json
+```

 ### Google Cloud Platform (GCP) distribution with Ray

 GCP Setup Tutorial coming soon.


-### List of input arguments
+### List of common input arguments
 | Argument | Description |
 |-------------------------------|-------------------------------------------------------------------------------------------------------|
 | `--design` | Name of the design for Autotuning. |
 | `--platform` | Name of the platform for Autotuning. |
 | `--config` | Configuration file that sets which knobs to use for Autotuning. |
 | `--experiment` | Experiment name. This parameter is used to prefix the FLOW_VARIANT and to set the Ray log destination. |
+| `--algorithm` | Search algorithm to use for Autotuning. |
+| `--openroad_threads` | Max number of threads usable. |
+| `--to-stage` | The last stage to be built during optimization. |
+| `-v` or `--verbose` | Verbosity level. [0: only Ray status, 1: print stderr, 2: print stdout on top of levels 0 and 1.] |
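Put together, an invocation using only the common arguments might look like the sketch below; the experiment name and the `hyperopt` algorithm value are assumptions, not verified flag values:

```shell
# Hypothetical example of the common arguments; adjust paths to your checkout.
# `hyperopt` is an assumed --algorithm value.
python3 -m autotuner.distributed --design gcd --platform sky130hd \
    --config ../../../../flow/designs/sky130hd/gcd/autotuner.json \
    --experiment my-experiment --algorithm hyperopt \
    tune --samples 10
```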
+
+### List of Ray-specific input arguments
+| Argument | Description |
+|-------------------------------|-------------------------------------------------------------------------------------------------------|
+| `--eval` | Evaluate function to use with search algorithm. |
 | `--resume` | Resume previous run. |
 | `--git_clean` | Clean binaries and build files. **WARNING**: may lose previous data. |
 | `--git_clone` | Force new git clone. **WARNING**: may lose previous data. |
@@ -168,21 +206,29 @@ GCP Setup Tutorial coming soon.
 | `--git_orfs_branch` | OpenROAD-flow-scripts branch to use. |
 | `--git_url` | OpenROAD-flow-scripts repo URL to use. |
 | `--build_args` | Additional arguments given to ./build_openroad.sh |
-| `--algorithm` | Search algorithm to use for Autotuning. |
-| `--eval` | Evalaute function to use with search algorithm. \ |
 | `--samples` | Number of samples for tuning. |
 | `--iterations` | Number of iterations for tuning. |
 | `--resources_per_trial` | Number of CPUs to request for each tuning job. |
 | `--reference` | Reference file for use with PPAImprov. |
 | `--perturbation` | Perturbation interval for PopulationBasedTraining |
 | `--seed` | Random seed. |
 | `--jobs` | Max number of concurrent jobs. |
-| `--openroad_threads` | Max number of threads usable. |
 | `--server` | The address of Ray server to connect. |
 | `--port` | The port of Ray server to connect. |
-| `-v` or `--verbose` | Verbosity Level. [0: Only ray status, 1: print stderr, 2: print stdout on top of what is in level 0 and 1. ] |
-| | |
-### GUI
+
+### List of Vizier-specific input arguments
+| Argument | Description |
+|-------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------|
+| `--orfs` | Path to the OpenROAD-flow-scripts repository |
+| `--results` | Path where the JSON file with results will be saved |
+| `-a` or `--algorithm` | Algorithm for the optimization engine, one of GAUSSIAN_PROCESS_BANDIT, RANDOM_SEARCH, QUASI_RANDOM_SEARCH, GRID_SEARCH, SHUFFLED_GRID_SEARCH, NSGA2 |
+| `-m` or `--use-metrics` | Metrics to optimize, a list of worst_slack, clk_period-worst_slack, total_power, core_util, final_util, design_area, core_area, die_area, last_successful_stage |
+| `-i` or `--iterations` | Max iteration count for the optimization engine |
+| `-s` or `--suggestions` | Suggestion count per iteration of the optimization engine |
+| `-w` or `--workers` | Number of parallel workers |
+| `--use-existing-server` | Address of the running Vizier server |
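Combining the flags above, a multi-objective run could look like the following sketch; the space-separated metric list format and the worker count are assumptions:

```shell
# Hypothetical Vizier invocation; flag values are illustrative only.
python3 -m autotuner.vizier --design gcd --platform sky130hd \
    --config ../../flow/designs/sky130hd/gcd/autotuner.json \
    --algorithm NSGA2 --use-metrics worst_slack total_power \
    --iterations 10 --suggestions 5 --workers 2
```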
+
+### GUI for optimizations with Ray Tune

 Basically, progress is displayed at the terminal where you run, and when all runs are finished, the results are displayed.
 You could find the "Best config found" on the screen.
