docs/user/InstructionsForAutoTuner.md
AutoTuner provides a generic interface where users can define parameter configurations as JSON objects. This enables AutoTuner to easily support various tools and flows. AutoTuner also utilizes [METRICS2.1](https://github.com/ieee-ceda-datc/datc-rdf-Metrics4ML) to capture the PPA of individual search trials. With the abundant features of METRICS2.1, users can explore various reward functions that steer the flow autotuning toward different PPA goals.

AutoTuner provides three main functionalities as follows:

* [Ray] Automatic hyperparameter tuning framework for OpenROAD-flow-scripts (ORFS)
* [Ray] Parametric sweeping experiments for ORFS
* [Vizier] Multi-objective optimization of ORFS parameters

AutoTuner contains top-level Python scripts for ORFS, each of which implements a different search algorithm. The currently supported search algorithms are as follows:

* [Ray] Random/Grid Search
* [Ray] Population Based Training ([PBT](https://www.deepmind.com/blog/population-based-training-of-neural-networks))
* [Ray] Tree Parzen Estimator ([HyperOpt](https://hyperopt.github.io/hyperopt))
* [Vizier] Quasi Random Search ([quasi-random](https://developers.google.com/machine-learning/guides/deep-learning-tuning-playbook/quasi-random-search))
* [Vizier] Gaussian Process Bandit ([GP-Bandit](https://acsweb.ucsd.edu/~shshekha/GPBandits.html))
* [Vizier] Non-dominated Sorting Genetic Algorithm II ([NSGA-II](https://ieeexplore.ieee.org/document/996017))

For the Ray algorithms, the optimized function can be adjusted with user-defined coefficient values (`coeff_perform`, `coeff_power`, `coeff_area`) for the three objectives to set the direction of tuning. They are defined in the [distributed.py script](../../tools/AutoTuner/src/autotuner/distributed.py), in the `get_ppa` method of the `PPAImprov` class. Efforts to optimize each of the objectives are proportional to the specified coefficients.

For the Vizier algorithms, users can instead choose which metrics should be optimized with the `--use-metrics` argument.
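
To make the weighting concrete, here is a minimal Python sketch of how coefficient-weighted scoring can combine the three objectives. The weights and formula below are illustrative assumptions, not the actual `PPAImprov.get_ppa` implementation:

```python
# Illustrative sketch only: hypothetical weights and a simplified formula,
# not the actual PPAImprov.get_ppa() code in distributed.py.
COEFF_PERFORM, COEFF_POWER, COEFF_AREA = 10000, 100, 100


def ppa_score(perform: float, power: float, area: float) -> float:
    """Combine per-objective improvements (e.g. percent gain over a reference
    run) into a single reward; each objective's pull on the search is
    proportional to its coefficient."""
    total = COEFF_PERFORM + COEFF_POWER + COEFF_AREA
    return (COEFF_PERFORM * perform + COEFF_POWER * power + COEFF_AREA * area) / total


# A trial that improves performance by 2%, regresses power by 0.5%,
# and improves area by 1%:
print(ppa_score(perform=2.0, power=-0.5, area=1.0))
```

Raising `COEFF_POWER` relative to the other weights would steer the search toward power savings at the expense of the other objectives.
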
## Setting up AutoTuner
We provide helper scripts that work in Python 3.8 for the installation and configuration of AutoTuner, as shown below:

```shell
# Install prerequisites for both Ray Tune and Vizier
./tools/AutoTuner/installer.sh
# Or install prerequisites for only `ray` or only `vizier`, e.g.:
./tools/AutoTuner/installer.sh vizier

# Start virtual environment
./tools/AutoTuner/setup.sh
```

Alternatively, here is a minimal example to get started:

```json
{
  "_SDC_CLK_PERIOD": {
    "type": "float",
    "minmax": [
      1.0,
      3.7439
    ],
    "step": 0,
    "scale": "log"
  },
  "CORE_MARGIN": {
    "type": "int",
    "minmax": [
      2,
      2
    ],
    "step": 0
  }
}
```

The keys of each parameter entry are interpreted as follows:

* `"type"`: Parameter type ("float" or "int") for sweeping/tuning.
* `"minmax"`: Min-to-max range for sweeping/tuning. The unit follows the default value of each technology std cell library.
* `"step"`: Parameter step within the minmax range. Step 0 for type "float" means a continuous step for sweeping/tuning; step 0 for type "int" means a constant parameter. See the example after this list.
The Vizier-based flow accepts the following command-line arguments:

| Argument | Description |
|---|---|
| `--orfs` | Path to the OpenROAD-flow-scripts repository |
| `--results` | Path where the JSON file with results will be saved |
| `-a` or `--algorithm` | Algorithm for the optimization engine, one of GAUSSIAN_PROCESS_BANDIT, RANDOM_SEARCH, QUASI_RANDOM_SEARCH, GRID_SEARCH, SHUFFLED_GRID_SEARCH, NSGA2 |
| `-m` or `--use-metrics` | Metrics to optimize, a list of worst_slack, clk_period-worst_slack, total_power, core_util, final_util, design_area, core_area, die_area, last_successful_stage |
| `-i` or `--iterations` | Max iteration count for the optimization engine |
| `-s` or `--suggestions` | Suggestion count per iteration of the optimization engine |
| `-w` or `--workers` | Number of parallel workers |
| `--use-existing-server` | Address of a running Vizier server |

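For example, a multi-objective Vizier run could be launched as sketched below; the entry-point path and the list syntax for `--use-metrics` are assumptions here, so verify them against your checkout:

```shell
# Hypothetical invocation; confirm the script location in your checkout.
python3 tools/AutoTuner/src/autotuner/vizier.py \
  --orfs ../OpenROAD-flow-scripts \
  --results vizier_results.json \
  --algorithm NSGA2 \
  --use-metrics total_power design_area \
  --iterations 10 \
  --suggestions 5 \
  --workers 2
```
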
### GUI for optimizations with Ray Tune

Progress is displayed in the terminal where you launched the run, and the results are printed once all runs have finished. Look for the "Best config found" line on the screen.

Assuming the virtual environment is set up at `./tools/AutoTuner/autotuner_env`:
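
For instance, such an environment is typically activated like this (a standard `venv` activation, given here as an assumption about the intended command):

```shell
source ./tools/AutoTuner/autotuner_env/bin/activate
```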