# MDM DC Verification – Quick Guide

This script runs the flow:

**MDM aggregation → DC simulation → Envelope checks**
using the YAML configs for each device.

## 1. Environment Setup

### Requirements

- **Python 3.8+**
- **ngspice** (required for simulations)
- **openvaf** in your `PATH` (needed to compile Verilog-A models)
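
As a quick sanity check before the first run, you can confirm that all three tools are visible on `PATH`. This is a minimal sketch, not part of the repo's tooling:

```bash
# Report which of the required tools are available on PATH
for tool in python3 ngspice openvaf; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "ok: $tool"
  else
    echo "missing: $tool"
  fi
done
```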

### Build OSDI Files (once per repo checkout)

```bash
cd <repo path>/ihp-sg13g2/libs.tech/verilog-a
chmod +x openvaf-compile-va.sh
./openvaf-compile-va.sh
```

### Create Virtual Environment & Install Dependencies

```bash
# Create and activate venv
python3 -m venv .venv
source .venv/bin/activate

# Install required packages
pip install pandas pyyaml jinja2 pytest
```
|

---

## 2. Running Tests

> Always run from the **devices** folder so relative paths in configs resolve correctly.

```bash
cd ihp-sg13g2/libs.tech/ngspice/testing/devices
```

### Run a Single Device

* **NMOS** (`sg13_lv_nmos`)

  ```bash
  python3 -m models_verifier.models_verifier -c mos/nmos_lv/sg13_lv_nmos.yaml
  ```

* **PMOS** (`sg13_lv_pmos`)

  ```bash
  python3 -m models_verifier.models_verifier -c mos/pmos_lv/sg13_lv_pmos.yaml
  ```

(Other devices: `nmos_hv`, `pmos_hv`, `pnp_mpa`, `npn13g2`, `npn13g2l`, `npn13g2v`)

## 3. Outputs

When a run finishes, you will see:

### 3.1 Per-setup merged CSVs (for debugging/inspection)

* Location: `<output_dir>/sim_merged/`
* One CSV per discovered sweep/setup
  (filename derived from `master_setup_type`)

### 3.2 Reports (per `output_dir` in YAML)

* **Full summary:** `<output_dir>/full_report.csv`
  One row per `(block_id, metric, target)` with counts and pass/fail.

* **Roll-up summary:** `<output_dir>/summary.csv`
  Aggregated by `(metric, target)` with overall out-of-bounds percentages.

* **Detailed failures:** `<output_dir>/detailed_failures.csv`
  Only written if failures exist; one row per failing point with value, bounds, and context.

Additionally, the script prints a **summary block to the terminal**, including:

* Total cases
* Per-target pass/fail counts
* Number of failing points

## 4. Exit Status Codes

* **0** → All selected targets passed their thresholds
* **1** → One or more groups failed (reports still written)
* **Other non-zero** → Early termination before reporting
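
Because the exit status distinguishes pass from fail, the verifier can gate a larger script. A sketch, reusing the NMOS config path from above:

```bash
# Branch on the verifier's exit status; 0 means every selected target passed
if python3 -m models_verifier.models_verifier -c mos/nmos_lv/sg13_lv_nmos.yaml; then
  echo "all targets passed"
else
  rc=$?
  echo "verification failed (exit $rc); check detailed_failures.csv in the output_dir"
fi
```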

---

## 5. Interpreting Pytest Failures

When running under pytest, a device test fails if any checked group exceeds the configured out-of-range threshold. In that case, pytest raises an **AssertionError** and prints output like:

```
check failed:
[id/meas] (FAIL file=/path/to/meas.mdm, block_index=42) n=120 out_of_bounds=7 (5.83%)
[ib/tt] (FAIL file=/path/to/meas.mdm, block_index=7) n=95 out_of_bounds=6 (6.32%)

STATUS: 2/24 groups FAILED (8.33%); pass rate = 91.67%
 - id/meas: 1 fails
 - ib/tt: 1 fails
```

### Meaning of Each Part

* `[metric/target]` — Which metric and target failed (`meas` = measured, `tt` = typical corner).
* `file=...` — The source MDM file that produced the failing group (if available).
* `block_index=...` — The block index within that file (if available).
* `n=` — Total points evaluated in that group.
* `out_of_bounds=` — Number of points outside the allowed envelope.
* `(%)` — Percent of points outside the envelope for that group.
* `STATUS:` — Summary across all groups: how many failed vs. total, plus pass rate.
* Per-metric lines (`- <metric>/<target>: <count> fails`) — Quick breakdown by metric/target.
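
The `(%)` figure is simply `out_of_bounds / n`. For the first line of the example output (7 of 120 points):

```bash
# 100 * 7 / 120 rounds to the 5.83% shown in the example
awk 'BEGIN { printf "%.2f%%\n", 100 * 7 / 120 }'
# prints 5.83%
```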

---

## 6. Running Locally (CI-style)

### Run a Single Device with Pytest

```bash
cd ihp-sg13g2/libs.tech/ngspice/testing/devices
python3 -m pytest --tb=short -p no:capture \
  'tests/test_devices.py::test_devices[nmos_lv]'
```

Replace `nmos_lv` with any of:
`pmos_lv`, `nmos_hv`, `pmos_hv`, `pnp_mpa`, `npn13g2`, `npn13g2l`, `npn13g2v`

### Run All Devices

```bash
cd ihp-sg13g2/libs.tech/ngspice/testing/devices
python3 -m pytest --tb=short -p no:capture tests/test_devices.py
```
|