
Commit 7606207

Add note on accessing last and best checkpoint (#21241)

1 parent 9d40bc8 commit 7606207

File tree

5 files changed: +16 −10 lines changed

.github/workflows/probot-check-group.yml
Lines changed: 2 additions & 2 deletions

@@ -12,14 +12,14 @@ jobs:
   required-jobs:
     runs-on: ubuntu-latest
     if: github.event.pull_request.draft == false
-    timeout-minutes: 61 # in case something is wrong with the internal timeout
+    timeout-minutes: 71 # in case something is wrong with the internal timeout
     steps:
       - uses: Lightning-AI/[email protected]
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
         with:
           job: check-group
           interval: 180 # seconds
-          timeout: 60 # minutes
+          timeout: 70 # minutes
           maintainers: "Lightning-AI/lai-frameworks"
           owner: "carmocca"

.lightning/workflows/fabric.yml
Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ trigger:
   pull_request:
     branches: ["master", "release/stable"]

-timeout: "55" # minutes
+timeout: "60" # minutes
 parametrize:
   matrix: {}
   include:

.lightning/workflows/pytorch.yml
Lines changed: 1 addition & 1 deletion

@@ -4,7 +4,7 @@ trigger:
   pull_request:
     branches: ["master", "release/stable"]

-timeout: "55" # minutes
+timeout: "60" # minutes
 parametrize:
   matrix: {}
   include:

docs/source-pytorch/common/checkpointing_intermediate.rst
Lines changed: 7 additions & 1 deletion

@@ -21,7 +21,13 @@ For fine-grained control over checkpointing behavior, use the :class:`~lightning
     checkpoint_callback = ModelCheckpoint(dirpath="my/path/", save_top_k=2, monitor="val_loss")
     trainer = Trainer(callbacks=[checkpoint_callback])
     trainer.fit(model)
-    checkpoint_callback.best_model_path
+
+    # Access best and last model checkpoint directly from the callback
+    print(checkpoint_callback.best_model_path)
+    print(checkpoint_callback.last_model_path)
+    # Or via the trainer
+    print(trainer.checkpoint_callback.best_model_path)
+    print(trainer.checkpoint_callback.last_model_path)

 Any value that has been logged via *self.log* in the LightningModule can be monitored.
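The docs change above shows reading `best_model_path` and `last_model_path` off the callback after `trainer.fit()`. As a rough illustration of the bookkeeping behind those two attributes, here is a minimal stand-in sketch — this is NOT the real Lightning `ModelCheckpoint` implementation, just a hypothetical tracker showing how "last" always follows the newest checkpoint while "best" only moves when the monitored value improves (assuming `monitor="val_loss"` in "min" mode):

```python
# Hypothetical stand-in for illustration only; the real logic lives in
# lightning.pytorch.callbacks.ModelCheckpoint.
import os


class TinyCheckpointTracker:
    """Tracks the best (lowest val_loss) and most recent checkpoint paths."""

    def __init__(self, dirpath):
        self.dirpath = dirpath
        self.best_model_path = ""
        self.last_model_path = ""
        self.best_score = float("inf")

    def on_validation_end(self, epoch, val_loss):
        path = os.path.join(self.dirpath, f"epoch={epoch}-val_loss={val_loss:.2f}.ckpt")
        self.last_model_path = path  # "last" always points at the newest checkpoint
        if val_loss < self.best_score:  # "min" mode: lower monitored value is better
            self.best_score = val_loss
            self.best_model_path = path


tracker = TinyCheckpointTracker("my/path")
for epoch, loss in enumerate([0.9, 0.4, 0.6]):
    tracker.on_validation_end(epoch, loss)

print(tracker.best_model_path)  # the epoch with the lowest val_loss
print(tracker.last_model_path)  # the checkpoint written most recently
```

After the loop, the two attributes diverge: the best path stays on epoch 1 (val_loss 0.40) while the last path points at epoch 2, which mirrors why the docs snippet prints both.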

src/lightning/pytorch/callbacks/model_checkpoint.py
Lines changed: 5 additions & 5 deletions

@@ -204,11 +204,11 @@ class ModelCheckpoint(Checkpoint):
         ... )

         # retrieve the best checkpoint after training
-        checkpoint_callback = ModelCheckpoint(dirpath='my/path/')
-        trainer = Trainer(callbacks=[checkpoint_callback])
-        model = ...
-        trainer.fit(model)
-        checkpoint_callback.best_model_path
+        >>> checkpoint_callback = ModelCheckpoint(dirpath='my/path/')
+        >>> trainer = Trainer(callbacks=[checkpoint_callback])
+        >>> model = ...  # doctest: +SKIP
+        >>> trainer.fit(model)  # doctest: +SKIP
+        >>> print(checkpoint_callback.best_model_path)  # doctest: +SKIP

     .. tip:: Saving and restoring multiple checkpoint callbacks at the same time is supported under variation in the
         following arguments:
