Finetuning #134

Status: Open. Wants to merge 26 commits into base `main` from `finetuning`.
- 799d5d9  finetuning (DavideTisi, May 14, 2025)
- 2874310  extract energy (DavideTisi, May 14, 2025)
- 07819c4  I hate the linter so much (DavideTisi, May 14, 2025)
- 08bec01  bug (DavideTisi, May 14, 2025)
- 43d49ef  typo (DavideTisi, May 14, 2025)
- b5bc1ff  rename trainingset (DavideTisi, May 14, 2025)
- cfcaa6b  naming fix (DavideTisi, May 14, 2025)
- f6d3d72  Merge branch 'main' into finetuning (ceriottm, Jun 11, 2025)
- aea219a  Renaming & linking (ceriottm, Jun 13, 2025)
- 42716e8  Fine-tuning example (ceriottm, Jun 15, 2025)
- a7d3a21  linter (DavideTisi, Jun 17, 2025)
- b702962  correct GitHub handle (DavideTisi, Jun 17, 2025)
- a0bc50c  Merge branch 'main' into finetuning (sofiia-chorna, Jul 1, 2025)
- b64d9d5  Make CI not fail (sofiia-chorna, Jul 1, 2025)
- 9e97074  Improvement, clean up (sofiia-chorna, Jul 1, 2025)
- 369d296  Add minimal examples of LoRA and direct force fine-tuning (sofiia-chorna, Jul 2, 2025)
- 3dc1f68  Fix spellings (sofiia-chorna, Jul 2, 2025)
- e8cf322  Add eval, chemiscope with full ft result (sofiia-chorna, Jul 2, 2025)
- f6e9ffd  Add example of NC learning with C force fine-tuning (sofiia-chorna, Jul 3, 2025)
- 7925674  Reduce number of epochs, dataset size, show results in img (sofiia-chorna, Jul 4, 2025)
- 7e144a4  Reduce n_epochs more (sofiia-chorna, Jul 4, 2025)
- 82f9c1d  Add forces to full ft chemiscope (sofiia-chorna, Jul 10, 2025)
- 61d8765  Add learning curve plot for ex1 (sofiia-chorna, Jul 10, 2025)
- 65b73a9  Add losses in ex1 (sofiia-chorna, Jul 11, 2025)
- ccb3861  Show improvement on the background (sofiia-chorna, Jul 11, 2025)
- c5e8c00  Small text improvements (sofiia-chorna, Jul 11, 2025)
1 change: 1 addition & 0 deletions docs/src/software/metatensor.sec
@@ -14,6 +14,7 @@ training and evaluating ML models.
 - examples/learn-tensors-with-mcov/learn-tensors-with-mcov
 - examples/pet-mad/pet-mad
 - examples/pet-mad-nc/pet-mad-nc
+- examples/pet-finetuning/pet-ft
 - examples/flashmd/flashmd-demo
 - examples/shiftml/shiftml-example
 - examples/hamiltonian-qm7/hamiltonian-qm7
1 change: 1 addition & 0 deletions docs/src/topics/ml-models.sec
@@ -13,6 +13,7 @@ data.
 - examples/learn-tensors-with-mcov/learn-tensors-with-mcov
 - examples/pet-mad/pet-mad
 - examples/pet-mad-nc/pet-mad-nc
+- examples/pet-finetuning/pet-ft
 - examples/flashmd/flashmd-demo
 - examples/shiftml/shiftml-example
 - examples/hamiltonian-qm7/hamiltonian-qm7
14 changes: 14 additions & 0 deletions examples/pet-finetuning/.gitignore
@@ -0,0 +1,14 @@
*.data
*.pt
*.ckpt
log.lammps
trajectory.xyz
tmp*

extensions/*

outputs/
data/ethanol_train.xyz
data/ethanol_val.xyz
data/ethanol_test.xyz
output.xyz
4 changes: 4 additions & 0 deletions examples/pet-finetuning/README.rst
@@ -0,0 +1,4 @@
PET-MAD finetuning tutorial
===========================

An example of finetuning the PET-MAD universal machine-learning potential.
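The `.gitignore` above lists `data/ethanol_train.xyz`, `data/ethanol_val.xyz`, and `data/ethanol_test.xyz` as generated files, split out of `data/ethanol.xyz`. A minimal, self-contained sketch of such a split (the helper name, ratios, and seed are illustrative assumptions, not part of this PR; with ASE one would pass the frames returned by `ase.io.read("data/ethanol.xyz", ":")`):

```python
import random

def split_frames(frames, train_frac=0.8, val_frac=0.1, seed=42):
    """Shuffle a list of structures and split it into train/val/test subsets."""
    rng = random.Random(seed)
    indices = list(range(len(frames)))
    rng.shuffle(indices)
    n_train = int(train_frac * len(frames))
    n_val = int(val_frac * len(frames))
    train = [frames[i] for i in indices[:n_train]]
    val = [frames[i] for i in indices[n_train:n_train + n_val]]
    test = [frames[i] for i in indices[n_train + n_val:]]
    return train, val, test

# Stand-in "frames" so the sketch runs on its own; real usage would
# read frames with ase.io.read and write each split with ase.io.write.
train, val, test = split_frames(list(range(100)))
print(len(train), len(val), len(test))  # 80 10 10
```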
38 changes: 38 additions & 0 deletions examples/pet-finetuning/basic_ft_options.yaml
@@ -0,0 +1,38 @@
seed: 42

architecture:
  name: "pet"
  training:
    num_epochs: 10  # very short run, for demonstration only
    num_epochs_warmup: 1
    learning_rate: 1e-5  # small learning rate to stabilize training
    finetune:
      method: "full"  # full fine-tuning: all weights are updated
      read_from: pet-mad-latest.ckpt  # pretrained checkpoint to start from

training_set:
  systems:
    read_from: "data/ethanol_train.xyz"  # path to the fine-tuning dataset
    length_unit: angstrom
  targets:
    energy:
      key: "energy-corrected"  # name of the target value
      unit: "eV"

validation_set:
  systems:
    read_from: "data/ethanol_val.xyz"
    length_unit: angstrom
  targets:
    energy:
      key: "energy-corrected"
      unit: "eV"

test_set:
  systems:
    read_from: "data/ethanol_test.xyz"
    length_unit: angstrom
  targets:
    energy:
      key: "energy-corrected"
      unit: "eV"
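With these options in place, training is launched through metatrain's command line. A sketch of the invocation, assuming the `mtt` entry point and an output filename of our choosing (exact flags may vary between metatrain releases):

```shell
# Fine-tune the pretrained PET-MAD checkpoint on the ethanol dataset
mtt train basic_ft_options.yaml -o model_ft.pt
```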
57 changes: 57 additions & 0 deletions examples/pet-finetuning/c_ft_options.yaml
@@ -0,0 +1,57 @@
seed: 42

architecture:
  name: pet
  training:
    batch_size: 16
    num_epochs: 5
    num_epochs_warmup: 1
    learning_rate: 1e-5

training_set:
  systems:
    read_from: data/ethanol_train.xyz
    length_unit: angstrom
  targets:
    energy:
      key: energy
      unit: eV
      forces: on
    non_conservative_forces:
      key: forces
      type:
        cartesian:
          rank: 1
      per_atom: true

validation_set:
  systems:
    read_from: data/ethanol_val.xyz
    length_unit: angstrom
  targets:
    energy:
      key: energy
      unit: eV
      forces: on
    non_conservative_forces:
      key: forces
      type:
        cartesian:
          rank: 1
      per_atom: true

test_set:
  systems:
    read_from: data/ethanol_test.xyz
    length_unit: angstrom
  targets:
    energy:
      key: energy
      unit: eV
      forces: on
    non_conservative_forces:
      key: forces
      type:
        cartesian:
          rank: 1
      per_atom: true
Binary file added examples/pet-finetuning/c_ft_res.png
1,100 changes: 1,100 additions & 0 deletions examples/pet-finetuning/data/ethanol.xyz


11 changes: 11 additions & 0 deletions examples/pet-finetuning/environment.yml
@@ -0,0 +1,11 @@
channels:
  - metatensor
  - conda-forge
dependencies:
  - python=3.12
  - pip
  - pip:
      - ase>=3.23
      - metatrain[pet]>=2025.8,<2026
      - matplotlib
      - scikit-learn
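The environment file above can be instantiated with conda in the usual way (the environment name is our choice here, not fixed by the PR):

```shell
# Create and activate the example's environment
conda env create -f environment.yml -n pet-finetuning
conda activate pet-finetuning
```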
6 changes: 6 additions & 0 deletions examples/pet-finetuning/eval_ex1.yaml
@@ -0,0 +1,6 @@
systems:
  read_from: data/ethanol_test.xyz
targets:
  energy:
    key: energy-corrected
    unit: eV
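Evaluation on the held-out test set then goes through `mtt eval`, pointing it at an exported model and this options file. A sketch, assuming the model filename (`output.xyz`, which matches an entry in the `.gitignore` above, would receive the predictions; exact flags may differ between metatrain releases):

```shell
# Evaluate a fine-tuned model on the test structures
mtt eval model_ft.pt eval_ex1.yaml -o output.xyz
```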
12 changes: 12 additions & 0 deletions examples/pet-finetuning/eval_ex2.yaml
@@ -0,0 +1,12 @@
systems:
  read_from: data/ethanol_test.xyz
targets:
  energy:
    key: energy-corrected
    unit: eV
  non_conservative_forces:
    key: forces
    type:
      cartesian:
        rank: 1
    per_atom: true

Binary file added examples/pet-finetuning/nc_learning_res.png
56 changes: 56 additions & 0 deletions examples/pet-finetuning/nc_train_options.yaml
@@ -0,0 +1,56 @@
seed: 42

architecture:
  name: pet
  training:
    batch_size: 16
    num_epochs: 10
    num_epochs_warmup: 1
    learning_rate: 3e-4

training_set:
  systems:
    read_from: data/ethanol_train.xyz
    length_unit: angstrom
  targets:
    energy:
      unit: eV
      forces: off
    non_conservative_forces:
      key: forces
      type:
        cartesian:
          rank: 1
      per_atom: true

validation_set:
  systems:
    read_from: data/ethanol_val.xyz
    length_unit: angstrom
  targets:
    energy:
      key: energy
      unit: eV
      forces: off
    non_conservative_forces:
      key: forces
      type:
        cartesian:
          rank: 1
      per_atom: true

test_set:
  systems:
    read_from: data/ethanol_test.xyz
    length_unit: angstrom
  targets:
    energy:
      key: energy
      unit: eV
      forces: off
    non_conservative_forces:
      key: forces
      type:
        cartesian:
          rank: 1
      per_atom: true
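To judge how well the direct (non-conservative) force head matches reference forces, the natural metric is a root-mean-square error over all atoms and Cartesian components. A self-contained sketch on synthetic arrays (the real comparison would use the predicted and reference forces from the evaluation output; the helper name and the simulated noise level are illustrative):

```python
import numpy as np

def force_rmse(predicted, reference):
    """RMSE over all atoms and Cartesian components of two (n_atoms, 3) arrays."""
    predicted = np.asarray(predicted, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean((predicted - reference) ** 2)))

rng = np.random.default_rng(0)
ref = rng.normal(size=(9, 3))                     # ethanol: 9 atoms x 3 components
pred = ref + rng.normal(scale=0.1, size=(9, 3))   # simulated small prediction error
print(f"force RMSE: {force_rmse(pred, ref):.3f} eV/Å")
```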