
Commit b1b08d3: Update README.md
Parent: 68f825e

1 file changed: 1 addition & 1 deletion


README.md

Lines changed: 1 addition & 1 deletion
@@ -95,7 +95,7 @@ Our ViT implementations are in `vit`. We provide utility notebooks in the `noteb
 * [`dino-attention-maps-video.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/dino-attention-maps-video.ipynb) shows how to generate attention heatmaps from a video. (Visually,) best results were obtained with DINO.
 * [`dino-attention-maps.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/dino-attention-maps.ipynb) shows how to generate attention maps from individual attention heads from the final transformer block. (Visually,) best results were obtained with DINO.
 * [`load-dino-weights-vitb16.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/load-dino-weights-vitb16.ipynb) shows how to populate the pre-trained DINO parameters into our implementation (only for ViT B-16 but can easily be extended to others).
-* [`load-jax-weights-vitb16.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/load-jax-weights-vitb16.ipynb) shows how to populate the pre-trained ViT parameters into our implementation (only for ViT B-16 but can easily be extended to others). .
+* [`load-jax-weights-vitb16.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/load-jax-weights-vitb16.ipynb) shows how to populate the pre-trained ViT parameters into our implementation (only for ViT B-16 but can easily be extended to others).
 * [`mean-attention-distance-1k.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/mean-attention-distance-1k.ipynb) shows how to plot mean attention distances of different transformer blocks of different ViTs computed over 1000 images.
 * [`single-instance-probing.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/single-instance-probing.ipynb) shows how to compute mean attention distance, attention-rollout map for a single prediction instance.
 * [`visualizing-linear-projections.ipynb`](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/visualizing-linear-projections.ipynb) shows visualizations of the linear projection filters learned by ViTs.
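Several of the notebooks above revolve around mean attention distance. As a rough illustration of the underlying quantity (this is a minimal NumPy sketch, not the repository's implementation; the function name, CLS-stripped input, and default patch size are assumptions), it is the attention-weighted average pixel distance between a query patch and the patches it attends to:

```python
import numpy as np

def mean_attention_distance(attn, patch_size=16):
    """Mean attention distance for a single head.

    `attn` is (num_patches, num_patches) with softmax-normalized rows;
    the CLS token is assumed to have been stripped already. Hypothetical
    helper for illustration only.
    """
    n = attn.shape[0]
    grid = int(np.sqrt(n))
    assert grid * grid == n, "expects a square patch grid"
    # (row, col) coordinates of each patch, scaled to pixels
    coords = np.stack(
        np.meshgrid(np.arange(grid), np.arange(grid), indexing="ij"), axis=-1
    ).reshape(n, 2) * patch_size
    # pairwise Euclidean distances between patch positions
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    # attention-weighted distance per query, averaged over all queries
    return float((attn * dist).sum(axis=-1).mean())
```

A head that attends only to its own patch (identity attention) gets distance 0, while uniform attention yields the average pairwise patch distance, which is the intuition behind using this metric to probe how "local" or "global" each head is.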
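The single-instance probing notebook also mentions attention rollout. A compact sketch of the standard recipe (Abnar & Zuidema, 2020) is below; this is a generic NumPy version under assumed input shapes, not the notebook's code:

```python
import numpy as np

def attention_rollout(attentions):
    """Attention rollout: fuse heads, account for residual connections,
    and compose attention across layers.

    `attentions` is a list of per-layer arrays shaped
    (num_heads, num_tokens, num_tokens), ordered from the first block
    to the last. Shapes and ordering are assumptions for this sketch.
    """
    rollout = None
    for layer_attn in attentions:
        a = layer_attn.mean(axis=0)                      # average over heads
        a = a + np.eye(a.shape[0])                       # add residual connection
        a = a / a.sum(axis=-1, keepdims=True)            # re-normalize rows
        rollout = a if rollout is None else a @ rollout  # compose with earlier layers
    return rollout
```

Each row of the result stays a probability distribution, so a row of the final matrix can be reshaped into the patch grid and overlaid on the input image as a heatmap.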
