
GPU not being used #63

@valentinps


Hi, thank you for making such an interesting model!

I am trying to run the inference script, and my environment matches exactly what is described in the README.
I am using an RTX 3090, and every time I run inference with ViewCrafter_16 the GPU memory fills up (20 GB/24 GB).
However, GPU utilization stays at 0% the whole time. After some debugging I found that the program is still running, but extremely slowly, so it is likely running on the CPU.

I just don't understand why, since the GPU memory does fill up, the selected device is the correct one, and CUDA is available to torch (see the quick check below)...
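
For reference, this is roughly the sanity check I ran. It is only a minimal sketch: `model` stands for whatever object holds the loaded ViewCrafter weights, the name is just illustrative.

```python
import torch

# Confirm that PyTorch can actually see the 3090
print(torch.cuda.is_available())        # True in my case
print(torch.cuda.get_device_name(0))    # "NVIDIA GeForce RTX 3090"

# Check where the model weights actually live.
# If this prints "cpu", the forward pass will silently run on the CPU
# even though something else has already allocated GPU memory.
print(next(model.parameters()).device)

# Check how much memory is currently allocated on the GPU
print(torch.cuda.memory_allocated(0) / 1024**3, "GiB allocated")
```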

I hope someone will be able to help!

