
What is the memory requirement for the GPU? #25

@starbilibili

Description


I am running chart-to-table testing on the ChartQA dataset, but I run out of GPU memory on a 24 GB GPU. I also tried torch.distributed, splitting the model across two GPUs, but memory is still insufficient. I have set batch_size to 1 and input_size to 224. What is the minimum GPU memory required?

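For reference, this is roughly the kind of two-GPU split I attempted (a minimal sketch only; the encoder/decoder modules below are placeholders, not the actual chart-to-table model):

```python
# Sketch of manual model parallelism across two GPUs:
# encoder on cuda:0, decoder on cuda:1, activations moved between devices.
import torch
import torch.nn as nn


class TwoGPUModel(nn.Module):
    def __init__(self):
        super().__init__()
        # Placeholder sub-modules standing in for the real network.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, padding=1), nn.ReLU()
        ).to("cuda:0")
        self.decoder = nn.Sequential(
            nn.Flatten(), nn.Linear(64 * 224 * 224, 10)
        ).to("cuda:1")

    def forward(self, x):
        x = self.encoder(x.to("cuda:0"))   # run first half on GPU 0
        return self.decoder(x.to("cuda:1"))  # move activations to GPU 1


model = TwoGPUModel()
with torch.no_grad():
    # batch_size 1, input_size 224, as in my setup
    out = model(torch.randn(1, 3, 224, 224))
print(out.shape)
```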