Mixed-Precision #896

Open
@srahmatian

Description

Hello,
I am using PyTorch3D to create my dataset for training a neural network. As you know, tensors in PyTorch3D are torch.float32. PyTorch Lightning, on the other hand, offers a mixed-precision option (16-bit and 32-bit) for training models, and this feature can greatly increase training speed.
I was wondering whether it is possible to benefit from that feature when the data for each batch is generated with PyTorch3D during training. Has anyone had the same experience?
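For concreteness, this is roughly the training setup I have in mind. It is only a minimal sketch: the network, data shapes, and Trainer arguments are placeholders, and I am assuming the usual precision=16 flag on the Lightning Trainer:

```python
import torch
import pytorch_lightning as pl


class LitPointModel(pl.LightningModule):
    # Placeholder network: consumes (B, N, 3) float32 point clouds produced by PyTorch3D.
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
        )

    def training_step(self, batch, batch_idx):
        points, target = batch                     # points arrive as torch.float32
        pred = self.net(points).mean(dim=1)        # (B, 1) placeholder prediction
        return torch.nn.functional.mse_loss(pred, target)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# 16-bit mixed precision: as far as I understand, Lightning wraps the forward/backward
# passes in autocast, so float32 batches should be cast automatically where needed.
trainer = pl.Trainer(gpus=1, precision=16, max_epochs=1)
# trainer.fit(LitPointModel(), train_dataloader)  # train_dataloader built on the fly, see the next sketch
```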

The most important point is that I do not want to generate the whole dataset up front and then train the model. I want to create the data on the fly (through a DataLoader used with PyTorch Lightning) while the model is training, roughly as in the sketch below.
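The sketch below is only illustrative: it assumes a toy task where each sample is a point cloud sampled from an ico_sphere mesh with PyTorch3D inside __getitem__, and the label is a placeholder:

```python
import torch
from torch.utils.data import Dataset, DataLoader
from pytorch3d.utils import ico_sphere
from pytorch3d.ops import sample_points_from_meshes


class OnTheFlyPointDataset(Dataset):
    # Each sample is generated with PyTorch3D inside __getitem__,
    # so nothing is precomputed or stored on disk before training.
    def __init__(self, length=1000, num_points=1024):
        self.length = length
        self.num_points = num_points
        self.mesh = ico_sphere(level=2)  # placeholder geometry

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        # sample_points_from_meshes returns float32 points; [0] gives shape (num_points, 3).
        points = sample_points_from_meshes(self.mesh, self.num_points)[0]
        target = torch.zeros(1)          # placeholder label
        return points, target


train_dataloader = DataLoader(OnTheFlyPointDataset(), batch_size=8, num_workers=4)
```

My question is whether this combination works safely with precision=16, or whether I need to cast the float32 outputs of PyTorch3D to half precision myself.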

Metadata

Labels

enhancement (New feature or request), how to (How to use PyTorch3D in my project)
