
ValueError: Input contains infinity or a value too large for dtype('float64'). #163

@xcltql666

Description


When I run the following command for Hypersim normals preprocessing:

python script/dataset_preprocess_normals/hypersim/preprocess_hypersim_normals.py --split_csv /path/to/metadata_images_split_scene_v1.csv --dataset_dir path_to_downloaded_hypersim --output_dir ${BASE_DATA_DIR}/hypersim

I encounter the following error:

Traceback (most recent call last):
  File "/home/chengyou/xcl/Marigold/script/normals/dataset_preprocess/hypersim/preprocess_hypersim_normals.py", line 169, in <module>
    surface_to_cam_world_normalized_1d_ = sklearn.preprocessing.normalize(
  File "/home/chengyou/xcl/Marigold/venv/marigold/lib/python3.10/site-packages/sklearn/utils/_param_validation.py", line 218, in wrapper
    return func(*args, **kwargs)
  File "/home/chengyou/xcl/Marigold/venv/marigold/lib/python3.10/site-packages/sklearn/preprocessing/_data.py", line 1979, in normalize
    X = check_array(
  File "/home/chengyou/xcl/Marigold/venv/marigold/lib/python3.10/site-packages/sklearn/utils/validation.py", line 1105, in check_array
    _assert_all_finite(
  File "/home/chengyou/xcl/Marigold/venv/marigold/lib/python3.10/site-packages/sklearn/utils/validation.py", line 120, in _assert_all_finite
    _assert_all_finite_element_wise(
  File "/home/chengyou/xcl/Marigold/venv/marigold/lib/python3.10/site-packages/sklearn/utils/validation.py", line 169, in _assert_all_finite_element_wise
    raise ValueError(msg_err)
ValueError: Input contains infinity or a value too large for dtype('float64').

This occurs, for example, when processing the files in ai_013_001/cam_00.

How can I resolve this issue? Should I simply skip the affected files?
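For what it's worth, one workaround I tried is to zero out the non-finite entries before normalizing, so the call no longer raises. This is only a sketch with a made-up input array (not the actual Hypersim data), and it reimplements the L2 row normalization with NumPy rather than calling sklearn directly; zero rows are left as zeros, which mirrors how sklearn.preprocessing.normalize treats them:

```python
import numpy as np

# Hypothetical stand-in for the surface-to-camera vectors: one row
# contains an infinity, which is what triggers the ValueError.
vectors = np.array([[0.0, 0.0, 1.0],
                    [np.inf, 0.0, 0.0],
                    [0.5, 0.5, 0.0]])

# Replace non-finite entries (inf, -inf, nan) with zeros.
sanitized = np.where(np.isfinite(vectors), vectors, 0.0)

# L2-normalize each row; guard against division by zero so that
# all-zero rows stay all-zero instead of producing nan.
norms = np.linalg.norm(sanitized, axis=1, keepdims=True)
norms[norms == 0.0] = 1.0
normalized = sanitized / norms
```

Whether zeroing is acceptable depends on how downstream code interprets a zero normal, so skipping the affected frames entirely may still be the safer choice.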
