[Benchmark] Add support for RefCOCO-M benchmark#1524

Open
rshube wants to merge 1 commit into open-compass:main from rshube:benchmark/add_refcoco_m_support

Conversation


@rshube rshube commented Apr 22, 2026

Summary

This PR adds support for the RefCOCO-M benchmark in VLMEvalKit.

Changes

  • Adds a dedicated RefCOCOMDataset class
  • Registers RefCOCO-M in the dataset registry
  • Keeps RefCOCO-M as a separate benchmark rather than treating it as an alias of RefCOCO
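The registration step above can be sketched with a minimal decorator-based registry. This is an illustrative sketch only: `DATASET_REGISTRY`, `register_dataset`, and the class attributes are assumptions modeled on common dataset-registry patterns, not the actual VLMEvalKit internals; only the names `RefCOCOMDataset` and `RefCOCO-M` come from this PR.

```python
# Hypothetical registry sketch -- names other than RefCOCOMDataset and
# "RefCOCO-M" are illustrative, not VLMEvalKit's real API.
DATASET_REGISTRY = {}

def register_dataset(name):
    """Register a dataset class under a benchmark name."""
    def wrap(cls):
        DATASET_REGISTRY[name] = cls
        return cls
    return wrap

@register_dataset("RefCOCO-M")
class RefCOCOMDataset:
    TYPE = "Grounding"   # assumed task type
    DATASET_URL = None   # to be filled once TSV hosting is settled
    DATASET_MD5 = None
```

Keeping RefCOCO-M as its own registry entry (rather than an alias) means its URL, checksum, and any format quirks can diverge from RefCOCO later without special-casing.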

Dataset format

Expected TSV columns:

  • index
  • image
  • question
  • answer
  • bbox_x1
  • bbox_y1
  • bbox_x2
  • bbox_y2
  • width
  • height
  • category

Evaluation

This benchmark reuses the existing RefCOCO-style grounding evaluation path.
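RefCOCO-style grounding evaluation typically scores a predicted box as correct when its IoU with the ground-truth box meets a threshold (commonly 0.5). A self-contained sketch of that criterion, assuming `(x1, y1, x2, y2)` boxes as implied by the TSV columns; the exact threshold and scoring details live in the existing evaluation path, not here:

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def is_correct(pred, gt, thr=0.5):
    """RefCOCO-style accuracy criterion: IoU at or above the threshold."""
    return iou(pred, gt) >= thr
```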

Notes

  • This PR adds only the benchmark support code.
  • The hosted DATASET_URL / DATASET_MD5 entries can be filled in once the final TSV hosting location is settled (an email to the maintainers is in progress).
