Right now, a DataSet is created only once for each unique combination of metadata and Storage Provider (SP). This makes it hard to detect DataSet-creation failures with an SP when uploads to that SP are still working fine.
In practice, we’ve seen cases where:
- Piece uploads succeed with an SP, but
- Users fail to create a DataSet with the same SP, blocking them from uploading data at all.
To improve test coverage and catch these situations earlier, we want an environment-controlled mechanism to create new DataSets regularly.
## Task
- Introduce an environment variable that configures the frequency at which new DataSets should be created per SP.
- Implement logic to create a fresh DataSet for each SP on every cycle/interval.
- Ensure the new DataSet replaces the previous one for testing purposes without causing race conditions or conflicts.
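A minimal sketch of how this rotation could look in a Node.js service. Everything here is illustrative, not the project's real API: the env var name `DATASET_ROTATION_INTERVAL_MS`, the `createDataSet` stub, and the in-memory `active` map are all assumptions.

```typescript
// Sketch only: createDataSet is a placeholder for the real on-chain
// DataSet creation call; the env var name is a hypothetical choice.
type DataSet = { id: number; spId: string; createdAt: number };

// Creation frequency per SP, configurable via environment variable
// (hypothetical name), defaulting to one hour.
const intervalMs = Number(process.env.DATASET_ROTATION_INTERVAL_MS ?? 3_600_000);

let nextId = 0;
async function createDataSet(spId: string): Promise<DataSet> {
  // Stand-in for the real creation flow, which is where the
  // failures we want to surface would occur.
  return { id: nextId++, spId, createdAt: Date.now() };
}

// One active DataSet per SP. Readers only ever see a fully created
// DataSet because the map entry is swapped after creation succeeds.
const active = new Map<string, DataSet>();

async function rotateDataSet(spId: string): Promise<DataSet> {
  const fresh = await createDataSet(spId); // may throw; old DataSet stays active
  active.set(spId, fresh);                 // single swap, no half-created state
  return fresh;
}

function startRotation(spIds: string[]): ReturnType<typeof setInterval> {
  return setInterval(() => {
    for (const spId of spIds) {
      rotateDataSet(spId).catch((err) =>
        console.error(`DataSet creation failed for SP ${spId}:`, err),
      );
    }
  }, intervalMs);
}
```

Creating the fresh DataSet first and swapping the map entry only on success avoids the race in the third task item: a failed creation leaves the previous DataSet in place, so concurrent uploads never observe a missing or partially created DataSet.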
## Notes
- May require a future issue to clean up old or unused DataSets to save wallet funds.