
Commit da4542a

[refactor] Remove resource_manager from TorchSampler._process_requests

Signed-off-by: Robin Kobus <[email protected]>
1 parent 4a8ea56

File tree

1 file changed: +0 -3 lines changed


tensorrt_llm/_torch/pyexecutor/sampler.py

Lines changed: 0 additions & 3 deletions
@@ -1071,7 +1071,6 @@ def sample_async(
             model_outputs,
             new_tokens,
             num_context_logits_prefix_sum,
-            resource_manager=resource_manager,
         )

         finish_reasons = self.store.finish_reasons
@@ -1655,8 +1654,6 @@ def _process_requests(
         model_outputs: dict[str, torch.Tensor],
         new_tokens_cuda: torch.Tensor,
         num_context_logits_prefix_sum: list[int],
-        *,
-        resource_manager: Optional[ResourceManager] = None,
     ) -> tuple[list[LlmRequest], torch.Tensor, torch.Tensor]:
         raw_logits_cuda = model_outputs["logits"]
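
For context, a minimal sketch of how the affected code reads after this commit. Only the argument list and signature lines are taken from the diff above; the call-site comment, the import lines, and the elided body are illustrative assumptions, not part of the commit.

from __future__ import annotations  # defer annotation evaluation so repo-internal types need not be imported here

import torch

# Sketch of the call inside TorchSampler.sample_async after this commit:
# resource_manager is no longer forwarded (result handling elided).
#     self._process_requests(
#         model_outputs,
#         new_tokens,
#         num_context_logits_prefix_sum,
#     )

# Updated signature of TorchSampler._process_requests with the keyword-only
# resource_manager parameter removed; the body is unchanged by this commit
# and elided here. LlmRequest is a repo-internal type (not imported here).
def _process_requests(
    self,
    model_outputs: dict[str, torch.Tensor],
    new_tokens_cuda: torch.Tensor,
    num_context_logits_prefix_sum: list[int],
) -> tuple[list[LlmRequest], torch.Tensor, torch.Tensor]:
    raw_logits_cuda = model_outputs["logits"]
    ...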
