Conversation

@rahulgurnani
Contributor

@rahulgurnani rahulgurnani commented Dec 4, 2025

What type of PR is this?

/kind feature

What this PR does / why we need it:

This PR implements the PrepareDataPlugin hook for the prefix cache match plugin. The goal is to split the prefix cache match plugin into a match plugin and a scorer.
This is an intermediate step in migrating the plugin: in a follow-up PR, the scorer will be added and the existing prefix cache match plugin will be deprecated.
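
To illustrate the idea, here is a minimal sketch of the shape a prepare-data step takes (hypothetical interface and type names, not the repo's actual definitions): a plugin runs once per request before scoring, derives data such as the prefix cache match, and stores it in per-request state for downstream consumers like a scorer or a latency predictor.

```go
package main

import "fmt"

// CycleState is a hypothetical per-request scratch space that prepare-data
// plugins write to and that later stages (scorers, a latency predictor) read.
type CycleState map[string]any

// PrepareDataPlugin sketches the hook: it runs before scoring and derives
// data from the request once, so several consumers can reuse it without
// recomputing.
type PrepareDataPlugin interface {
	Name() string
	PrepareData(prompt string, state CycleState)
}

// prefixCachePlugin is an illustrative implementation that records how much
// of the prompt matches a previously seen prefix.
type prefixCachePlugin struct {
	knownPrefix string
}

func (p *prefixCachePlugin) Name() string { return "prefix-cache" }

func (p *prefixCachePlugin) PrepareData(prompt string, state CycleState) {
	matched := 0
	for matched < len(prompt) && matched < len(p.knownPrefix) && prompt[matched] == p.knownPrefix[matched] {
		matched++
	}
	// Downstream consumers (a scorer, the latency predictor) read these keys.
	state["PrefixCacheMatchLength"] = matched
	state["PrefixCacheMatchPercent"] = float64(matched) / float64(len(prompt))
}

func main() {
	var plugin PrepareDataPlugin = &prefixCachePlugin{knownPrefix: "You are a helpful assistant."}
	state := CycleState{}
	plugin.PrepareData("You are a helpful assistant. Summarize this text.", state)
	fmt.Println(state["PrefixCacheMatchPercent"])
}
```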

I did some initial scale testing using vLLM:

With this PR:

============ Serving Benchmark Result ============
Successful requests:                     1000      
Failed requests:                         0         
Benchmark duration (s):                  83.04     
Total input tokens:                      127000    
Total generated tokens:                  2048000   
Request throughput (req/s):              12.04     
Output token throughput (tok/s):         24663.03  
Peak output token throughput (tok/s):    52286.00  
Peak concurrent requests:                1000.00   
Total Token throughput (tok/s):          26192.43  
---------------Time to First Token----------------
Mean TTFT (ms):                          1875.19   
Median TTFT (ms):                        1877.09   
P99 TTFT (ms):                           2368.76   
-----Time per Output Token (excl. 1st token)------
Mean TPOT (ms):                          31.62     
Median TPOT (ms):                        29.59     
P99 TPOT (ms):                           39.16     
---------------Inter-token Latency----------------
Mean ITL (ms):                           31.62     
Median ITL (ms):                         27.45     
P99 ITL (ms):                            92.53     
==================================================

On HEAD (latest release):

============ Serving Benchmark Result ============
Successful requests:                     1000      
Failed requests:                         0         
Benchmark duration (s):                  101.31    
Total input tokens:                      127000    
Total generated tokens:                  2048000   
Request throughput (req/s):              9.87      
Output token throughput (tok/s):         20215.75  
Peak output token throughput (tok/s):    52972.00  
Peak concurrent requests:                1000.00   
Total Token throughput (tok/s):          21469.36  
---------------Time to First Token----------------
Mean TTFT (ms):                          1885.78   
Median TTFT (ms):                        1851.98   
P99 TTFT (ms):                           2482.30   
-----Time per Output Token (excl. 1st token)------
Mean TPOT (ms):                          33.57     
Median TPOT (ms):                        32.10     
P99 TPOT (ms):                           48.02     
---------------Inter-token Latency----------------
Mean ITL (ms):                           33.57     
Median ITL (ms):                         27.89     
P99 ITL (ms):                            93.49     
==================================================

@kaushikmitr / @BenjaminBraunDev FYI

Which issue(s) this PR fixes:

Addresses #1924

Does this PR introduce a user-facing change?:

Yes. This PR adds a new feature flag, "prepareDataPlugins".

@k8s-ci-robot
Contributor

Skipping CI for Draft Pull Request.
If you want CI signal for your change, please convert it to an actual PR.
You can still manually trigger a test run with /test all

@k8s-ci-robot k8s-ci-robot added do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. kind/feature Categorizes issue or PR as related to a new feature. labels Dec 4, 2025
@netlify

netlify bot commented Dec 4, 2025

Deploy Preview for gateway-api-inference-extension ready!

🔨 Latest commit: ab82246
🔍 Latest deploy log: https://app.netlify.com/projects/gateway-api-inference-extension/deploys/693a372fb2aa7c0008df86b0
😎 Deploy Preview: https://deploy-preview-1942--gateway-api-inference-extension.netlify.app

@k8s-ci-robot k8s-ci-robot added needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. size/L Denotes a PR that changes 100-499 lines, ignoring generated files. cncf-cla: yes Indicates the PR's author has signed the CNCF CLA. labels Dec 4, 2025
@rahulgurnani rahulgurnani changed the title Implement PreapreDataPlugin for prefix cache match plugin Implement PrepareDataPlugin for prefix cache match plugin Dec 4, 2025
@k8s-ci-robot k8s-ci-robot removed the needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. label Dec 4, 2025
PrefixCacheMatchPrecentKey = "PrefixCacheMatchPercentKey"
)

type PrefixCacheMatchPercent struct {
Contributor

  1. I suggest making this something more generic, like PrefixCacheInfo, so we can easily extend it. One of the things we will likely need to add is the cache tier info.
  2. Instead of (or in addition to) providing the percentage, let's provide the match length. The idea is to provide more "raw" data for consumers. Percentage is just something the scorer cares about; the latency predictor may work better with the length.
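
A rough sketch of what such a container could look like, with hypothetical field names (not taken from the PR):

```go
package prefixcache

// PrefixCacheInfo is a hypothetical "raw data" container a prepare-data step
// could publish, leaving aggregation (such as a percentage) to each consumer.
type PrefixCacheInfo struct {
	// MatchedTokens is the length of the longest cached prefix, in tokens.
	MatchedTokens int
	// TotalTokens is the prompt length, so consumers can derive a percentage.
	TotalTokens int
	// CacheTier indicates where the matched prefix lives (for example GPU vs.
	// host memory), the extension anticipated in this thread.
	CacheTier string
}

// MatchPercent derives the aggregated value the current scorer cares about.
func (i PrefixCacheInfo) MatchPercent() float64 {
	if i.TotalTokens == 0 {
		return 0
	}
	return float64(i.MatchedTokens) / float64(i.TotalTokens)
}
```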

Contributor Author

I renamed it to PrefixCacheInfo to make it easy to extend. The predictor is presently trained only on the prefix cache match percent, so I think switching this to length may not be suitable in the short term. @BenjaminBraunDev keep me honest here. Thanks!

Collaborator

I think @liu-cong's opinion carries the most weight here.

The recommendations here seem reasonable; aggregating data for easier ergonomics of an unproven, experimental plugin, at the expense of overall usability, does not seem a worthwhile tradeoff.

I'm not sure why this comment was resolved.

Contributor Author

Updated it to contain the prefix-cache-related info. Thanks!

@rahulgurnani rahulgurnani requested a review from liu-cong December 6, 2025 05:16
@rahulgurnani rahulgurnani marked this pull request as ready for review December 6, 2025 05:16
@k8s-ci-robot k8s-ci-robot removed the do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. label Dec 6, 2025
@rahulgurnani
Contributor Author

/assign @kfswain

 // If a cycle is detected, it returns an error.
-func (c *Config) PrepareDataPluginGraph() error {
+func (c *Config) PrepareDataPluginGraph(enablePrepareDataPlugins bool) error {
+	if !enablePrepareDataPlugins {
Collaborator

Related to the other comment, let's remove this check from here.

Contributor Author

Done, thanks!

return keys
}

func TestPrefixCacheScorer_Score(t *testing.T) {
Collaborator

Can we have a unit test that also creates the legacy prefix scorer and checks that the two produce identical scores? We should be able to validate that the behavior is the same in these unit tests.
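
A minimal sketch of the kind of parity test being requested, with hypothetical wrappers standing in for the legacy and new scorer implementations (as the reply below notes, the scorer was ultimately dropped from this PR):

```go
package scorer

import "testing"

// prefixMatchScore is a stand-in for the scoring logic shared by the legacy
// prefix cache scorer and the new prepare-data-based one: the fraction of the
// prompt that matches a known cached prefix.
func prefixMatchScore(knownPrefix, prompt string) float64 {
	n := 0
	for n < len(prompt) && n < len(knownPrefix) && prompt[n] == knownPrefix[n] {
		n++
	}
	return float64(n) / float64(len(prompt))
}

// legacyScore and newScore are hypothetical wrappers; in the repo they would
// call the existing plugin and the new implementation respectively.
func legacyScore(prompt string) float64 { return prefixMatchScore("You are a helpful assistant.", prompt) }
func newScore(prompt string) float64    { return prefixMatchScore("You are a helpful assistant.", prompt) }

// TestPrefixScorerParity runs the same prompts through both implementations
// and fails if any score diverges.
func TestPrefixScorerParity(t *testing.T) {
	prompts := []string{
		"You are a helpful assistant. Hello",
		"Completely unrelated prompt",
	}
	for _, p := range prompts {
		if got, want := newScore(p), legacyScore(p); got != want {
			t.Errorf("new scorer scored %q as %v, legacy scored it as %v", p, got, want)
		}
	}
}
```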

Contributor Author

@rahulgurnani rahulgurnani Dec 9, 2025

I removed the scorer from this PR since it isn't used yet; I plan to add it back in a separate PR. This PR is scoped to just adding the prepare-data step so that the predictor can use it. Thanks!

@k8s-ci-robot k8s-ci-robot added the needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. label Dec 9, 2025
@k8s-ci-robot k8s-ci-robot removed the needs-rebase Indicates a PR cannot be merged because it has merge conflicts with HEAD. label Dec 9, 2025

// PrepareDataPluginGraph creates data dependency graph and sorts the plugins in topological order.
// If a cycle is detected, it returns an error.
func (c *Config) PrepareDataPluginGraph() error {
Collaborator

The DAG should be built completely agnostic of whether any PrepareData plugins are in use. I'll add suggestions to help clarify what I mean.

Comment on lines 111 to 120
	if len(c.prepareDataPlugins) == 0 {
		return nil
	}
	dag := buildDAG(c.prepareDataPlugins)
	plugins, err := sortPlugins(dag, c.prepareDataPlugins)
	if err != nil {
Collaborator

Suggested change
-	if len(c.prepareDataPlugins) == 0 {
-		return nil
-	}
-	dag := buildDAG(c.prepareDataPlugins)
-	plugins, err := sortPlugins(dag, c.prepareDataPlugins)
-	if err != nil {
+	dag := buildDAG(c.allPlugins)
+	plugins, err := sortPlugins(dag, c.allPlugins)
+	if err != nil {

This is semi-pseudocode, but I hope it illustrates what I mean: we should always create the DAG, entirely agnostic of which plugins are in use.

Contributor Author

@rahulgurnani rahulgurnani Dec 10, 2025

Ah, I understand. I added a TODO for it and will address it as a fast follow in the next PR. Let me know if that's reasonable. Thanks!
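
For readers following the thread, here is a minimal, self-contained sketch of the kind of dependency-graph build and topological sort under discussion (Kahn's algorithm; hypothetical plugin, field, and function names, not the repo's actual buildDAG/sortPlugins):

```go
package main

import "fmt"

// dataPlugin is a hypothetical prepare-data plugin that declares which state
// keys it produces and which keys it consumes from other plugins.
type dataPlugin struct {
	name     string
	produces []string
	consumes []string
}

// orderByDependencies orders plugins so every producer runs before its
// consumers (Kahn's algorithm) and reports an error if the data dependencies
// form a cycle.
func orderByDependencies(plugins []dataPlugin) ([]dataPlugin, error) {
	producerOf := map[string]int{} // state key -> index of the plugin producing it
	for i, p := range plugins {
		for _, key := range p.produces {
			producerOf[key] = i
		}
	}

	indegree := make([]int, len(plugins))
	dependents := make([][]int, len(plugins)) // producer index -> consumer indices
	for i, p := range plugins {
		for _, key := range p.consumes {
			if producer, ok := producerOf[key]; ok && producer != i {
				dependents[producer] = append(dependents[producer], i)
				indegree[i]++
			}
		}
	}

	queue := []int{}
	for i, d := range indegree {
		if d == 0 {
			queue = append(queue, i)
		}
	}

	var ordered []dataPlugin
	for len(queue) > 0 {
		i := queue[0]
		queue = queue[1:]
		ordered = append(ordered, plugins[i])
		for _, consumer := range dependents[i] {
			if indegree[consumer]--; indegree[consumer] == 0 {
				queue = append(queue, consumer)
			}
		}
	}
	if len(ordered) != len(plugins) {
		return nil, fmt.Errorf("cycle detected among prepare-data plugins")
	}
	return ordered, nil
}

func main() {
	plugins := []dataPlugin{
		{name: "latency-predictor", consumes: []string{"PrefixCacheInfo"}},
		{name: "prefix-cache", produces: []string{"PrefixCacheInfo"}},
	}
	ordered, err := orderByDependencies(plugins)
	if err != nil {
		panic(err)
	}
	for _, p := range ordered {
		fmt.Println(p.name) // prints prefix-cache, then latency-predictor
	}
}
```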

@rahulgurnani
Contributor Author

/retest

@rahulgurnani rahulgurnani force-pushed the pc-m branch 3 times, most recently from 672ca5e to 1a0c28c, on December 11, 2025 03:10
@kfswain
Collaborator

kfswain commented Dec 11, 2025

/lgtm
/approve

@k8s-ci-robot k8s-ci-robot added the lgtm "Looks good to me", indicates that a PR is ready to be merged. label Dec 11, 2025
@k8s-ci-robot
Contributor

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: kfswain, rahulgurnani

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@k8s-ci-robot k8s-ci-robot added the approved Indicates a PR has been approved by an approver from all required OWNERS files. label Dec 11, 2025
@k8s-ci-robot k8s-ci-robot merged commit db2b7ce into kubernetes-sigs:main Dec 11, 2025
12 checks passed
