
Warning: padding_side using the conversational API. #444


Description

@bitsnaps

Hi,

Trying the following code (from the doc):

import { HfInference } from '@huggingface/inference';

const hf = new HfInference(HF_TOKEN);

const result = await hf.conversational({
  model: 'microsoft/DialoGPT-large',
  inputs: {
    past_user_inputs: ['Which movie is the best ?'],
    generated_responses: ['It is Die Hard for sure.'],
    text: 'Can you explain why ?'
  },
  parameters: {
    padding_side: 'left'
  }
});

console.log(result);

Whereas the other examples work fine, this one produces padding warnings:

{
  generated_text: "It's the best movie ever.",
  conversation: {
    generated_responses: [ 'It is Die Hard for sure.', "It's the best movie ever." ],
    past_user_inputs: [ 'Which movie is the best ?', 'Can you explain why ?' ]
  },
  warnings: [
    "A decoder-only architecture is being used, but right-padding was detected! For correct generation results, please set `padding_side='left'` when initializing the tokenizer.",
    'Setting `pad_token_id` to `eos_token_id`:50256 for open-end generation.',
    "The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results."
  ]
}
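
For reference, the warnings come back as part of the response object, so they can at least be surfaced explicitly on the client. A minimal sketch, assuming the response shape shown above (with an optional warnings array):

// Not a fix, just a way to make the server-side warnings visible
// instead of leaving them buried in the response object.
if (result.warnings?.length) {
  for (const warning of result.warnings) {
    console.warn('[inference warning]', warning);
  }
}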

I tried adding the following, but no luck:

  parameters: {
    padding_side: 'left'
  }
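
For context, here is a minimal sketch of the equivalent raw request, assuming the standard api-inference endpoint and that the client forwards parameters verbatim; it makes it easy to check whether padding_side has any effect server-side at all:

// Hypothetical direct call to the Inference API with the same payload.
// If the warning still appears, padding_side is presumably ignored by
// the server-side conversational pipeline rather than by this client.
const response = await fetch(
  'https://api-inference.huggingface.co/models/microsoft/DialoGPT-large',
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${HF_TOKEN}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      inputs: {
        past_user_inputs: ['Which movie is the best ?'],
        generated_responses: ['It is Die Hard for sure.'],
        text: 'Can you explain why ?'
      },
      parameters: { padding_side: 'left' }
    })
  }
);

console.log(await response.json());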
