When trying to run a newer GPT model, I run into an issue that keeps me from proceeding:
litellm.exceptions.BadRequestError: litellm.BadRequestError: OpenAIException - Unsupported parameter: 'stop' is not supported with this model.
To make this work, litellm needs to be configured to drop any parameters the target model doesn't support. When calling litellm directly from Python, set `litellm.drop_params = True`; when running the litellm proxy, the equivalent config-file setting is `litellm_settings: drop_params: true`.
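As a sketch of the proxy route, assuming a standard litellm proxy `config.yaml` (filename and model entry are illustrative), the setting goes under `litellm_settings`:

```yaml
# config.yaml for the litellm proxy (assumed setup)
model_list:
  - model_name: my-gpt            # illustrative alias
    litellm_params:
      model: openai/gpt-4o        # illustrative upstream model
litellm_settings:
  drop_params: true               # silently drop unsupported params, e.g. 'stop'
```

With `drop_params` enabled, litellm strips the offending `stop` parameter before sending the request instead of raising `BadRequestError`.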