
Vulnerability Details CVE-2025-48942

vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, sending a request to the /v1/completions API with an invalid json_schema as a guided decoding parameter crashes the vLLM server. This vulnerability is the counterpart of GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which involves an invalid regex rather than a JSON schema. Version 0.9.0 fixes the issue.
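
For reference, below is a minimal sketch of the kind of request described above. It assumes a locally running vLLM OpenAI-compatible server at http://localhost:8000 and vLLM's guided-decoding extra parameter (guided_json); the model name and the malformed schema contents are illustrative, not taken from the advisory. On a fixed server (0.9.0 or later) such a request should be rejected with an HTTP error rather than terminating the engine process.

    # Illustrative sketch for validating a *patched* deployment; assumptions noted above.
    import requests

    payload = {
        "model": "my-model",                # placeholder model name
        "prompt": "Return a JSON object.",
        "max_tokens": 16,
        # Deliberately malformed JSON schema: "type" must be a string or array,
        # not an integer, so schema compilation should fail server-side.
        "guided_json": {"type": 123, "properties": "not-an-object"},
    }

    resp = requests.post("http://localhost:8000/v1/completions", json=payload, timeout=30)
    # Expected on 0.9.0+: an HTTP error status (e.g. 400) and an error body,
    # with the server still serving subsequent requests.
    print(resp.status_code, resp.text[:200])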
Exploit Prediction Scoring System (EPSS) score
EPSS Score 0.0
EPSS Ranking 14.0%
CVSS Severity
CVSS v3 Score 6.5
Products affected by CVE-2025-48942
  • Vllm » Vllm » Version: 0.8.0
    cpe:2.3:a:vllm:vllm:0.8.0
  • Vllm » Vllm » Version: 0.8.1
    cpe:2.3:a:vllm:vllm:0.8.1
  • Vllm » Vllm » Version: 0.8.2
    cpe:2.3:a:vllm:vllm:0.8.2
  • Vllm » Vllm » Version: 0.8.3
    cpe:2.3:a:vllm:vllm:0.8.3
  • Vllm » Vllm » Version: 0.8.4
    cpe:2.3:a:vllm:vllm:0.8.4
  • Vllm » Vllm » Version: 0.8.5
    cpe:2.3:a:vllm:vllm:0.8.5
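
Since only versions 0.8.0 through 0.8.5 are listed as affected, a quick way to check a deployment is to compare the installed package version against the 0.8.x range. The sketch below is a hypothetical helper, not part of vLLM, and assumes the third-party packaging library is available in the environment.

    # Hypothetical helper: reports whether the installed vLLM falls in the
    # affected range 0.8.0 <= version < 0.9.0 for CVE-2025-48942.
    from importlib.metadata import version, PackageNotFoundError
    from packaging.version import Version  # third-party "packaging" library

    def vllm_affected_by_cve_2025_48942() -> bool:
        try:
            installed = Version(version("vllm"))
        except PackageNotFoundError:
            return False  # vLLM is not installed in this environment
        return Version("0.8.0") <= installed < Version("0.9.0")

    if __name__ == "__main__":
        if vllm_affected_by_cve_2025_48942():
            print("Installed vLLM is in the affected range; upgrade to 0.9.0 or later.")
        else:
            print("Installed vLLM is not in the affected range.")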
