CVE-2024-8939 – vllm
Package
Manager: pip
Name: vllm
Vulnerable Version: >=0 <=0.5.0.post1
Severity
Level: Medium
CVSS v3.1: CVSS:3.1/AV:L/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H
CVSS v4.0: CVSS:4.0/AV:L/AC:L/AT:N/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:N
EPSS: 0.00016 (percentile: 0.0245)
Details
vLLM Denial of Service via the best_of parameter
A vulnerability was found in the ilab model serve component, where improper handling of the best_of parameter in the vllm JSON web API can lead to a Denial of Service (DoS). The API used for LLM-based sentence or chat completion accepts a best_of parameter to return the best completion from several options. When this parameter is set to a large value, the API does not handle timeouts or resource exhaustion properly, allowing an attacker to cause a DoS by consuming excessive system resources. This leaves the API unresponsive and prevents legitimate users from accessing the service.
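The advisory does not specify a fix, but one common guard against this class of resource-exhaustion bug is to validate best_of server-side before a request reaches the model. The sketch below is illustrative only: MAX_BEST_OF, its value, and the validate_completion_request helper are assumptions, not part of vLLM's API or the official patch.

```python
# Hypothetical request-validation sketch: cap best_of before a completion
# request is forwarded to the model server. MAX_BEST_OF is an illustrative
# limit, not a value taken from the advisory or from vLLM itself.
MAX_BEST_OF = 8


def validate_completion_request(payload: dict) -> dict:
    """Reject completion requests whose best_of value could exhaust resources."""
    best_of = payload.get("best_of", 1)
    if not isinstance(best_of, int) or best_of < 1:
        raise ValueError("best_of must be a positive integer")
    if best_of > MAX_BEST_OF:
        raise ValueError(
            f"best_of={best_of} exceeds the allowed maximum of {MAX_BEST_OF}"
        )
    return payload
```

A gateway or middleware layer applying a check like this would turn an oversized best_of into a fast 4xx response instead of a long-running, resource-consuming generation.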
Metadata
Created: 2024-09-17T18:33:26Z
Modified: 2024-09-17T21:56:55Z
Source: https://github.com/github/advisory-database/blob/main/advisories/github-reviewed/2024/09/GHSA-wc36-9694-f9rf/GHSA-wc36-9694-f9rf.json
CWE IDs: CWE-400
Alternative ID: GHSA-wc36-9694-f9rf
Finding: F002
Auto approve: 1