
CVE-2024-12704 llama_index

Package

Manager: pip
Name: llama_index
Vulnerable Version: >=0 <0.12.6

Severity

Level: High

CVSS v3.0: CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:N/I:N/A:H

CVSS v4.0: CVSS:4.0/AV:N/AC:L/AT:N/PR:N/UI:N/VC:N/VI:N/VA:H/SC:N/SI:N/SA:N

EPSS: 0.00212 (percentile: 0.43801)

Details

LlamaIndex Improper Handling of Exceptional Conditions. A vulnerability in the LangChainLLM class of the run-llama/llama_index repository, version v0.12.5, allows a Denial of Service (DoS) attack. The stream_complete method runs the LLM in a separate thread and retrieves the result through the get_response_gen method of the StreamingGeneratorCallbackHandler class. If that thread terminates abnormally before _llm.predict executes, no exception handling covers the case, and the get_response_gen function enters an infinite loop. The condition can be triggered by providing an input of an incorrect type, which kills the worker thread while the process continues running indefinitely.
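
The failure mode described above can be illustrated with a small, self-contained sketch. The class and method names below mirror the ones mentioned in the advisory (StreamingGeneratorCallbackHandler, get_response_gen, a predict-style worker), but the implementation is a simplified approximation written for this write-up, not llama_index's actual code.

```python
import threading
from queue import Queue


class StreamingGeneratorCallbackHandler:
    """Simplified stand-in for the handler named in the advisory."""

    def __init__(self):
        self._queue = Queue()
        self._done = threading.Event()

    def on_token(self, token: str) -> None:
        self._queue.put(token)

    def on_end(self) -> None:
        self._done.set()

    def get_response_gen(self):
        # The vulnerable pattern: the loop only exits once the producer
        # thread signals completion. If that thread dies with an unhandled
        # exception before signalling, this loop spins forever and the
        # request never returns (the DoS condition).
        while True:
            if not self._queue.empty():
                yield self._queue.get_nowait()
            elif self._done.is_set():
                break


def predict_in_thread(prompt, handler: StreamingGeneratorCallbackHandler) -> None:
    # Stand-in for the LLM predict call: a non-string prompt raises before
    # any token is emitted and before on_end() is ever reached.
    for token in prompt.split():
        handler.on_token(token)
    handler.on_end()


handler = StreamingGeneratorCallbackHandler()
# An input of the wrong type (an int instead of a string) makes the worker
# thread die with an unhandled AttributeError; nothing ever calls on_end().
worker = threading.Thread(target=predict_in_thread, args=(12345, handler), daemon=True)
worker.start()

# Warning: this iteration never terminates, which is the reported bug.
for token in handler.get_response_gen():
    print(token)
```

Per the affected range above, upgrading to 0.12.6 or later resolves the issue; on older releases, a defensive pattern is to wrap the thread target so that any exception still signals completion (for example, calling on_end() in a finally block) before the result generator is consumed.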

Metadata

Created: 2025-03-20T12:32:43Z
Modified: 2025-03-21T17:40:52Z
Source: https://github.com/github/advisory-database/blob/main/advisories/github-reviewed/2025/03/GHSA-j3wr-m6xh-64hg/GHSA-j3wr-m6xh-64hg.json
CWE IDs: CWE-755
Alternative ID: GHSA-j3wr-m6xh-64hg
Finding: F096
Auto approve: 1