Tell HN: Litellm 1.82.7 and 1.82.8 on PyPI are compromised
TL;DR Highlight
Malicious .pth files that steal credentials were inserted into LiteLLM PyPI package versions 1.82.7 and 1.82.8. The payload is a supply chain attack that auto-executes on Python interpreter startup, without any import, giving it a wide blast radius.
Who Should Read
Backend developers and MLOps engineers using LiteLLM — specifically anyone who pip-installed litellm in an AI service development environment. Immediate action required.
Core Mechanics
- .pth files in Python's site-packages directory are automatically executed when the Python interpreter starts — no import statement needed. This makes them an unusually dangerous vector for supply chain attacks.
- The malicious code in versions 1.82.7 and 1.82.8 collected environment variables (including API keys, cloud credentials, and database URLs) and exfiltrated them to an external server.
- Any environment where litellm was installed — including Docker containers, virtual environments, and CI/CD pipelines — may have had credentials exfiltrated at every Python process startup.
- The attack was discovered and the malicious versions were yanked from PyPI, but anyone who installed those specific versions between release and yanking is affected.
- Mitigation: immediately upgrade to a clean version, rotate all credentials accessible in environments where 1.82.7 or 1.82.8 was installed.
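The auto-execution mechanism described above can be demonstrated harmlessly. This sketch uses site.addsitedir(), which processes .pth files the same way the interpreter does at startup; the file name and environment-variable marker are illustrative, not taken from the actual attack.

```python
import os
import site
import tempfile

# A line in a .pth file that starts with "import" is exec()'d by the
# site module at interpreter startup -- the hook the attack abused.
demo_dir = tempfile.mkdtemp()
with open(os.path.join(demo_dir, "demo.pth"), "w") as f:
    # Harmless stand-in for a payload: set a marker instead of reading
    # and exfiltrating environment variables.
    f.write("import os; os.environ['PTH_DEMO'] = 'executed'\n")

# addsitedir() processes .pth files just like interpreter startup does.
site.addsitedir(demo_dir)
print(os.environ.get("PTH_DEMO"))
```

No code in this script ever calls anything from demo.pth; the marker appears purely because the site machinery executed the file's import line.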
Evidence
- The security researcher who discovered the attack shared the decompiled malicious code, confirming the .pth execution mechanism and the exfiltration endpoint.
- LiteLLM maintainers published an incident response within hours, confirming the attack, which versions were affected, and recommending immediate upgrade and credential rotation.
- The attack hit particularly hard in AI development environments where litellm is used as a routing layer — these environments typically have credentials for many AI providers (OpenAI, Anthropic, etc.) in their environment variables.
- Commenters raised the broader point: LiteLLM is exactly the kind of high-value target for supply chain attacks — widely used in AI infrastructure, often installed with broad permissions.
How to Apply
- Immediately: pip install --upgrade litellm to get a clean version. Then rotate all credentials that were accessible as environment variables in affected environments.
- Check your lockfiles (requirements.txt, poetry.lock) or pip's download cache (pip cache list litellm) to determine whether you installed 1.82.7 or 1.82.8. If unclear, treat the environment as compromised and rotate anyway.
- Add a .pth file scanner to your dependency audit process — tools like pip-audit and safety don't currently detect malicious .pth files, so consider adding a custom check.
- For AI service environments, store sensitive credentials in a secrets manager (AWS Secrets Manager, Vault) rather than environment variables — this limits blast radius from env var exfiltration attacks.
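A minimal sketch of the custom .pth check suggested above: it flags any .pth line under site-packages that begins with an import statement, since only those lines execute code. Note that some legitimate tools (e.g. setuptools' distutils-precedence.pth, editable installs) also use import lines, so treat hits as candidates for manual review rather than confirmed malware.

```python
import pathlib
import site

def suspicious_pth_lines(site_dir):
    """Return (path, line number, line) for every code-executing .pth line."""
    hits = []
    for pth in pathlib.Path(site_dir).glob("*.pth"):
        for lineno, line in enumerate(pth.read_text(errors="replace").splitlines(), 1):
            # Only lines beginning with "import" are exec()'d by the site module;
            # all other non-comment lines are treated as directories to add to sys.path.
            if line.startswith(("import ", "import\t")):
                hits.append((str(pth), lineno, line[:120]))
    return hits

if __name__ == "__main__":
    for d in site.getsitepackages():
        for hit in suspicious_pth_lines(d):
            print(hit)
```

Running this in CI after dependency installation gives a cheap tripwire for the exact vector used here.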
Code Example
# Download a specific release and check it for .pth files
pip download litellm==1.82.8 --no-deps -d /tmp/check
python3 -c "
import zipfile, os
whl = '/tmp/check/' + [f for f in os.listdir('/tmp/check') if f.endswith('.whl')][0]
with zipfile.ZipFile(whl) as z:
    pth = [n for n in z.namelist() if n.endswith('.pth')]
    print('PTH files:', pth)  # should be an empty list if clean
    for p in pth:
        print(z.read(p)[:300])  # inspect contents
"
# Pin to a safe version
pip install litellm==1.82.6
# Pin version in requirements.txt
# litellm==1.82.6
Terminology
.pth File: A Python path configuration file in site-packages. Each line is either a directory to add to sys.path or an import statement, and import lines are executed automatically at interpreter startup.
Supply Chain Attack: Compromising software by attacking its distribution channel (PyPI, npm, etc.) rather than the software directly.
Credential Exfiltration: Silently stealing authentication credentials (API keys, tokens) by transmitting them to an attacker-controlled server.
PyPI: The Python Package Index, the official repository for Python packages, installable via pip.