Under Investigation

LiteLLM Supply Chain Security Review

Two malicious versions of LiteLLM (1.82.7 and 1.82.8) were published to PyPI on March 24, 2026. If you installed or updated LiteLLM during the 3-hour exposure window, the unauthorized code may have accessed your SSH keys, cloud credentials, AI API keys, and environment variables. Even if you don't use LiteLLM directly, it may have been installed as a dependency of other AI tools you use.

Advisory Date: March 24, 2026
Affected Package: litellm (PyPI)
Severity: Critical
Status: Resolved on PyPI
🚀 Quick Diagnostic Tool Available

Download and run our Python script to automatically check all your Python environments, virtual environments, package caches, and persistence artifacts in one go.

🔍 Open source & read-only — review the code before running it.

macOS / Linux:
curl -O https://pedrorocha-net.github.io/litellm-breach-support/litellm-security-check.py && python litellm-security-check.py

Windows PowerShell:
Invoke-WebRequest -Uri "https://pedrorocha-net.github.io/litellm-breach-support/litellm-security-check.py" -OutFile "litellm-security-check.py"; python litellm-security-check.py

Affected Versions

1.82.7 1.82.8

Safe Versions

≤ 1.82.6

Impact Summary

Unauthorized versions contained code that accessed system credentials and environment variables. Organizations should review systems and rotate credentials as a precautionary measure.

What Happened

LiteLLM is an open-source library that provides a unified interface for accessing multiple AI/LLM providers. On March 24, 2026, unauthorized versions were published to the Python Package Index (PyPI) containing code not present in the official source repository.

📋 The Entry Point

The incident appears to be linked to a previous security event involving the Trivy vulnerability scanner, which is widely used in CI/CD pipelines. Publishing credentials exposed in that event appear to have been used to upload unauthorized package versions directly to PyPI.

🔍 Discovery

The community identified the issue when version 1.82.8 was flagged for containing an unusual .pth file. Security researchers and community members promptly reported the issue, leading to rapid response and removal of the affected versions.

⚡ Response Time

The affected versions were available on PyPI for approximately 3 hours. PyPI has quarantined the package and the LiteLLM team has rotated all credentials, engaged security experts (Mandiant), and is conducting a comprehensive supply chain review.

✅ Current Status

The malicious versions have been removed from PyPI. The LiteLLM team has confirmed that Docker images were not affected (they use pinned dependencies). New releases are paused pending a full security review.

â„šī¸ About LiteLLM

LiteLLM serves as middleware between applications and AI service providers (OpenAI, Anthropic, Azure, etc.). By design, it handles API keys for these services. This central role means organizations using LiteLLM should review their API key exposure as part of their response.

Timeline

March 19, 2026

Trivy Security Event

A security incident involving the Trivy vulnerability scanner affects organizations using it in CI/CD pipelines.

March 23, 2026

Checkmarx Investigation

Similar activity detected in Checkmarx tooling. Security community begins tracking related activity.

March 24, 2026 — 10:39 UTC

Unauthorized Version 1.82.7

Package published to PyPI containing code not in the official GitHub repository.

March 24, 2026 — 10:52 UTC

Unauthorized Version 1.82.8

Second version published with additional .pth file mechanism.

March 24, 2026 — ~13:38 UTC

Community Discovery & Response

Issue reported, PyPI quarantines package, affected versions removed. LiteLLM team begins investigation.

Who Should Review Their Systems

Organizations using Python-based AI tooling should check whether they may have installed affected versions, either directly or as a dependency of other tools.

đŸĸ Direct Users

Organizations that installed or updated LiteLLM on March 24, 2026, particularly during the 10:39-13:38 UTC window, should review their environments.

🔗 Framework Users

Teams using AI frameworks that depend on LiteLLM should verify their dependency versions. Affected frameworks include DSPy, CrewAI, MLflow, and others.

â˜ī¸ Cloud Deployments

Organizations with CI/CD pipelines or container images built during the exposure window should verify their base images don't contain affected versions.

âš ī¸ Understanding Transitive Dependencies

Even if your organization never directly installed LiteLLM, it may have been installed automatically as a dependency of another package. We recommend checking with your engineering teams about any Python-based AI tooling used in your environment.
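As a quick first check, the standard library's importlib.metadata can list which installed packages declare a dependency on litellm. This is a sketch, not an exhaustive audit: the prefix match is a heuristic, and it only inspects the currently active Python environment.

```python
# Sketch: list installed packages that declare a dependency on litellm,
# surfacing transitive installs. Standard library only (Python 3.8+).
from importlib.metadata import distributions

def packages_depending_on(target):
    hits = []
    for dist in distributions():
        for req in (dist.requires or []):
            # Requirement strings look like "litellm>=1.50; extra == 'proxy'"
            name = req.split(";")[0].strip().lower()
            if name.startswith(target.lower()):
                hits.append((dist.metadata["Name"], req))
    return sorted(hits)

for pkg, req in packages_depending_on("litellm"):
    print(f"{pkg} requires {req}")
```

Run it once per virtual environment; an empty result in one environment says nothing about the others.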

Potentially Affected Frameworks

Framework      | Use Case                  | Recommended Action
DSPy           | LLM programming framework | Verify version pinning
CrewAI         | AI agent orchestration    | Check dependency tree
MLflow         | ML model serving          | Update to patched version
Mem0           | Memory for AI agents      | Verify version pinning
Instructor     | Structured LLM outputs    | Check dependency tree
Guardrails AI  | LLM output validation     | Update to patched version

What the Unauthorized Code Did

Understanding the technical details helps organizations prioritize their response activities appropriately.

📁 Version 1.82.7

Contained code embedded in proxy_server.py that executed when the LiteLLM proxy module was imported. This version required explicit import to activate.

📁 Version 1.82.8

Added a .pth file, which Python processes automatically on interpreter startup. This meant the code could execute without explicitly importing LiteLLM.
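The mechanism is easy to demonstrate safely: Python's site module executes any .pth line that begins with "import" when it processes a site directory at interpreter startup. A minimal sketch with a harmless payload (the filename litellm_init.pth comes from this advisory; the payload here only sets an environment variable):

```python
# Harmless demo of .pth auto-execution: site.py exec()s any .pth line
# that starts with "import" when a site directory is processed.
import os
import site
import tempfile

demo_dir = tempfile.mkdtemp()
with open(os.path.join(demo_dir, "litellm_init.pth"), "w") as f:
    # The payload here only sets an environment variable.
    f.write('import os; os.environ["PTH_DEMO_RAN"] = "1"\n')

site.addsitedir(demo_dir)  # simulates what the interpreter does at startup
print(os.environ.get("PTH_DEMO_RAN"))  # -> 1
```

This is why version 1.82.8 is more dangerous than 1.82.7: any Python process using the affected environment could trigger the code, even one that never touches LiteLLM.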

🔑 Accessed Credentials and Data

The unauthorized code was designed to access various credential stores and sensitive files. Organizations should review these categories in their environment:

Cloud Providers: AWS, GCP, Azure credentials and config files
SSH Keys: Private keys in ~/.ssh/
AI API Keys: OpenAI, Anthropic, and other providers
Environment Variables: All env vars and .env files
Kubernetes: Service account tokens and secrets
Databases: Connection strings and passwords

â„šī¸ Exfiltration Destination

Data was transmitted to models.litellm.cloud — this is not an official LiteLLM domain. Organizations may wish to check their network logs for connections to this domain during the exposure period.
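A simple way to triage exported logs is a substring match on the domain. A minimal sketch (the sample lines are illustrative; in practice, feed it lines from your own DNS or proxy log export):

```python
# Sketch: flag log lines that mention the exfiltration domain.
DOMAIN = "models.litellm.cloud"  # NOT an official LiteLLM domain

def find_hits(lines):
    return [line for line in lines if DOMAIN in line]

# Illustrative sample; replace with lines read from your log export.
sample = [
    "2026-03-24T11:02:10Z query api.openai.com",
    "2026-03-24T11:02:41Z query models.litellm.cloud",
]
for hit in find_hits(sample):
    print(hit)
```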

Response Guidance

Based on whether your organization was affected, here are the recommended steps to assess and secure your environment.

1. Identify if Affected Versions Were Installed

Check your Python environments for versions 1.82.7 or 1.82.8


For engineering teams:

# Check installed LiteLLM version
pip show litellm | grep Version

# Check for .pth files (version 1.82.8)
find ~ -name "litellm_init.pth" 2>/dev/null

# Search all virtual environments
find ~ -name "site-packages" -exec pip list --path {} \; 2>/dev/null | grep -i litellm

For non-technical stakeholders:

  • Ask your engineering team if LiteLLM or AI frameworks (DSPy, CrewAI, MLflow) are used
  • Check if any Python deployments were updated on March 24, 2026
  • Contact vendors of AI-powered tools you use and ask about their LiteLLM usage
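The version check can also be scripted with only the Python standard library. A sketch; it inspects just the interpreter's current environment, so run it once per environment:

```python
# Sketch: report whether this environment has an affected litellm build.
from importlib.metadata import PackageNotFoundError, version

AFFECTED = {"1.82.7", "1.82.8"}  # versions named in this advisory

def check_litellm():
    try:
        v = version("litellm")
    except PackageNotFoundError:
        return "litellm not installed in this environment"
    status = "AFFECTED" if v in AFFECTED else "not an affected version"
    return f"litellm {v}: {status}"

print(check_litellm())
```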

2. Check Multiple Projects and Environments

Scan your entire machine for all Python projects, virtual environments, and package caches

🚀 Automated Option Available

You can use our diagnostic Python script to automatically scan all environments. Download and run: python litellm-security-check.py

If you prefer to check manually, here are commands to scan all your Python environments:

macOS / Linux:

# Find and check all pip-installed packages across the system
find ~ -type f -name "METADATA" 2>/dev/null | xargs grep -l "^Name: litellm$" 2>/dev/null

# Check all site-packages directories for litellm
find ~ -type d -name "site-packages" 2>/dev/null | while read dir; do
  if [ -d "$dir/litellm" ]; then
    echo "Found in: $dir"
    # Glob must be outside the quotes so the shell expands it
    cat "$dir"/litellm-*.dist-info/METADATA 2>/dev/null | grep "^Version:"
  fi
done

Windows PowerShell:

# Find all site-packages directories
Get-ChildItem -Path $env:USERPROFILE -Recurse -Directory -Filter "site-packages" -ErrorAction SilentlyContinue | ForEach-Object {
    $litellmDir = Join-Path $_.FullName "litellm"
    if (Test-Path $litellmDir) {
        Write-Host "Found in: $($_.FullName)"
        $metadata = Get-ChildItem -Path $_.FullName -Filter "litellm-*.dist-info" | Select-Object -First 1
        if ($metadata) {
            Get-Content "$($metadata.FullName)\METADATA" | Select-String "^Version:"
        }
    }
}

# Check all Python installations
Get-Command python -All | ForEach-Object {
    Write-Host "Checking: $($_.Source)"
    & $_.Source -m pip show litellm 2>$null | Select-String "Version:"
}

Check common virtual environment locations (macOS / Linux):

# Common venv/virtualenv locations
find ~ -type f -path "*/bin/python" 2>/dev/null | while read py; do
  echo "Checking: $py"
  "$py" -m pip show litellm 2>/dev/null | grep "Version:"
done

# Check Conda environments
conda env list | grep -v "^#" | awk '{print $1}' | while read env; do
  if [ -n "$env" ] && [ "$env" != "base" ]; then
    echo "Conda env: $env"
    conda run -n "$env" pip show litellm 2>/dev/null | grep "Version:"
  fi
done

# Check Poetry/Pipenv virtual environments
# (parentheses group the -o clauses so -type d applies to all names)
find ~ -type d \( -name ".venv" -o -name "venv" -o -name ".env" \) 2>/dev/null | while read venv; do
  echo "Poetry/Pipenv venv: $venv"
  "$venv/bin/pip" show litellm 2>/dev/null | grep "Version:"
done

Check package caches:

# Pip cache (may contain downloaded malicious versions)
pip cache list | grep litellm
ls -la ~/Library/Caches/pip/wheels/*/litellm* 2>/dev/null   # macOS
ls -la ~/.cache/pip/wheels/*/litellm* 2>/dev/null           # Linux
dir %LOCALAPPDATA%\pip\Cache\wheels\*\litellm* 2>nul        # Windows

# UV cache
ls -la ~/.cache/uv/wheels-v3/*/litellm* 2>/dev/null

# Poetry cache
ls -la ~/Library/Caches/pypoetry/artifacts/*/litellm* 2>/dev/null  # macOS
ls -la ~/.cache/pypoetry/artifacts/*/litellm* 2>/dev/null          # Linux
dir %APPDATA%\pypoetry\artifacts\*\litellm* 2>nul                  # Windows

One-liner to check everything:

# Quick comprehensive check script
echo "=== Checking all Python environments for LiteLLM ===" && \
find ~ -type d -name "site-packages" 2>/dev/null | while read dir; do
  if [ -f "$dir/litellm/__init__.py" ]; then
    version=$(cat "$dir"/litellm-*.dist-info/METADATA 2>/dev/null | grep "^Version:" | head -1)
    echo "[$dir] $version"
  fi
done && \
echo "" && \
echo "=== Checking for malicious .pth files ===" && \
find ~ -name "litellm_init.pth" 2>/dev/null && \
echo "" && \
echo "=== Checking pip cache ===" && \
pip cache list 2>/dev/null | grep litellm || echo "No pip cache or litellm not found"
💡 IDE Virtual Environments

Don't forget IDE-specific Python environments like VS Code's Python extension, PyCharm's virtualenvs, or Jupyter kernel environments. Check locations like: ~/.vscode/, ~/Library/Application Support/Jupyter/

3. Check for Persistence Mechanisms

The unauthorized code may have established persistence on affected systems


macOS / Linux:

# Check for systemd services
systemctl list-units | grep -i sysmon
ls -la /etc/systemd/system/sysmon.service 2>/dev/null

# Check for persistence files
ls -la ~/.config/sysmon/ 2>/dev/null
ls -la /tmp/pglog /tmp/.pg_state 2>/dev/null

# Kubernetes: check for unauthorized pods
kubectl get pods --all-namespaces | grep node-setup-

Windows:

# Check for suspicious scheduled tasks
schtasks /query /fo list | findstr /i sysmon

# Check startup folder
dir "%APPDATA%\Microsoft\Windows\Start Menu\Programs\Startup\" | findstr /i sysmon

# Check for persistence files
dir "%LOCALAPPDATA%\sysmon\" 2>nul
dir "%TEMP%\pglog" 2>nul
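The filesystem checks above can also be scripted in a cross-platform way. A minimal sketch with pathlib, using only the artifact paths listed in this advisory (extend the list for your own environment):

```python
# Sketch: cross-platform check for the persistence artifacts listed above.
from pathlib import Path

def find_artifacts(paths):
    # Return only the candidate paths that actually exist on this system.
    return [p for p in paths if p.exists()]

candidates = [
    Path.home() / ".config" / "sysmon",
    Path("/etc/systemd/system/sysmon.service"),
    Path("/tmp/pglog"),
    Path("/tmp/.pg_state"),
]
hits = find_artifacts(candidates)
print(hits or "No known persistence artifacts found")
```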

4. Rotate Potentially Exposed Credentials

If affected versions were present, rotate credentials as a precautionary measure

🔄 Recommended Credential Rotation

As a precaution, consider rotating credentials that may have been accessible from affected systems. Priority should be given to credentials with elevated privileges or access to sensitive data.

  • SSH keys (generate new keys, remove old ones from servers and GitHub/GitLab)
  • Cloud provider credentials (AWS access keys, GCP service accounts, Azure service principals)
  • AI service API keys (OpenAI, Anthropic, etc.)
  • Database passwords and connection strings
  • CI/CD pipeline secrets and deployment tokens
  • Kubernetes service account tokens and cluster credentials
  • Docker registry credentials
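To build a rotation checklist, it can help to inventory environment variables whose names look credential-like. A minimal sketch (the pattern is illustrative, not exhaustive; it prints names only, never values):

```python
# Sketch: inventory credential-like environment variable names for a
# rotation checklist. Pattern is illustrative, not exhaustive.
import os
import re

PATTERN = re.compile(r"KEY|TOKEN|SECRET|PASSWORD|CREDENTIAL", re.I)

def credential_like(env):
    return sorted(name for name in env if PATTERN.search(name))

for name in credential_like(os.environ):
    print(name)  # names only; never log the values
```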
📊 Review Access Logs

After rotating credentials, review cloud access logs (AWS CloudTrail, GCP Audit Logs, Azure Activity Logs) and Kubernetes audit logs for any unauthorized activity during and after the exposure window.

5. Clean and Verify Affected Systems

Remove unauthorized artifacts and consider rebuilding from clean state

# Remove unauthorized package
pip uninstall litellm

# Remove persistence artifacts if found
rm -rf ~/.config/sysmon/
rm -f /etc/systemd/system/sysmon.service
systemctl daemon-reload

# Clear package caches
pip cache purge
💡 Consider Rebuilding

For systems confirmed to have run affected versions, the most reliable remediation is rebuilding from a known-clean state rather than attempting in-place cleanup. This ensures complete removal of any potential persistence mechanisms.

6. Implement Supply Chain Protections

Reduce risk of similar incidents in the future

  • Pin all dependencies to specific versions (avoid package>=1.0, use package==1.0.0)
  • Use lock files (requirements.txt, poetry.lock, pnpm-lock.yaml) and commit them to version control
  • Implement dependency scanning in CI/CD (Snyk, Dependabot, etc.)
  • Consider internal package mirrors for critical dependencies
  • Apply principle of least privilege to CI/CD systems
  • Network segmentation between build systems and production
  • Regular audit of dependency trees for high-risk packages
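The pinning recommendation can be enforced with a simple pre-commit style check. A minimal sketch that flags requirements lines without "==" (a real checker should use a proper requirements parser):

```python
# Sketch: flag requirements lines that are not pinned with "==".
# Handles comments and blank lines only.
def unpinned(lines):
    flagged = []
    for line in lines:
        spec = line.split("#")[0].strip()  # drop trailing comments
        if spec and "==" not in spec:
            flagged.append(spec)
    return flagged

reqs = ["litellm>=1.80", "requests==2.32.3", "# pinned below", "dspy"]
print(unpinned(reqs))  # -> ['litellm>=1.80', 'dspy']
```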

Official Resources

For the latest updates and detailed technical information, please refer to these authoritative sources.