Open-source LLM security

Stop prompt injection before it reaches your LLM

Parapet is a transparent proxy firewall that scans every request and response for prompt injection, tool abuse, and data exfiltration. Config-driven. Self-hosted. Three lines to integrate.

pip install parapet

LLMs trust everything they read

Your model can't tell the difference between your instructions and an attacker's. Every tool call, every retrieved document, every user message is an attack surface.

0
LLM providers offer deterministic multi-turn prompt injection detection at the API level
5 min
from install to your first blocked attack with Parapet
Peer-reviewed
defense layers grounded in published LLM security research

Layered defense in the request pipeline

Parapet sits between your app and the LLM provider. Every message passes through a stack of security layers before it reaches the model — and again before the response reaches your app.

Define your security policy in YAML

Write a YAML policy, call parapet.init() before your first HTTP client, and every request is scanned.

parapet.yaml yaml
parapet: v1

# Block known injection patterns
block_patterns:
  - "ignore previous instructions"
  - "ignore all previous"
  - "DAN mode enabled"
  - "jailbreak"

# Tool policies: default-deny, allowlist what you need
tools:
  _default:
    allowed: false
  read_file:
    allowed: true
    trust: untrusted
    constraints:
      path:
        not_contains: ["../", "..\\"]
  exec_command:
    allowed: false

# Redact secrets from LLM output
sensitive_patterns:
  - "sk-[a-zA-Z0-9]{20,}"
  - "-----BEGIN.*PRIVATE KEY-----"

Five minutes to your first blocked attack

Step 1

Install

Parapet works with any OpenAI-compatible provider.

pip install parapet
Step 2

Configure

Write a YAML policy. Define block patterns, tool constraints, and sensitive data rules.

cp examples/parapet.yaml .
Step 3

Init

One call before your first HTTP client. Every LLM request is scanned from that point on.

parapet.init("parapet.yaml")

Transparent interception, minimal integration

Your App
Python + httpx
Parapet SDK
Intercept & Scan
Parapet Engine
Rust (fast path)
LLM Provider
OpenAI / Anthropic / etc.

Call parapet.init("parapet.yaml") before creating HTTP clients. The SDK monkey-patches httpx, the Rust engine runs all security layers in microseconds, and every LLM request is scanned transparently.

What Parapet catches

Prompt Injection

Pattern matching against known injection signatures: "ignore previous instructions," jailbreaks, persona hijacks. Scanned after Unicode normalization to defeat encoding tricks.

Tool Abuse

Per-tool constraints on arguments. Block path traversal in file tools, dangerous commands in shell tools, SSRF in web tools. Allowlists and denylists per tool name.

Data Exfiltration

Redact API keys, private keys, and secrets from LLM output. Regex-based pattern matching catches keys even if the model tries to encode or obfuscate them.

Multi-Turn Attacks

Cross-turn risk scoring detects attacks distributed across conversation turns: instruction seeding, role confusion escalation, resampling, and authority claim buildup. Peak + accumulation scoring — no LLM classifier needed.

Canary Tokens

Inject canary strings into system prompts. If they appear in output, your system prompt is leaking. Detect exfiltration attempts that bypass pattern matching.

Built on the literature, not on vibes

Parapet's defense layers are grounded in published academic research on LLM security, prompt injection, and adversarial attacks on language models. Our multi-turn scoring formula — peak + accumulation — achieves 90.8% recall at 1.20% FPR on 10,654 conversations, without invoking an LLM.

Research-Grounded Defense layers informed by the academic literature on prompt injection and LLM adversarial attacks
Open Source Free deterministic layers, self-hosted, no data leaves your infrastructure
Rust Engine Microsecond scanning on the fast path, no latency added to your requests

Your LLM deserves a wall

Parapet is free, open source, and takes five minutes to set up.