cf.llm.prompt.injection_score
Type: Number
A score from 1 to 99 that represents the likelihood that the LLM prompt in the request is attempting a prompt injection attack.
A low score (for example, below 20) indicates a high probability that the prompt is attempting a prompt injection attack.
The special score of 100 indicates that Cloudflare did not score the request.
Requires a Cloudflare Enterprise plan. You must also enable Firewall for AI.
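For example, a custom rule paired with a block or managed challenge action could match requests whose prompts are likely injection attempts; the threshold of 20 below is illustrative only, not a Cloudflare recommendation:
(cf.llm.prompt.injection_score < 20)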
Categories:
- Request