The Future
Product Roadmap
Our vision is to become the standard security layer for every AI interaction. Here's how we're getting there.
Q2 2026 (In Progress)
Native SDKs
Custom SDKs for Node.js, Python, and Go, designed for sub-millisecond latency.
Self-Hosted Proxy
Enterprise Docker and Kubernetes images for self-hosted deployment inside your own VPC.
Advanced PII Models
Custom-trained ML models for industry-specific sensitive data detection.
Q3 2026 (Planned)
Automated Red Teaming
Continuous testing of your prompts against known LLM jailbreak patterns.
SIEM Integrations
Native connectors for Datadog, Splunk, and Microsoft Sentinel.
RBAC Policy Engine
Granular user permissions and team-based security policies.
Q4 2026 (Future)
Local Model Support
Full security pipeline for Ollama, vLLM, and local Llama 3 deployments.
Anomaly Detection
AI-driven detection of anomalous prompt patterns and unusual user behavior.
Compliance Auto-Reporting
One-click generation of SOC2 and HIPAA audit reports.
Missing a feature?
We prioritize our roadmap based on customer needs. Tell us what you need to secure your AI stack.
Suggest a feature