On the Roadmap · Enterprise Data Sovereignty

Your deal intelligence.
Never leaves your cloud.

Clairio runs on Anthropic Claude today — but we know enterprise revenue teams operate under strict data governance requirements. Private deployment on AWS Bedrock and IBM watsonx.ai is on our roadmap, with design partner engagements opening this year. When shipped, your deal signals, opportunity scores, and rep activity data will never reach Anthropic's servers.

Planned — zero data egress to third parties
Planned — SOC 2-compatible architectures
Planned — air-gap deployment (IBM)
Roadmap
Private deployment (AWS Bedrock and IBM watsonx.ai) is not yet available. The options below describe the architecture we're building toward. If you have a specific security, compliance, or air-gap requirement, we'd like to talk — we're actively scoping design partners.
Planned Deployment Options
AWS
Amazon Bedrock
Planned
Clairio on AWS infrastructure

The plan: run Clairio's AI layer entirely within your AWS account using Amazon Bedrock. Your deal signals, opportunity scores, and rep activity data would never leave your VPC, and the Claude model would be served from AWS infrastructure — giving Anthropic zero visibility into your data.

  • VPC PrivateLink endpoint — no public internet routing for data in transit
  • Claude, Llama, Mistral & more available via one unified Bedrock API
  • Private model copy — model providers cannot access your prompts or completions
  • Native IAM role-based access control and full CloudTrail audit logging
  • Ideal for teams already operating within an AWS security boundary
Claude on Bedrock · Llama 3 · Mistral · VPC PrivateLink · IAM / CloudTrail
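To make the "one unified Bedrock API" point concrete, here is a minimal sketch of the request body an in-VPC call to Claude on Bedrock would carry. The model ID and prompt text are illustrative assumptions, not part of Clairio's shipped product; the actual invocation would go through an AWS SDK client (e.g. boto3's `bedrock-runtime`) resolving to your VPC PrivateLink endpoint.

```python
import json

# Assumed model ID for illustration only — Bedrock exposes Claude, Llama,
# Mistral, and others behind the same InvokeModel interface.
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"

# The Anthropic-on-Bedrock messages format; the prompt is a made-up example.
request_body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [
        {
            "role": "user",
            "content": "Summarize the risk signals on this opportunity.",
        }
    ],
})

# In a real deployment the call is made from inside your VPC, e.g.:
#   client = boto3.client("bedrock-runtime")   # resolves to the PrivateLink endpoint
#   client.invoke_model(modelId=MODEL_ID, body=request_body)
```

Because the endpoint resolves privately, the same request shape works unchanged whether the target model is Claude or an open-weights alternative.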
Join the waitlist — AWS →
IBM
IBM watsonx.ai
Planned
Clairio on IBM infrastructure

The most private option we're planning. IBM watsonx supports true on-premises deployment via Red Hat OpenShift — meaning Clairio could run entirely within your own data center, with no dependency on any external cloud provider. Built for regulated industries.

  • True air-gap deployment — run fully on-prem via Red Hat OpenShift, no external cloud required
  • IBM Granite, Llama, Mistral, and OpenAI GPT-OSS models available on-premises
  • watsonx.governance layer for explainability, auditability, and AI risk management
  • Hybrid cloud support — start on IBM Cloud, extend to on-prem as compliance requires
  • Ideal for financial services, healthcare, and government revenue teams
IBM Granite 4.0 · Llama 3 · Mistral Large · On-Premises · Red Hat OpenShift · watsonx.governance
Join the waitlist — IBM →
Side-by-Side Comparison

Where we are, where we're going.

Anthropic API is our current default. AWS Bedrock and IBM watsonx are planned — the cells below describe the architecture those deployments would ship with.

| Capability | Anthropic API — Today | AWS Bedrock — Planned | IBM watsonx — Planned |
| --- | --- | --- | --- |
| Data leaves your environment | To Anthropic servers | No — stays in your AWS account | No — stays on-prem or IBM Cloud |
| True on-premises deployment | ✕ Not available | ✕ Cloud-only | ✓ Red Hat OpenShift |
| Claude model available | ✓ Latest Claude models | ✓ Claude via Bedrock | Roadmap — Granite / Llama today |
| Audit logging | Clairio-level only | ✓ CloudTrail native | ✓ watsonx.governance |
| AI governance & explainability | ✕ Not available | Partial (AWS Guardrails) | ✓ Full watsonx.governance suite |
| Setup complexity | Lowest — API key only | Medium — VPC + IAM config | Higher — OpenShift cluster required |
| Ideal for | Fast time-to-value | AWS-native enterprise teams | Regulated industries, air-gap requirements |
Design Partners · Waitlist

Need private deployment? Let's shape it together.

Private deployment is on our roadmap, not shipped yet. Our team has deep roots in enterprise infrastructure — including IBM Cloud and AWS architecture — so if your security or compliance team has specific requirements, we want to hear them now. Design partners get direct input on architecture, timelines, and priority features.

Join the Waitlist · Become a Design Partner