# Opsy

CLI DevOps agent with guardrails.

AI-powered infrastructure management from the command line. Opsy understands your AWS, Terraform, and Kubernetes context - and won't destroy prod.

## Installation

```bash
curl -fsSL https://opsy.sh/install.sh | bash
```
## Quick Start

```bash
opsy
```

### Connecting to an LLM provider

Opsy supports 75+ LLM providers and connects with existing Claude or OpenAI subscriptions.

```
/connect
```

### Selecting a cloud context

```
/aws
```

### Asking a question

```
why can't i reach my load balancer?
```
## Features

- **Full Plan Visibility** - See exactly what will happen before execution
- **Context Aware** - Auto-detects AWS account, region, and Terraform workspace
- **Danger Classification** - Color-coded commands (read, update, delete, destroy)
- **Live Streaming** - Real-time output, no truncation
- **Audit Logging** - Full trail of every operation
- **Secret Redaction** - Auto-detects and redacts API keys, tokens, and passwords
- **Run Recording** - Shareable recordings for review
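
The danger-classification idea above can be sketched as a pattern match over the command line. This is an illustrative sketch only - the `classify` function and its patterns are hypothetical, not Opsy's actual logic:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of danger classification: map a shell command to a
# risk class (read/update/delete/destroy) so it can be color-coded before
# execution. Not Opsy's real implementation.
classify() {
  case "$1" in
    *destroy*)                        echo destroy ;;
    *delete*|*remove*)                echo delete ;;
    *apply*|*scale*|*patch*|*update*) echo update ;;
    *)                                echo read ;;
  esac
}

classify "aws ec2 describe-instances"              # read
classify "kubectl scale deploy web --replicas=3"   # update
classify "terraform destroy -auto-approve"         # destroy
```

A real classifier would parse the command rather than substring-match, but the read/update/delete/destroy buckets are the same ones the feature list describes.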
## Documentation

| Tool | Guide |
|---|---|
| AWS | [docs/aws.md](docs/aws.md) - Profile/region context |
| Terraform | [docs/terraform.md](docs/terraform.md) - Plan-before-apply, state backups |
| Kubernetes | [docs/kubernetes.md](docs/kubernetes.md) - Context injection |
| Helm | [docs/helm.md](docs/helm.md) - Chart operations |
| GitHub | [docs/github.md](docs/github.md) - Repo context |
| Vercel | [docs/vercel.md](docs/vercel.md) - Team/project context |
## Examples

| Type | Description |
|---|---|
| Configs | Ready-to-use opsy configurations |
| Runbooks | Step-by-step guides opsy can follow |
| AGENTS.md | Project templates for Terraform, K8s, monorepos |
| Demos | Terraform demo projects and recordings |
## Skills

Skills teach Opsy best practices for each tool. Add them to `~/.opsy/opsy.jsonc`:

```jsonc
{
  "instructions": [
    "https://raw.githubusercontent.com/opsyhq/opsy-cli/main/skills/aws-wtf/SKILL.md",
    "https://raw.githubusercontent.com/opsyhq/opsy-cli/main/skills/aws-finops/SKILL.md"
  ]
}
```
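
If you prefer to script the setup, the same configuration can be written from the shell. The file path and the `instructions` key come from this README; the heredoc approach is just one illustrative way to create the file:

```shell
#!/usr/bin/env bash
# Write the skills config from this README to ~/.opsy/opsy.jsonc.
# Overwrites any existing file - merge by hand if you already have one.
mkdir -p ~/.opsy
cat > ~/.opsy/opsy.jsonc <<'EOF'
{
  "instructions": [
    "https://raw.githubusercontent.com/opsyhq/opsy-cli/main/skills/aws-wtf/SKILL.md",
    "https://raw.githubusercontent.com/opsyhq/opsy-cli/main/skills/aws-finops/SKILL.md"
  ]
}
EOF
```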
Or use npx to add skills:

```bash
npx add-skill opsyhq/opsy --skill aws-wtf     # AWS bill breakdown
npx add-skill opsyhq/opsy --skill aws-finops  # Cost optimization
```

See [skills/](skills/) for all available skills.
## Supported Providers

Opsy supports multiple LLM providers:
- Anthropic (Claude)
- OpenAI
- Azure OpenAI
- Google Vertex AI
- Amazon Bedrock
- Groq, Mistral, Cohere, and more
## Acknowledgements

Opsy is built on OpenCode, an open-source AI coding agent. Huge thanks to the OpenCode team for their work.