Open Source

Ship AI Features with Confidence

Open-source version control, testing, and monitoring for your AI prompts. Stop guessing. Start iterating.

Everything you need to manage AI prompts at scale

Version Control

Every change creates an immutable version. Roll back instantly, compare versions, and maintain a complete audit trail of your prompt evolution.

Multi-Provider Support

Switch between OpenAI, Google Gemini, and more with a single click. 100+ models supported. No code changes required.

Real-Time Testing

Test prompts instantly with dynamic variables. See token usage, latency, and responses before publishing to production.

Distributed Tracing

Track every execution with W3C trace context. Debug issues across services with detailed logs, timing, and error tracking.
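The traceparent header format below is the standard W3C trace context shape; whether LM SDK reads it on a given endpoint is an assumption based on the description above, so treat this as a sketch:

```typescript
import { randomBytes } from "node:crypto";

// Build a W3C traceparent value: version-traceid-parentid-flags.
// trace-id is 16 random bytes (32 hex chars), parent-id is 8 bytes (16 hex chars).
function makeTraceparent(): string {
  const traceId = randomBytes(16).toString("hex");
  const parentId = randomBytes(8).toString("hex");
  return `00-${traceId}-${parentId}-01`; // "01" flags = sampled
}

// Send it alongside your API key so the execution joins your existing trace:
// headers: { "x-api-key": "your-api-key", "traceparent": makeTraceparent() }
```

Propagating the same trace ID from your own services lets you correlate an LM SDK execution with the request that triggered it.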

Dataset Evaluation

Compare prompt performance across test datasets. Run A/B tests with up to 3 prompt versions side-by-side.

API-First Design

Simple REST API with API key authentication. Integrate prompts into any application with a single HTTP call.

From idea to production in minutes

01

Create Your Prompt

Write your system and user messages. Add variables with {{variable}} syntax. Choose your AI provider and model.
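As an illustration of the {{variable}} syntax (the real substitution happens server-side at execution time, so this helper is only a sketch of the behavior, not LM SDK's implementation):

```typescript
// Replace {{name}} placeholders with values from a variables map.
// Unknown placeholders are left untouched rather than erased.
function renderTemplate(template: string, variables: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) =>
    name in variables ? variables[name] : match
  );
}

const userMessage = "Welcome to {{product}}, {{user_name}}!";
renderTemplate(userMessage, { user_name: "Alex", product: "LM SDK" });
// → "Welcome to LM SDK, Alex!"
```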

02

Test & Iterate

Run tests with sample data. Compare outputs. Refine until perfect. Each save creates a new version you can roll back to.

03

Deploy & Monitor

Publish to production with one click. Monitor executions in real-time. Get alerts on errors and performance issues.

Works with your favorite AI providers

100+ models supported across all major providers

Simple API. Powerful results.

# Execute a prompt with variables
curl -X POST https://your-instance.com/api/v1/projects/my-app/prompts/welcome-message/execute \
  -H "x-api-key: your-api-key" \
  -H "Content-Type: application/json" \
  -d '{
    "variables": {
      "user_name": "Alex",
      "product": "LM SDK"
    }
  }'
// Response
{
  "content": "Welcome to LM SDK, Alex! Let me help you get started...",
  "usage": {
    "input_tokens": 45,
    "output_tokens": 128
  },
  "duration_ms": 892
}

Execute prompts with a single API call. Variables are automatically substituted at runtime.
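The same call can be made from TypeScript. The helper below mirrors the curl example's endpoint path, headers, and response shape; `executePrompt` and `buildExecuteUrl` are illustrative names, not an official client:

```typescript
// Response shape taken from the example above.
type ExecuteResponse = {
  content: string;
  usage: { input_tokens: number; output_tokens: number };
  duration_ms: number;
};

function buildExecuteUrl(baseUrl: string, project: string, prompt: string): string {
  return `${baseUrl}/api/v1/projects/${project}/prompts/${prompt}/execute`;
}

async function executePrompt(
  baseUrl: string,
  apiKey: string,
  project: string,
  prompt: string,
  variables: Record<string, string>
): Promise<ExecuteResponse> {
  const res = await fetch(buildExecuteUrl(baseUrl, project, prompt), {
    method: "POST",
    headers: { "x-api-key": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify({ variables }),
  });
  if (!res.ok) throw new Error(`Execute failed with status ${res.status}`);
  return (await res.json()) as ExecuteResponse;
}
```

For example, `executePrompt("https://your-instance.com", "your-api-key", "my-app", "welcome-message", { user_name: "Alex", product: "LM SDK" })` performs the same request as the curl command above.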

Quick Start

One-click deployment to Cloudflare Workers

Deploy to Cloudflare

Or clone and run locally

git clone https://github.com/shchahrykovich/lmsdk.git
cd lmsdk
npm install
npm run dev
Self-hosted — Deploy on your own infrastructure with full control
Cloudflare Workers — Built for edge deployment with D1 and R2
React Frontend — Modern UI with React 19 and Tailwind CSS
TypeScript — Full type safety across frontend and backend

Ready to ship AI features faster?

LM SDK is open source and free to use. Star us on GitHub and join the community.

MIT License