
Prompt Hosting with promptodex: Ship Better Prompts Without Redeploying

Discover how the promptodex npm module lets you host, version, and update AI prompts at runtime—without touching your deployment pipeline.

Promptodex Team

Hardcoding prompts into your application works fine—until it doesn't.

You've shipped a feature that uses a carefully crafted prompt. Users are happy. But then you notice the LLM occasionally misinterprets the instructions. A quick tweak to the prompt would fix it, but there's a problem: the prompt lives in your codebase. Changing it means a new commit, a new build, a new deployment, and potentially a full rollout process.

For a one-line change to a string.

Today, we're introducing promptodex, a tiny npm module that lets you fetch, render, and manage prompts at runtime—completely decoupled from your deployment lifecycle.

The Runtime Prompt Revolution

npm install promptodex

The promptodex module does one thing extremely well: it fetches prompts from the Promptodex registry and renders them with your variables. That's it. No AI execution, no vendor lock-in, no bloated dependencies.

import { pod } from "promptodex";

const prompt = await pod("customer-greeting", {
  customerName: "Alex",
  context: "returning customer"
});

// Send to your preferred AI provider
const response = await openai.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: prompt }]
});

The prompt customer-greeting lives on Promptodex. When you update it on the platform, every application using it gets the new version—instantly, without any code changes.
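Because the prompt is fetched at runtime, each call to pod goes over the network. If that latency matters on a hot path, a short-lived in-memory cache is an easy mitigation. Below is a minimal sketch; the cachedFetcher helper and the stubbed fetcher are our own illustration, not part of the promptodex API:

```typescript
// Minimal TTL cache around any prompt fetcher. This cache layer is an
// illustrative sketch, not something promptodex ships.
type Fetcher = (slug: string) => Promise<string>;

function cachedFetcher(fetch: Fetcher, ttlMs: number): Fetcher {
  const cache = new Map<string, { value: string; expires: number }>();
  return async (slug) => {
    const hit = cache.get(slug);
    if (hit && hit.expires > Date.now()) return hit.value; // serve from cache
    const value = await fetch(slug);
    cache.set(slug, { value, expires: Date.now() + ttlMs });
    return value;
  };
}

// Stubbed fetcher for illustration; real code would pass a function that
// calls promptodex's fetchPrompt instead.
let calls = 0;
const stub: Fetcher = async (slug) => {
  calls++;
  return `template for ${slug}`;
};
const getPrompt = cachedFetcher(stub, 60_000); // cache for 60 seconds
```

The TTL is the knob to tune: a shorter value means prompt edits on the platform propagate faster, a longer value means fewer round trips.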

Three Problems This Solves

1. Decoupling Prompts from Deployments

The traditional workflow looks like this:

  1. Write prompts in your codebase
  2. Deploy application
  3. Realize prompt needs tweaking
  4. Make code change
  5. Wait for CI/CD
  6. Deploy again
  7. Repeat

With promptodex, you skip steps 4-6 entirely. Your prompts live in Promptodex, where you (or your prompt engineers) can iterate on them independently. The application fetches the latest version at runtime.

This is especially powerful for teams where prompt engineering and application development are separate concerns. Your ML team can refine prompts while your backend team focuses on infrastructure.

2. Sharing Prompt Fragments Across Services

Modern architectures often involve multiple services that need similar AI capabilities. Maybe your web app, mobile backend, and API gateway all use the same summarization prompt.

Instead of copying that prompt into three codebases (and keeping them in sync), create it once on Promptodex:

// web-service/src/features/summarize.ts
const prompt = await pod("company/summarize-text", {
  content: articleText,
  maxLength: "200 words"
});

// mobile-backend/src/handlers/summarize.ts
const prompt = await pod("company/summarize-text", {
  content: userInput,
  maxLength: "100 words"
});

// api-gateway/src/endpoints/summarize.ts
const prompt = await pod("company/summarize-text", {
  content: payload.text,
  maxLength: payload.maxWords + " words"
});

Same prompt slug, same source of truth, potentially different variable values. Update the prompt once, and all three services benefit.
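To keep the shared slug from being scattered as a string literal across services, one option is a thin wrapper per service. The sketch below is our own convention, not part of promptodex; PodFn simply mirrors the pod(slug, variables) call shape shown above, and the stub stands in for the real import:

```typescript
// A service-local wrapper keeps the shared slug in exactly one place.
// PodFn mirrors pod(slug, variables); the wrapper itself is our own
// convention, not a promptodex feature.
type PodFn = (slug: string, vars: Record<string, string>) => Promise<string>;

function makeSummarizer(pod: PodFn) {
  return (content: string, maxLength = "200 words") =>
    pod("company/summarize-text", { content, maxLength });
}

// Stubbed pod for illustration; real code would use
// `import { pod } from "promptodex"` instead.
const fakePod: PodFn = async (slug, vars) =>
  `[${slug}] Summarize in ${vars.maxLength}: ${vars.content}`;

const summarize = makeSummarizer(fakePod);
```

Injecting pod as a parameter also makes the wrapper trivial to unit-test without hitting the registry.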

3. A Central Repository of Prompt Patterns

Every team eventually builds a collection of prompts that work well for their domain. Customer service scripts, code review templates, data extraction patterns—these become organizational knowledge.

Promptodex becomes that central repository:

  • Version Control: Every edit creates a new revision with an optional message, just like Git commits
  • Forking: Found a community prompt that's close to what you need? Fork it and customize
  • Discovery: Browse public prompts for inspiration or proven patterns
  • Private Prompts: Keep proprietary prompts secure with API key authentication

// Access a private team prompt
const prompt = await pod(
  "internal/sales-email-template",
  { lead: customerData },
  { apiKey: process.env.PROMPTODEX_API_KEY }
);

Versioning: When You Need Stability

Sometimes you don't want the latest version. Maybe you're running A/B tests, or you need to ensure a critical workflow uses a tested prompt.

Pin to a specific version with the @ syntax:

// Always fetch version 3, even if newer versions exist
const prompt = await pod("code-review@3", {
  code: pullRequestDiff,
  language: "TypeScript"
});

This gives you the best of both worlds:

  • Development: Iterate rapidly on the latest version
  • Production: Pin to tested versions for stability
  • Rollback: Switch back to a previous version instantly
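Conceptually, the pin syntax amounts to splitting the reference at the last @: everything before it is the slug, everything after it is the version number. The parser below is purely our own sketch of that convention, not promptodex internals:

```typescript
// Illustrative parser for the "slug@version" pin syntax described above.
// This is a sketch of the convention, not promptodex's actual code.
function parsePin(ref: string): { slug: string; version?: number } {
  const at = ref.lastIndexOf("@");
  if (at <= 0) return { slug: ref }; // no pin present
  const version = Number(ref.slice(at + 1));
  return Number.isInteger(version) && version > 0
    ? { slug: ref.slice(0, at), version }
    : { slug: ref }; // not a numeric pin; treat the whole string as the slug
}
```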

The API at a Glance

The module exports three functions:

pod(slug, variables?, options?)

The all-in-one function. Fetches and renders a prompt in a single call.

const prompt = await pod("greeting", { name: "World" });
// "Hello World, welcome to Promptodex!"

fetchPrompt(slug, options?)

Just fetch the raw template without rendering. Useful when you need the template content itself.

const { content } = await fetchPrompt("greeting");
// "Hello {{name}}, welcome to Promptodex!"

renderPrompt(template, variables?)

Client-side rendering. Already have the template? Render it locally without a network call.

const rendered = renderPrompt(
  "Summarize this: {{content}}",
  { content: "Long article text..." }
);
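The examples above imply that rendering is plain {{variable}} substitution. As a mental model, a local stand-in might look like this sketch (our own illustration, not the module's actual implementation):

```typescript
// Minimal {{variable}} substitution, mirroring the rendering behavior the
// examples above imply. A sketch only, not promptodex's real renderer.
function renderTemplate(
  template: string,
  vars: Record<string, string> = {}
): string {
  return template.replace(/\{\{\s*(\w+)\s*\}\}/g, (match, name) =>
    name in vars ? vars[name] : match // leave unknown placeholders intact
  );
}
```

This pairs naturally with fetchPrompt: fetch the template once, then render it locally for each request with different variables.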

Tiny Footprint, Zero Dependencies

We believe tools should be simple. The promptodex module is under 200 lines of code with zero dependencies. It uses the native fetch API (Node.js 18+), which means:

  • No axios, no node-fetch, no polyfills
  • Minimal bundle size
  • No transitive dependency security concerns
  • TypeScript types included

Getting Started

  1. Install the module:

    npm install promptodex
    
  2. Find or create a prompt on Promptodex

  3. Fetch and use it:

    import { pod } from "promptodex";
    
    const prompt = await pod("your-prompt-slug", {
      variable1: "value1"
    });
    
  4. Send to your AI provider (OpenAI, Anthropic, Google, etc.)

What's Next?

The promptodex module is just the beginning. We're building an ecosystem where prompts are first-class citizens:

  • pod-cli: Run prompts directly from your terminal
  • Version diffs: Visualize changes between prompt versions
  • Analytics: Track which prompts are being used and how

Prompts are becoming the new configuration layer for AI applications. With promptodex, you can manage them with the flexibility they deserve.


Ready to try it? Install promptodex and start hosting your prompts today.

npm install promptodex