📢  New Launch: LLM Model Directory

Level up your prompt management

Test, collaborate on, version, and deploy prompts from a single place with PromptHub.

Request access
GDPR
Loved by thousands
PromptHub Prompt Management Library Page
Test

The ultimate prompt playground

Put an end to constant copying and pasting

Use variables to simplify prompt creation

A text box in PromptHub with variables and text

Say goodbye to spreadsheets

Easily compare outputs side-by-side when tweaking prompts

Testing prompts side-by-side

Bring your datasets and test prompts at scale with batch testing

Make sure your prompts are consistent by testing with different models, variables, and parameters

A text box in PromptHub with variables and text
"Seeing many outputs with batch testing gives me more confidence in the statistical significance of the changes I make If you're not batch testing your prompts, you're missing out on some low-hanging fruit."
Jade Samadi
Founder, Smart Recover

The easiest way to test chat prompts

Stream two conversations and test different models, system messages, or chat templates

A list of requests in a table

Test across different models

Compare outputs side-by-side

OpenAI
Anthropic
Azure
Google
Meta
Bedrock
Mistral
More

Chain prompts with a few clicks

Take your LLM experiences to the next level by chaining prompts together, no code needed

Multiple prompts in a chain flow together
Collaborate

Collaboration features made for writing prompts

Write better prompts with better versioning

Commit prompts, create branches, and collaborate seamlessly

Version history UI from PromptHub

Automatically track what changed

We detect prompt changes, so you can focus on outputs

Diff checker view of 2 system messages and prompts

Open merge requests for review

Review changes as a team, approve new versions, and keep everyone on the same page

PromptHub interface for analyzing merge requests
Deploy

Seamlessly deploy to all your environments through branches


Run the latest prompt commit from any branch in PromptHub by making a POST request to the https://app.prompthub.us/api/v1/projects/{id}/run?branch=branch%20name endpoint.

Retrieve prompts from any branch, inject variables with your app data, and pass along any metadata.

SAMPLE
{
  "branch": "staging",
  "metadata": {
    "user_email": user.email
  },
  "variables": {
    "type": myVariable,
    "subject": user.subject
  }
}
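
For reference, a minimal TypeScript sketch of that POST call might look like the snippet below. The URL and body fields follow the documentation above; the project id, the PROMPTHUB_API_KEY environment variable, and the Bearer auth header are placeholder assumptions for illustration, not confirmed API details.

// Hypothetical sketch: run the latest commit on the "staging" branch.
const PROJECT_ID = 49;                                  // hypothetical project id
const API_KEY = process.env.PROMPTHUB_API_KEY ?? "";    // assumed env var name

async function runPrompt(userEmail: string, subject: string): Promise<unknown> {
  const url = `https://app.prompthub.us/api/v1/projects/${PROJECT_ID}/run?branch=staging`;
  const res = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,               // assumed auth scheme
    },
    body: JSON.stringify({
      branch: "staging",
      metadata: { user_email: userEmail },
      variables: { type: "support_ticket", subject },   // example variable values
    }),
  });
  if (!res.ok) throw new Error(`Run failed: ${res.status}`);
  return res.json();
}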

Retrieve the latest prompt commit from any branch in PromptHub by making a GET request to the https://app.prompthub.us/api/v1/projects/{id}/head?branch=branch%20name endpoint.

Pass the branch name and ensure any environment you have uses the most up-to-date prompt, without any engineering effort.

SAMPLE
{
  "data": {
      "id": "1951",
      "project_id": "49",
      "branch_id": "235",
      "provider": "OpenAI",
      "model": "gpt-3.5-turbo",
      "prompt": "Write a song",
      "hash": "GzeuWmzN",
      "commit_title": "Updated temperature",
      "prompt_tokens": 135,
      "branch": {
          "name": "Master"
      },
      "configuration": {
          "id": "15",
          "max_tokens": 4096,
          "temperature": 0.5
      }
  }
}
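
As an illustration, a small TypeScript helper could fetch the head commit and read the fields shown in the sample response above. The HeadResponse type is inferred from that sample only, and the auth header is again an assumption.

// Minimal sketch: fetch the latest commit (head) on a branch.
interface HeadResponse {
  data: {
    provider: string;
    model: string;
    prompt: string;
    hash: string;
    configuration: { max_tokens: number; temperature: number };
  };
}

async function getHead(projectId: number, branch: string): Promise<HeadResponse["data"]> {
  const url = `https://app.prompthub.us/api/v1/projects/${projectId}/head?branch=${encodeURIComponent(branch)}`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${process.env.PROMPTHUB_API_KEY ?? ""}` }, // assumed auth scheme
  });
  if (!res.ok) throw new Error(`Head request failed: ${res.status}`);
  const body = (await res.json()) as HeadResponse;
  return body.data;
}

// e.g. const head = await getHead(49, "staging");
//      console.log(head.model, head.configuration.temperature);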

Return a list of all your prompts by making a GET request to the https://app.prompthub.us/api/v1/teams/{id}/projects endpoint.

Filter by group, model, or provider, and manage your prompts programmatically.

SAMPLE
{
  "data": [
    {
      "id": "1",
      "name": "LinkedIn Post Generator",
      "description": "Autogenerate posts",
      "head": {
        "id": "1",
        ...
      }
    },
    {
      "id": "2",
      "name": "Feedback Classifier",
      "description": "Classify feedback",
      "head": {
        "id": "2",
        ...
      }
    }
  ]
}
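
A similar TypeScript sketch could list a team's prompts and work with the names and descriptions returned. The team id and auth header are placeholders, and the filtering shown is done client-side purely for illustration.

// Minimal sketch: list a team's prompts and pick out names and descriptions.
async function listProjects(teamId: number): Promise<{ id: string; name: string; description: string }[]> {
  const url = `https://app.prompthub.us/api/v1/teams/${teamId}/projects`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${process.env.PROMPTHUB_API_KEY ?? ""}` }, // assumed auth scheme
  });
  if (!res.ok) throw new Error(`List request failed: ${res.status}`);
  const body = (await res.json()) as { data: { id: string; name: string; description: string }[] };
  return body.data;
}

// e.g. const generators = (await listProjects(1)).filter(p => p.name.includes("Generator"));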

Deploy your prompts as a shareable form with a few clicks. Control access, embed anywhere, and distribute the power of your prompts.
a wireframe of a form

Deploy your prompts into your workflows through our Zapier integration. Pass variables, log requests, and centralize your prompts.
Zapier logo

Helping teams get better outputs


As a medical doctor and prompt engineering lead, I needed a prompt management tool that would be simple and powerful for my team. PromptHub is the best in the market when it comes to this balance.

Kieran McLeod
Medical Knowledge Lead, Heidi Health
Monitor

Easily monitor requests, costs, and latencies

See all requests in one place

See how users are interacting with your prompts

A list of requests in a table

Easily re-run and debug requests

🚀 Run in playground

Meaningful insights, automatically

a graph with a linear gradient as the line showing average latency

Professionally built templates, ready for you

🔍
AutoHint
Let the LLM enhance your prompt
🔗
Chain of Density
Generate human-level summaries
🔙
Step-back prompting
Take a step back for better results
💀
Skeleton of thought
Win more RFPs
💼
LinkedIn post generator
Content optimized for LinkedIn
✍️
Product description writer
E-commerce product management
🚀
Product Hunt Tagline
Taglines based on successful launches
💥
High impact keyword analysis
Find your next SEO goldmine
🌴
Tree of thoughts
Traverse many paths for a solution
👥
Multi-persona collaboration
A new prompting method
❤️
EmotionPrompt
Add emotional stimuli -> better outputs
📈
Growth Hacking Techniques
10 growth ideas for your company
🏷️
Meta tags for product pages
Polish up product SEO
📋
User discovery survey
Ask the right questions
📢
Pitch presentation
Deliver a pitch that cuts through noise

Better LLM outputs are a click away

PromptHub is a better way to test, manage, and deploy prompts for your AI products.