Building a Better Prompt Cost Tool

By Charlie Groves

I’ve always found prompt cost calculators annoying: the workflow is split across two websites.

You have to count your prompt’s tokens on one site, then go to a second site to look up how much those tokens cost. It got me thinking: why isn’t there a single tool that calculates both the tokens and the cost in one place?

So, I built costofprompts.com to simplify this process. You paste your prompt text, and it automatically counts the tokens and shows the cost for all the well-known models (let me know if you want one added). It’s designed to be user-friendly, with a simple, clean interface and no distractions.
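For the curious, token counting is just running the text through a tokenizer. Here’s a minimal sketch using the js-tiktoken package with the cl100k_base encoding; this is an illustration rather than the site’s actual code, and different models use different encodings:

import { getEncoding } from "js-tiktoken";

// cl100k_base is the encoding used by several OpenAI models; treat the
// count as an approximation for models with other tokenizers.
const encoding = getEncoding("cl100k_base");

export function countTokens(prompt: string): number {
  return encoding.encode(prompt).length;
}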

But there’s more: I added a feature that other prompt cost calculators are missing, an estimated output tokens field. For example, if I know my prompt’s response will be roughly 50 tokens, I can enter that number and get a more accurate total. This matters because input and output tokens are priced differently, something other tools don’t account for.
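The math behind this is straightforward once you have both token counts. Here’s a minimal sketch; the ModelPricing shape and the prices in the example are illustrative placeholders, not the site’s actual code:

// Provider prices are usually quoted in USD per million tokens.
interface ModelPricing {
  inputPerMillion: number;
  outputPerMillion: number;
}

function estimateCost(
  inputTokens: number,
  estimatedOutputTokens: number,
  pricing: ModelPricing
): number {
  return (
    (inputTokens / 1_000_000) * pricing.inputPerMillion +
    (estimatedOutputTokens / 1_000_000) * pricing.outputPerMillion
  );
}

// Example: a 1,000-token prompt with an estimated 50-token response on a
// hypothetical model priced at $3 in / $15 out per million tokens:
// (1000 / 1e6) * 3 + (50 / 1e6) * 15 = $0.003 + $0.00075 = $0.00375
estimateCost(1000, 50, { inputPerMillion: 3, outputPerMillion: 15 });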

Tech Behind the Tool

This is built with Next.js. One decision worth calling out is generating the page metadata dynamically from the model list:


import { models } from "./lib/models";

export function generateMetadata() {
  // Assumes each entry in models exposes a display name,
  // e.g. { name: "GPT-4o", ... }
  const modelNames = models.map((model) => model.name);
  const joinedModelNames = modelNames.join(", ");
  const modelCosts = modelNames.map((name) => `${name} cost`).join(", ");

  return {
    title: `LLM Prompt Cost Calculator | Estimate ${joinedModelNames} Costs`,
    description: `Calculate the cost of your prompts for popular Large Language Models like ${joinedModelNames}...`,
    keywords: `LLM cost calculator, ${modelCosts}, Token count estimator, Prompt cost calculator`,
  }
}
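A nice side effect: in the Next.js App Router, an exported generateMetadata function is picked up automatically, so the title, description, and keywords regenerate from the model list whenever a model is added, with no manual editing.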

Hope this is useful, and feel free to reach out if you have any feedback or want to suggest additional models.