Building a Better Prompt Cost Tool
I’ve always found prompt cost calculators annoying because the job is split across two websites: you count your prompt’s tokens on one site, then go to another to look up how much those tokens cost. It got me thinking: why isn’t there a single tool that calculates both the tokens and the cost in one place?
So, I built costofprompts.com to simplify this process. Paste your prompt text and it automatically counts the tokens and shows the cost for all well-known models (let me know if you want one added). The interface is simple and clean, with no distractions.
But there’s more: I added a feature that’s missing from other prompt cost calculators — an estimated output tokens field. For example, if I know my prompt’s response will be roughly 50 tokens, I can enter that number and get a more accurate cost. This accounts for the different prices of input and output tokens, something other tools fail to consider.
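As a rough sketch, the math behind that field looks something like this. The function and field names here are illustrative, and the prices are placeholders rather than live rates:

```typescript
// Placeholder pricing shape: USD per 1M tokens, input and output priced separately.
interface ModelPricing {
  inputPerMillion: number;
  outputPerMillion: number;
}

// Total cost = input tokens at the input rate + estimated output tokens
// at the (usually higher) output rate.
function promptCost(
  inputTokens: number,
  estimatedOutputTokens: number,
  pricing: ModelPricing
): number {
  const inputCost = (inputTokens / 1_000_000) * pricing.inputPerMillion;
  const outputCost = (estimatedOutputTokens / 1_000_000) * pricing.outputPerMillion;
  return inputCost + outputCost;
}

// A 1,200-token prompt with ~50 output tokens, at placeholder $30/$60 rates:
const cost = promptCost(1200, 50, { inputPerMillion: 30, outputPerMillion: 60 });
// ≈ $0.036 input + $0.003 output ≈ $0.039 total
```

Skipping the output side entirely, as most calculators do, understates the cost most for models whose output rate is a large multiple of their input rate.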
Tech Behind the Tool
The site is built with Next.js. Here are some of the decisions I made:
- Token Counting: To count the tokens in a prompt, I used the `gpt4-tokenizer` package, which estimates token counts based on the GPT-4 tokenization algorithm.
- Model Data: I have predefined pricing information for the models the user can select from (like GPT-4 and o1). Each model has its own input and output token prices, which feed into the cost calculation. I also use this list of models to define dynamic metadata for the page, so when I add a new model, the page title, description, etc. update automatically.
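For context, here is a hypothetical sketch of what that models list could look like. The field names and prices are assumptions for illustration, not the actual contents of the site’s `lib/models` file:

```typescript
// Illustrative model record: display name plus separate input/output
// prices in USD per 1M tokens (placeholder values, not live rates).
export interface Model {
  name: string;
  inputPerMillion: number;
  outputPerMillion: number;
}

// One shared list drives both the cost calculation and the page metadata,
// so adding a model here propagates everywhere.
export const models: Model[] = [
  { name: "GPT-4", inputPerMillion: 30, outputPerMillion: 60 },
  { name: "o1", inputPerMillion: 15, outputPerMillion: 60 },
];
```

Keeping pricing and names in a single typed list is what makes the dynamic metadata below cheap to maintain: there is exactly one place to edit when a new model ships.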
Here is more on the dynamic metadata:
```typescript
import { models } from "./lib/models";

export function generateMetadata() {
  // Derive the display names from the shared models list
  // (assuming each entry has a `name` field).
  const modelNames = models.map((model) => model.name);
  const joinedModelNames = modelNames.join(", ");
  const modelCosts = modelNames.map((name) => `${name} cost`).join(", ");

  return {
    title: `LLM Prompt Cost Calculator | Estimate ${joinedModelNames} Costs`,
    description: `Calculate the cost of your prompts for popular Large Language Models like ${joinedModelNames}...`,
    keywords: `LLM cost calculator, ${modelCosts}, Token count estimator, Prompt cost calculator`,
  };
}
```
Hope this is useful, and feel free to reach out if you have any feedback or want to suggest additional models.