Pricing

As developers, we understand that you might not use the console all the time. Our pricing follows a pay-as-you-go model: you only pay for what you use, measured in the number of tokens you consume.

There are no hidden fees - if you don't use the service, you don't pay anything!

What are tokens?

A token is a chunk of text that can be as short as one character or as long as one word.

How many tokens is one prompt?

It depends on the size of the question you ask, the context our CLI provides, and the size of the response ChatGPT returns. As a rule of thumb, the ChatGPT API averages around 1 to 1.5 tokens per English word.
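
If you want a more precise number, the sketch below shows one way to count tokens yourself. It assumes the gpt-3.5-turbo tokenizer and uses OpenAI's tiktoken library, which is separate from our CLI; the sample prompt is only an illustration.

    # Estimate the token count of a piece of text, assuming the gpt-3.5-turbo tokenizer.
    # Requires: pip install tiktoken
    import tiktoken

    def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
        """Return the number of tokens the model's tokenizer produces for this text."""
        encoding = tiktoken.encoding_for_model(model)
        return len(encoding.encode(text))

    prompt = "Explain what git rebase does in one sentence."
    print(count_tokens(prompt))  # typically about 1 to 1.5 tokens per English word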

$0.003/1K tokens

  • No minimum usage.
  • No hidden fees.
  • No inactivity fees.
  • Billed once per month.
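
To get a feel for what that rate means in practice, here is a back-of-envelope cost calculation; the 500-token total below is an illustrative assumption, not a measured prompt size.

    # Cost of a single request at $0.003 per 1,000 tokens.
    PRICE_PER_1K_TOKENS = 0.003  # USD

    def cost_usd(total_tokens: int) -> float:
        return total_tokens / 1000 * PRICE_PER_1K_TOKENS

    # e.g. a 200-token prompt plus a 300-token response = 500 tokens
    print(f"${cost_usd(500):.4f}")  # prints $0.0015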