This article will help you understand token consumption in AIssistify services.
What are tokens in the context of AI language models?
Tokens are the building blocks of text. A token is a manageable unit of text such as a word, part of a word, a single character, or a punctuation mark. For example, in the sentence "I love cats", each word is a single token. AI language models split text into tokens in order to understand and process it.
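If you want to see how a sentence splits into tokens, the sketch below uses OpenAI's open-source tiktoken library. This is an assumption on our part: AIssistify's models are based on OpenAI models, so their tokenizer is a reasonable proxy, but exact splits are not guaranteed to match.

```python
# Minimal tokenization sketch using OpenAI's tiktoken library.
# Install first with: pip install tiktoken
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")
tokens = encoding.encode("I love cats")

print(len(tokens))              # 3: each word in this short sentence is one token
print(tokens)                   # the three integer token IDs
print(encoding.decode(tokens))  # "I love cats"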
How are tokens calculated and charged in AIssistify services?
In AIssistify, charges are based on the tokens consumed by both your request (the prompt or question) and the generated response.
We offer two types of models: Basic (GPT-3.5) and Premium (GPT-4). With GPT-3.5, tokens are counted at face value. With GPT-4, because it is a more advanced and computationally expensive model, token consumption is approximately five times higher.
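As a rough sketch of what this means for billing, the snippet below applies the "approximately five times" figure as a flat multiplier. The names BILLING_MULTIPLIER and billed_tokens are illustrative only and are not part of any AIssistify API.

```python
# Hypothetical billing sketch, assuming the ~5x GPT-4 figure
# is applied as a flat multiplier on raw token usage.
BILLING_MULTIPLIER = {
    "gpt-3.5": 1,  # Basic: tokens charged at face value
    "gpt-4": 5,    # Premium: roughly five times the token cost
}

def billed_tokens(model: str, prompt_tokens: int, completion_tokens: int) -> int:
    """Estimate billed tokens for one request plus its response."""
    raw = prompt_tokens + completion_tokens
    return raw * BILLING_MULTIPLIER[model]

print(billed_tokens("gpt-3.5", 200, 300))  # 500
print(billed_tokens("gpt-4", 200, 300))    # 2500
```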
How can I estimate the number of tokens in my text?
On average, 1000 tokens are roughly equivalent to 750 words. The actual token count varies with factors such as sentence length and word complexity. For a quick, exact count, you can use OpenAI's tokenizer tool at: https://platform.openai.com/tokenizer
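For a quick back-of-the-envelope estimate in code, the hypothetical helper below simply inverts the 1000-tokens-per-750-words ratio mentioned above; treat its output as an approximation, not an exact count.

```python
# Rough estimate based on the rule of thumb 1000 tokens ~= 750 words.
# For an exact count, use the tokenizer tool linked above.
def estimate_tokens(word_count: int) -> int:
    """Approximate token count from a word count."""
    return round(word_count * 1000 / 750)

print(estimate_tokens(750))   # 1000
print(estimate_tokens(1500))  # 2000
```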
How does token consumption work for AIssistify AI Image Generation tools?
AIssistify also offers AI Image Generation tools, which consume tokens differently. Each image generated with these tools has a fixed cost of 500 tokens.
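Since the cost is a flat 500 tokens per image, budgeting for image generation is simple multiplication, as in this small illustrative sketch:

```python
# Illustrative sketch of the fixed per-image cost described above.
TOKENS_PER_IMAGE = 500  # fixed cost per generated image

def image_generation_tokens(num_images: int) -> int:
    return num_images * TOKENS_PER_IMAGE

print(image_generation_tokens(4))  # 2000 tokens for four images
```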
Can I estimate the number of tokens per page?
It's difficult to give a precise tokens-per-page figure because token consumption depends on several factors. As a rough guide, 1000 tokens equal about 750 words, so a typical page of around 500 words would come to roughly 650-700 tokens. Longer sentences or complex words may use more tokens.