Tokens, the chunks of text (often fragments of words) that language models process, are the units vendors use to price their APIs. Different vendors, such as OpenAI and Anthropic, use different tokenization methods and charge different per-token rates depending on whether the tokens are input or output and on the size of the model.
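As a rough illustration of the arithmetic, and not any vendor's actual rates, the sketch below counts tokens with OpenAI's open-source tiktoken tokenizer and multiplies by placeholder per-token prices; the prompt, reply, and price figures are all made up for the example.

```python
# Illustrative sketch only: token counts via the tiktoken tokenizer;
# the per-million-token prices below are placeholders, not real vendor rates.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")  # encoding used by several OpenAI models

prompt = "Tell me more about this running shoe."
reply = "It has a carbon plate, weighs 230 grams, and comes in four colors."

input_tokens = len(enc.encode(prompt))
output_tokens = len(enc.encode(reply))

# Hypothetical prices per million tokens; output typically costs more than input.
INPUT_PRICE_PER_M = 1.00
OUTPUT_PRICE_PER_M = 3.00

cost = (input_tokens * INPUT_PRICE_PER_M + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000
print(input_tokens, output_tokens, f"${cost:.6f}")
```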
Whatever the pricing scheme, the longer the response an LLM returns, the higher the token count, according to Wilkinson.
“For interactive conversation ads, advertisers can get charged based on the length of the conversation,” said Wilkinson. “The more words that you share with the LLM, the deeper conversation you’re having. The longer a consumer stays in a conversation, the more information they find out about the product and are more likely to buy.”
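That compounding is easier to see in a small sketch. Chat APIs are typically stateless, so each new turn resends the prior exchange as input; the turn lengths below are invented numbers, not figures from any ad platform's billing, and only illustrate how input tokens pile up as a conversation deepens.

```python
# Illustrative only: a stateless chat API resends the whole history each turn,
# so billable input tokens grow with conversation length. Token counts are made up.
turns = [
    (30, 60),   # (user-message tokens, model-reply tokens) for turn 1
    (25, 70),   # turn 2
    (20, 80),   # turn 3
]

history = 0          # tokens of accumulated conversation context
billed_input = 0
billed_output = 0
for user_toks, reply_toks in turns:
    billed_input += history + user_toks   # prior context plus new message sent as input
    billed_output += reply_toks
    history += user_toks + reply_toks     # the reply joins the context for the next turn

print(billed_input, billed_output)  # input charges grow faster as the conversation deepens
```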