AWS – Count Tokens API now available for Anthropic’s Claude models in Amazon Bedrock
The Count Tokens API is now available in Amazon Bedrock, enabling you to determine the token count for a given prompt or input being sent to a specific model ID prior to performing any inference.
By surfacing a prompt’s token count, the Count Tokens API allows you to project your costs more accurately and gives you greater transparency and control over your AI model usage. It lets you proactively manage your token limits on Amazon Bedrock, helping you optimize usage and avoid unexpected throttling. It also helps ensure your workloads fit within a model’s context length limit, enabling more efficient prompt optimization.
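As a minimal sketch of the workflow above, the snippet below builds a Converse-style token-count request and reads back the input token count before running any inference. The exact operation name (`count_tokens`), request shape, response field (`inputTokens`), and the model ID shown are assumptions based on typical Bedrock Runtime conventions; consult the Count Tokens API documentation for the authoritative interface.

```python
# Hedged sketch: count tokens for a prompt before invoking a Claude model
# on Amazon Bedrock. Assumes boto3's "bedrock-runtime" client exposes a
# count_tokens operation taking a modelId plus a Converse-style input, and
# that the response carries an "inputTokens" integer. Model ID is an example.
MODEL_ID = "anthropic.claude-3-5-sonnet-20240620-v1:0"  # hypothetical example


def build_count_tokens_request(model_id: str, prompt: str) -> dict:
    """Build the assumed request payload for a Converse-style token count."""
    return {
        "modelId": model_id,
        "input": {
            "converse": {
                "messages": [
                    {"role": "user", "content": [{"text": prompt}]},
                ],
            },
        },
    }


def count_tokens(client, model_id: str, prompt: str) -> int:
    """Call the (assumed) Count Tokens API and return the input token count."""
    request = build_count_tokens_request(model_id, prompt)
    response = client.count_tokens(**request)
    return response["inputTokens"]


if __name__ == "__main__":
    import boto3

    # Requires AWS credentials and a region where Claude models are supported.
    bedrock_runtime = boto3.client("bedrock-runtime")
    n = count_tokens(bedrock_runtime, MODEL_ID, "Hello, Claude!")
    print(f"Prompt would consume {n} input tokens.")
```

A pre-flight check like this can be paired with a model’s documented context length to reject or trim oversized prompts before they ever reach the inference endpoint.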
At launch, the Count Tokens API supports Claude models, with the functionality available in all regions where these models are offered. For more information about this new feature, including supported models and use cases, visit the Count Tokens API documentation.