Invoke Amazon Bedrock models with AWS_BEDROCK_AI_QUERY by providing a Bedrock model ID, a JSON request body, and a LOCATION containing your AWS credentials. Alternatively, you can use AI_QUERY to invoke a model with a simple text prompt, a Bedrock endpoint, and a LOCATION.
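The two call styles can be sketched as follows. This is illustrative only: the argument order and the shape of the LOCATION argument are assumptions here; see the function references linked below for the exact signatures. The JSON body follows the Bedrock request format for the model family you invoke (the Anthropic Messages format is shown).

```sql
-- Sketch only: argument order is an assumption; see AWS_BEDROCK_AI_QUERY reference.
-- Invoke a specific Bedrock model with a raw JSON request body:
SELECT AWS_BEDROCK_AI_QUERY(
    'anthropic.claude-3-haiku-20240307-v1:0',   -- Bedrock model ID
    '{"anthropic_version": "bedrock-2023-05-31",
      "max_tokens": 256,
      "messages": [{"role": "user", "content": "Say hello."}]}',
    my_bedrock_location                         -- LOCATION holding AWS credentials
);

-- Or pass a plain text prompt with AI_QUERY (endpoint and argument order assumed):
SELECT AI_QUERY(
    'Summarize what Firebolt does in one sentence.',
    'bedrock-runtime.us-east-1.amazonaws.com',
    my_bedrock_location
);
```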
LLM invocations in Firebolt count towards your account’s daily token budget. For details on how to set your token budget and check your current usage, see the sections below: Set your LLM token budget and Check your LLM token quota usage.
LLM token budget accounting is not available in Firebolt Core.
Create a Bedrock LOCATION
Create a LOCATION once and reuse it wherever you need to call Bedrock models.
Authentication options and examples
- Access key and secret
- Temporary credentials (access key, secret, session token)
- IAM role ARN
- IAM role ARN with external ID
To create a LOCATION using an IAM role (with or without an external ID), see Use AWS roles to access Bedrock.
For all options and parameters, see CREATE LOCATION (Amazon Bedrock).
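As a sketch of the first option (access key and secret), the statement below follows the general CREATE LOCATION pattern; the exact clause names for Bedrock may differ, so treat this as an assumption and check the reference above before use.

```sql
-- Sketch only: SOURCE and CREDENTIALS clause names are assumptions;
-- see CREATE LOCATION (Amazon Bedrock) for the exact syntax.
CREATE LOCATION my_bedrock_location WITH
  SOURCE = 'AMAZON_BEDROCK'
  CREDENTIALS = (
    AWS_ACCESS_KEY_ID = '<your_access_key>'
    AWS_SECRET_ACCESS_KEY = '<your_secret_key>'
  );
```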
Quick examples
Use these examples to try AI in Firebolt. For the full function reference and more details, see AWS_BEDROCK_AI_QUERY and AI_QUERY.
Invoke a model
In the examples below, my_bedrock_location refers to a LOCATION object that you create using one of the methods described above (access keys, temporary credentials, or IAM role).
Invoke the LLM on multiple rows
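Calling the function in a SELECT list applies it once per row, so each row consumes tokens. The table and column names below are hypothetical, and the function signature is assumed; note that in real use the interpolated text would also need JSON escaping.

```sql
-- Hypothetical table/columns; signature assumed. Each row triggers one model call.
SELECT
    ticket_id,
    AWS_BEDROCK_AI_QUERY(
        'anthropic.claude-3-haiku-20240307-v1:0',
        '{"anthropic_version": "bedrock-2023-05-31",
          "max_tokens": 128,
          "messages": [{"role": "user",
                        "content": "Summarize this support ticket: '
                       || ticket_text || '"}]}',   -- escape quotes in real use
        my_bedrock_location
    ) AS summary
FROM support_tickets
LIMIT 100;   -- cap the row count to keep token usage bounded
```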
Sentiment analysis
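A common per-row pattern is to constrain the prompt so the model returns a single label. The reviews table here is hypothetical and the signature is assumed, as above.

```sql
-- Hypothetical table/columns; signature assumed.
SELECT
    review_id,
    AWS_BEDROCK_AI_QUERY(
        'anthropic.claude-3-haiku-20240307-v1:0',
        '{"anthropic_version": "bedrock-2023-05-31",
          "max_tokens": 8,
          "messages": [{"role": "user",
                        "content": "Reply with exactly one word - positive, negative, or neutral - for the sentiment of: '
                       || review_text || '"}]}',   -- escape quotes in real use
        my_bedrock_location
    ) AS sentiment
FROM product_reviews;
```

Keeping max_tokens small for classification prompts like this also limits how quickly the query draws down your daily token budget.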
Set or change your LLM token budget
Set your account’s daily LLM token budget to control how many tokens AI functions such as AWS_BEDROCK_AI_QUERY and AI_QUERY can process each day. By default, new accounts have a zero token budget.
To set or change the budget, use ALTER ACCOUNT.
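A sketch of such a statement is below; the property name is an assumption, so check the ALTER ACCOUNT reference for the actual setting.

```sql
-- Sketch only: the property name LLM_TOKEN_BUDGET is an assumption;
-- see ALTER ACCOUNT for the exact syntax and property name.
ALTER ACCOUNT my_account SET LLM_TOKEN_BUDGET = 1000000;  -- tokens per day
```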
Check your LLM token quota and daily usage
Once the daily quota is exhausted, calls to AWS_BEDROCK_AI_QUERY will fail until the limit resets or you increase the budget.
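It is worth checking remaining quota before running a large batch. The system view name below is purely hypothetical; consult the Firebolt documentation for the actual view that exposes LLM token quota and usage.

```sql
-- Hypothetical view name: replace with the actual system view
-- documented for LLM token quota and daily usage.
SELECT * FROM information_schema.llm_token_usage;
```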
Using LLM functions such as AWS_BEDROCK_AI_QUERY on large tables or with many rows can quickly exhaust your daily LLM token budget and may result in significant costs in your AWS account. Always review your expected token usage and budget before running large-scale AI queries.