Overview
The AI Transformation component applies LLM-powered transformations to fields in your Integrate.io ETL pipeline. It operates per record (one request per configured field per record), supports prompt interpolation, and emits typed outputs that you define.
LLM API Key Management Options
The AI Transformation component supports two approaches to managing your LLM API key:
1. Bring Your Own LLM Key (BYO Key)
You can supply your own LLM provider API key (for example, an OpenAI key) directly in the Integrate.io platform. All AI Transformation usage will then be billed and processed through your own LLM provider account.
2. Integrate.io Managed Key
You can use an Integrate.io managed key directly from the AI Transformation component. This generates a unique API credential for your Integrate.io account, and all AI Transformation usage will be processed through this Integrate.io managed key, scoped specifically to your account.
Configuration
1. Model Settings
- Model selection: Choose a Chat Completions model (e.g., gpt-4o, gpt-4o-mini).
  - Note: GPT-5 models with reasoning and verbosity capabilities take longer to respond; you can verify a model manually against the Completions API before selecting it.
- Temperature: Controls randomness (0.0–2.0). Parameter reference: https://platform.openai.com/docs/api-reference/chat/create
- Max tokens: Upper bound on response tokens. Parameter reference: https://platform.openai.com/docs/api-reference/chat/create
- API key:
  - Managed Key – Provision a new key via OpenRouter (managed by Integrate.io).
    - To view your provisioned key, go to Settings > Your Settings > OpenAI API Key.
  - Bring Your Own Key (BYOK) – Generate and manage your key at platform.openai.com/api-keys.
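As a rough illustration of how these settings map onto a Chat Completions request, the sketch below assembles one request body per record (parameter names follow the OpenAI API reference linked above; the `build_request` helper itself is hypothetical, not part of the component):

```python
import json

def build_request(model: str, prompt: str,
                  temperature: float = 0.7, max_tokens: int = 256) -> dict:
    """Assemble one Chat Completions request body (one per record/field)."""
    return {
        "model": model,                                     # e.g. gpt-4o, gpt-4o-mini
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,                         # 0.0-2.0; higher = more random
        "max_tokens": max_tokens,                           # cap on response tokens
    }

payload = build_request("gpt-4o-mini", "Classify the sentiment: great product!")
print(json.dumps(payload, indent=2))
```

Lowering `temperature` toward 0 makes outputs more deterministic, which is usually preferable for ETL transformations that feed downstream logic.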
2. Select Fields & Preview
- Select field name: Pick the upstream column in the field row.
- Set prompt: Write a concise instruction. Use #{field_name} to interpolate values. If you do not include any #{...} tokens, the component appends the selected field value automatically so GPT has context.
- Set alias (projected_name): Provide the output column name for this transformation; aliases must be unique.
- Preview: Click Preview to run on a small sample (limited to 5 rows by default). Review the outputs and refine prompts/types as needed.
- Note: All AI prompt fields are returned as the string datatype.
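The interpolation behavior described above can be sketched roughly as follows (a minimal illustration, not the component's actual implementation; `interpolate` is a hypothetical helper):

```python
import re

def interpolate(prompt: str, record: dict, selected_field: str) -> str:
    """Replace #{field_name} tokens with values from the record; if the
    prompt contains no tokens, append the selected field's value instead."""
    tokens = re.findall(r"#\{(\w+)\}", prompt)
    if not tokens:
        return f"{prompt} {record[selected_field]}"
    return re.sub(r"#\{(\w+)\}", lambda m: str(record[m.group(1)]), prompt)

record = {"city": "Austin", "country": "USA"}
print(interpolate("What timezone is #{city}, #{country} in?", record, "city"))
# -> What timezone is Austin, USA in?
print(interpolate("Classify the sentiment of this review:", record, "city"))
# -> Classify the sentiment of this review: Austin
```

Interpolating several fields into one prompt lets a single output column draw on context from multiple upstream columns.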
Test (Preview)
You can test your transformation directly in the Package Designer:
- Connect AI Transformation after a source/transform component.
- Configure fields, prompts, and alias.
- Click Preview; a small number of rows (5 by default) will be processed.
- Inspect the outputs and iterate on prompts/types as needed.
The preview table mirrors the schema produced by the connected input component.
Best Practices
- Keep prompts short and explicit; specify the desired output format.
- Use interpolation to add context from multiple fields; create new output fields as needed.
- Use preview to validate behavior and manage token usage.
- You can ask ChatGPT to write the prompt for you, e.g. "These are my inputs, this is what I want as my output; write the prompt I should use to achieve this."
Integrate.io Managed Key Costs
When using an Integrate.io Managed Key, AI Transformation usage is billed based on the underlying LLM provider’s consumption pricing.
Integrate.io charges these costs as a direct pass-through of the provider's API usage (for example, token-based pricing).
Usage is billed monthly in arrears, meaning charges for a given month’s AI Transformation activity will appear on the following month’s invoice.
You can also configure a cost limit to control or cap spend associated with managed key usage.
Estimating Your Monthly Cost
Because LLM providers charge based on usage, total cost depends on factors such as:
- The number of rows or records processed
- The length of input data sent to the model
- The size of the model’s generated output
- The specific model selected (pricing varies by provider and model)
To estimate expected cost, we recommend:
- Running a small sample transformation first to observe typical usage
- Reviewing the LLM provider’s published token pricing
- Using cost limits to stay within a defined monthly budget
Your Integrate.io usage will scale proportionally with the volume and complexity of the transformations you run.
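For a back-of-the-envelope estimate, the factors above combine as tokens processed multiplied by the provider's per-token price. The sketch below uses placeholder prices and token counts purely for illustration; always check your provider's current published pricing:

```python
def estimate_monthly_cost(rows: int, in_tokens_per_row: int, out_tokens_per_row: int,
                          price_in_per_m: float, price_out_per_m: float) -> float:
    """Pass-through cost = input tokens * input price + output tokens * output
    price, with prices quoted per 1M tokens (the usual provider convention)."""
    input_cost = rows * in_tokens_per_row * price_in_per_m / 1_000_000
    output_cost = rows * out_tokens_per_row * price_out_per_m / 1_000_000
    return input_cost + output_cost

# Hypothetical example: 100k rows, ~200 input / ~50 output tokens per row,
# at $0.15 / $0.60 per 1M tokens (placeholder prices, not a quote).
print(f"${estimate_monthly_cost(100_000, 200, 50, 0.15, 0.60):.2f}")  # -> $6.00
```

Running a small preview first gives you realistic per-row token counts to plug into an estimate like this.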
If you have any further questions or would like to discuss the AI Transformation component in more detail, contact our Support team using our in-app chat or email (support@integrate.io).