Use the Mouseflow source component to read website analytics, session recordings, funnels, forms, and feedback from your Mouseflow account into your Integrate.io ETL pipeline.
Connection
Select an existing Mouseflow connection or create a new one. A Mouseflow connection requires:
- Email. The email address tied to your Mouseflow account.
- API Key. The API key generated in your Mouseflow account settings.
- Website ID (optional). A default website to query when a source component does not specify its own.
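As an illustration of how the Email and API Key credentials pair together, the sketch below builds an HTTP Basic auth header in the shape Mouseflow-style REST APIs commonly accept (email as username, API key as password). This is a hedged assumption about the wire format, not a description of how the connector authenticates internally; the function name is hypothetical.

```python
import base64


def mouseflow_auth_header(email: str, api_key: str) -> dict:
    # Assumption: credentials are sent as HTTP Basic auth with the account
    # email as the username and the API key as the password.
    token = base64.b64encode(f"{email}:{api_key}".encode()).decode()
    return {"Authorization": f"Basic {token}"}
```

The connector builds and sends this for you; the sketch is only meant to clarify which credential plays which role.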
Source Properties
The source component is configured in Step 02 of the component editor.
Website
The Website picker selects which Mouseflow website the source reads from. The picker loads the list of websites available to the connection’s API credentials.
- The Website picker is shown when the connection has no default Website ID set. In that case, every source component must pick its own website.
- When the connection has a default Website ID, the picker is hidden and the connection-level value is used automatically. To read from a different website, create a separate connection without a default Website ID.
Source Table (Object)
Select the Mouseflow object to read data from:

| Object | Description | Website required |
|---|---|---|
| websites | All websites available to the connection’s API credentials. Useful for discovering the Website IDs to use elsewhere. | No |
| recordings | Session recordings captured for the selected website, including visitor metadata, duration, and page counts. | Yes |
| funnels | Conversion funnels configured for the selected website, including step definitions and conversion stats. | Yes |
| forms | Form analytics for the selected website, including drop-off and completion data per field. | Yes |
| feedback | Feedback campaign responses collected from visitors on the selected website. | Yes |
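The table above can be summed up as: websites is the only account-scoped object, and everything else is scoped to one Website ID. The sketch below captures that rule; the endpoint paths are hypothetical placeholders chosen for illustration, not Mouseflow's documented routes.

```python
# Hypothetical per-object paths; website-scoped objects interpolate the
# selected Website ID, mirroring the "Website required" column above.
ENDPOINTS = {
    "websites": "/websites",
    "recordings": "/websites/{website_id}/recordings",
    "funnels": "/websites/{website_id}/funnels",
    "forms": "/websites/{website_id}/forms",
    "feedback": "/websites/{website_id}/feedback",
}


def endpoint_for(obj: str, website_id: str = None) -> str:
    path = ENDPOINTS[obj]
    if "{website_id}" in path:
        if website_id is None:
            # Matches the component's behavior: these objects cannot be
            # read without a website selected.
            raise ValueError(f"object '{obj}' requires a Website ID")
        path = path.format(website_id=website_id)
    return path
```

Reading the websites object first is a convenient way to discover the Website IDs to plug into the other objects.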
Load Type
Select how records are loaded on each pipeline run:
- Full Load. Fetches all records for the selected object on every run.
- Incremental Load. Fetches only records created or updated after a reference date. Use this for scheduled pipelines to avoid re-processing historical data.
Incremental Load Settings
When Incremental Load is selected, the following options appear:
- Sync date field. The date field used to filter records (for example, created_at or updated_at).
- Load records. The filter condition:
  - newer than: Fetch records after the reference date.
  - older than: Fetch records before the reference date.
- Fixed Date. Pick a specific calendar date.
- Variable. Use a system or custom variable. The recommended value for scheduled pipelines is $package_last_successful_job_submission_timestamp, which advances the start date after each successful run.
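The incremental settings above can be sketched as a simple filter: keep only records whose sync date field falls on the chosen side of the reference date. This is a hedged model of the component's semantics, not its implementation; the record shape and function name are assumptions, and on a scheduled run the reference date would come from a variable such as $package_last_successful_job_submission_timestamp rather than the fixed date shown here.

```python
from datetime import datetime


def incremental_filter(records, sync_field, reference_date, load="newer than"):
    # Keep records whose sync date field (e.g. created_at) is newer or
    # older than the reference date, per the "Load records" setting.
    def ts(rec):
        return datetime.fromisoformat(rec[sync_field])

    if load == "newer than":
        return [r for r in records if ts(r) > reference_date]
    return [r for r in records if ts(r) < reference_date]


records = [
    {"created_at": "2024-01-01T00:00:00"},
    {"created_at": "2024-06-01T00:00:00"},
]
# Fixed reference date; a Variable would supply this on scheduled runs.
new_only = incremental_filter(records, "created_at", datetime(2024, 3, 1))
```

With the fixed date above, new_only contains only the June record; switching to "older than" would return only the January record.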