Connecting QuickBooks to Datalake

User Guide: Configuring Source as QuickBooks and Target as Bizintel360 Datalake

Overview

This guide provides a step-by-step walkthrough for configuring QuickBooks as the data source and Bizintel360 Datalake as the target. It explains how to establish a secure connection, retrieve payment-related data, and transmit the processed information to the Datalake.

The integration uses the access token API to authenticate with QuickBooks and retrieve data from the Payment business object. Depending on user preferences, alternative business objects and OAuth 2.0 authentication can also be configured.

When to Use

This configuration is suitable for organizations that need to ingest QuickBooks payment data into Bizintel360 Datalake for analytics, reporting, or archival purposes.

  • When importing QuickBooks payment data into a centralized data platform
  • When using API-based authentication with access tokens
  • When structured JSON output is required for storage
  • When automated data ingestion is needed

How It Works

The integration follows a structured pipeline to retrieve and transform data from QuickBooks before loading it into the Datalake.

  • Generate an access token using the OAuth token API
  • Extract the access token from the response
  • Call the QuickBooks Payment API using the token
  • Convert XML responses to JSON
  • Filter and format records
  • Load processed data into the Datalake

Each operation prepares the data for the next stage, ensuring reliable and structured ingestion.

How to Configure

Data Source Configuration

Step 1: Select Source Type

On the Data Source page, select the source type as API.

Step 2: Create Product

In the Product Name field, select Create New.

Step 3: Provide Product Details

Enter the Product Name and Business Object according to user requirements.

Note: Selecting “Create New” requires providing all necessary configuration details for the API source.

Step 4: Configure Access Token API

To generate the access token, configure the token API with the required headers and body.

  • Method: POST
  • API Endpoint URL: https://oauth.platform.intuit.com/oauth2/v1/tokens/bearer
  • Headers: {"Content-Type":"application/x-www-form-urlencoded"}

Body parameters:

client_id={{client_id}}&
client_secret={{client_secret}}&
scopes={{scopes}}&
grant_type=refresh_token&
refresh_token={{refresh_token}}

Select the Body type as Text, then click Test to verify access token generation.
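As a sanity check outside the platform, the same token call can be assembled with Python's standard library. The endpoint URL, headers, and body fields come from Step 4 above; the credential values and scope string below are illustrative placeholders, not real settings.

```python
import urllib.parse
import urllib.request

# Token endpoint from Step 4 of this guide:
TOKEN_URL = "https://oauth.platform.intuit.com/oauth2/v1/tokens/bearer"

def build_token_request(client_id, client_secret, scopes, refresh_token):
    """Assemble the POST request exactly as configured in Step 4."""
    body = urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "scopes": scopes,
        "grant_type": "refresh_token",
        "refresh_token": refresh_token,
    }).encode("ascii")
    return urllib.request.Request(
        TOKEN_URL,
        data=body,  # the presence of a body makes this a POST
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )

# Hypothetical credentials for illustration only:
req = build_token_request("my_client_id", "my_client_secret",
                          "com.intuit.quickbooks.accounting", "my_refresh_token")
print(req.get_method())  # POST
```

Sending the request (for example with `urllib.request.urlopen(req)`) would return the JSON token response that the platform's Test button verifies.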

Data Operations

Data Extractor

The Data Extractor operation retrieves specific fields from the API response. In this configuration, it is used to extract the access token while excluding other response keys.

Reference Link: Data Extractor
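Conceptually, the Data Extractor behaves like picking a whitelist of keys out of the token response and discarding the rest. A minimal Python equivalent (the sample response values are made up):

```python
def extract_keys(record, keys):
    """Keep only the requested keys, dropping every other field."""
    return {k: record[k] for k in keys if k in record}

# Illustrative shape of a typical OAuth token reply:
token_response = {
    "access_token": "eyJ...abc",
    "refresh_token": "AB11...",
    "token_type": "bearer",
    "expires_in": 3600,
}
print(extract_keys(token_response, ["access_token"]))
# {'access_token': 'eyJ...abc'}
```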

API Operation

The API operation retrieves Payment business object data using the generated access token.

System Tab

In the System tab:

  • Select Create New as Product Name
  • Enter Product Name and Business Object
  • Configure for Payment data retrieval

Test Tab

In the Test tab, configure the following:

  • Method: GET
  • API Endpoint URL: Refer to QuickBooks API Documentation
  • Params: {"minorversion":"{{minorversion}}"}
  • Headers: {"Authorization":"Bearer {%access_token%}"}

The minorversion parameter enables access to specific API versions.

Note: sprintf is used in headers to pass the access token.

Reference Link: API
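Outside the platform, the equivalent GET request looks like this, with the token substituted into the Authorization header the way sprintf does above. The base URL here is a deliberate placeholder (take the real one from the QuickBooks API documentation), and the token and minorversion values are illustrative.

```python
import urllib.parse
import urllib.request

def build_payment_request(base_url, access_token, minorversion):
    """GET request mirroring the Test-tab settings: query param plus Bearer header."""
    # Query-string parameter from the Params field:
    url = base_url + "?" + urllib.parse.urlencode({"minorversion": minorversion})
    # sprintf-style substitution of the extracted token into the header:
    auth = "Bearer {}".format(access_token)
    return urllib.request.Request(url, headers={"Authorization": auth})  # GET by default

# Placeholder endpoint -- substitute the documented QuickBooks URL:
req = build_payment_request("https://example.invalid/v3/company/REALM_ID/query",
                            "eyJ...abc", "65")
print(req.get_full_url())
# https://example.invalid/v3/company/REALM_ID/query?minorversion=65
```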

XML to JSON

The XML to JSON operation converts XML responses into JSON format. This conversion enables structured insertion into the Datalake.

The API response is stored in the bizdata_dataset_response dataset.

Reference Link: XML to JSON
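The conversion can be approximated with Python's standard library: parse the XML and rebuild it as a dictionary, which then serializes cleanly to JSON. The sample Payment XML below is a simplified, made-up illustration of the response shape, and the converter deliberately skips list handling for repeated sibling tags to stay short.

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively turn an XML element into a plain dict; leaves become text.
    Note: repeated sibling tags would need list handling, omitted for brevity."""
    children = list(elem)
    if not children:
        return elem.text
    return {child.tag: element_to_dict(child) for child in children}

sample_xml = """
<IntuitResponse>
  <QueryResponse>
    <Payment><Id>101</Id><TotalAmt>25.00</TotalAmt></Payment>
  </QueryResponse>
</IntuitResponse>
"""
root = ET.fromstring(sample_xml)
print(json.dumps({root.tag: element_to_dict(root)}, indent=2))
```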

Eliminate

The Eliminate operation removes unnecessary keys from the dataset before ingestion into the Datalake.

Reference Link: Eliminate
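A minimal Python analogue of the Eliminate operation: drop a set of unwanted keys from every record before loading. The field names below are illustrative, not the actual QuickBooks schema.

```python
def eliminate(records, unwanted):
    """Remove the listed keys from every record in the dataset."""
    return [{k: v for k, v in rec.items() if k not in unwanted}
            for rec in records]

payments = [
    {"Id": "101", "TotalAmt": "25.00", "SyncToken": "0", "domain": "QBO"},
    {"Id": "102", "TotalAmt": "40.00", "SyncToken": "1", "domain": "QBO"},
]
print(eliminate(payments, {"SyncToken", "domain"}))
# [{'Id': '101', 'TotalAmt': '25.00'}, {'Id': '102', 'TotalAmt': '40.00'}]
```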

Single-Line to Multi-Line

This operation converts single-line JSON responses into multiline format, separating records for easier processing.

It improves readability and enables efficient application of subsequent operations.

Reference Link: Single-Line to Multi-Line
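In effect, this turns one JSON array into one record per line (newline-delimited JSON). A small Python sketch of the idea, using made-up payment records:

```python
import json

def to_multiline(single_line_json):
    """Split a JSON array string into one serialized record per line."""
    records = json.loads(single_line_json)
    return "\n".join(json.dumps(rec) for rec in records)

payload = '[{"Id": "101", "TotalAmt": "25.00"}, {"Id": "102", "TotalAmt": "40.00"}]'
print(to_multiline(payload))
# {"Id": "101", "TotalAmt": "25.00"}
# {"Id": "102", "TotalAmt": "40.00"}
```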

Data Target Configuration

Step 7: Configure Datalake Target

Configure the Data Target as Bizintel360 Datalake by following the instructions provided in the Datalake ingestion guide.

Link: Datalake Ingestion

Troubleshooting

  • Access Token Not Generated: Verify client credentials and refresh token.
  • Authorization Errors: Confirm access token is passed correctly in headers.
  • Missing Data: Check business object and extractor configuration.
  • Invalid JSON Output: Validate XML to JSON conversion settings.
  • Datalake Load Failure: Review target configuration and permissions.

Frequently Asked Questions

What authentication method is used in this configuration?

This configuration uses the OAuth token API to generate an access token, which is used for API authentication.

Can I use a different business object instead of Payment?

Yes. Users may configure alternative business objects based on their data requirements.

Why is XML to JSON conversion required?

The QuickBooks API returns XML responses, which must be converted to JSON for Datalake ingestion.

Is OAuth 2.0 supported?

Yes. OAuth 2.0 can be used to retrieve responses directly from the source if preferred.

Where is the API response stored?

The API response is stored in the bizdata_dataset_response dataset.

Notes

  • Ensure all credentials are securely stored.
  • Always test API connections before deployment.
  • Validate extracted fields before loading into production.
  • Review logs regularly for integration issues.
  • Update minorversion when API versions change.

This guide enables reliable integration between QuickBooks and Bizintel360 Datalake using API-based authentication and structured data processing.

Updated on February 18, 2026

© Copyright 2026 Bizdata Inc. | All Rights Reserved | Terms of Use | Privacy Policy