Connecting any Database to Database

Overview

Information is often stored in databases using different architectures, naming conventions, and formats. To align this data between systems, careful mapping and transformation are required.

eZintegrations provides multiple operations to transform, clean, and map data across databases. This enables users to build reliable and efficient Database-to-Database integrations.

This guide explains how to configure a database as both a source and a target to retrieve, process, and transfer data between systems.

When to Use

Use Database-to-Database integration when data must be synchronized or migrated between two database systems.

  • When transferring records between heterogeneous databases
  • When consolidating data into a central repository
  • When performing scheduled data replication
  • When transforming source data before storage
  • When handling historical and real-time data loads

How It Works

The integration follows a structured workflow to extract, transform, and load data.

  • Configure the source database connection
  • Execute SQL queries to retrieve data
  • Apply operations for transformation
  • Convert records into batch-friendly formats
  • Configure the target database
  • Insert transformed data into the destination system
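As a sketch, the workflow above can be expressed in plain Python, using in-memory SQLite databases as hypothetical stand-ins for the source and target systems (in practice eZintegrations manages the connections and operations for you):

```python
import sqlite3

# Hypothetical stand-ins; eZintegrations manages the real connections.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

# Seed the source with sample records
source.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, "acme"), (2, "globex")])

# Extract: run a SQL query against the source
rows = source.execute("SELECT id, customer FROM orders ORDER BY id").fetchall()

# Transform: a stand-in for the platform's data operations
rows = [(rid, name.upper()) for rid, name in rows]

# Load: create the target table and insert the transformed batch
target.execute("CREATE TABLE orders_copy (id INTEGER, customer TEXT)")
target.executemany("INSERT INTO orders_copy VALUES (?, ?)", rows)
target.commit()

print(target.execute("SELECT customer FROM orders_copy ORDER BY id").fetchall())
# [('ACME',), ('GLOBEX',)]
```

The table names and the uppercase transformation are illustrative only; the real transformations are configured through the Data Operations described later in this guide.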

How to Configure

Data Source Configuration

Step 1: Select Source Type

Select Database as the source type.

Step 2: Select Storage Name

Choose the required database from the dropdown list.

Example:

  • Oracle Database

Step 3: Configure Database Details

After selecting the storage name, configure the following parameters.

Version

Select the database version.

Example: Latest

Database Credentials

Provide the following details:

  • Host IP Address
  • Port Number
  • Schema Name
  • Username
  • Password

Chunk Size

Defines the number of records streamed in one batch.

  • Recommended: 1000
  • Maximum: can be raised to 10,000 or more for historical loads
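To illustrate why chunking matters, here is a sketch using SQLite's `fetchmany`, which streams a result set in fixed-size batches much like the Chunk Size setting (the table and row count are hypothetical):

```python
import sqlite3

# Hypothetical source table with 2,500 rows
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER)")
conn.executemany("INSERT INTO events VALUES (?)", [(i,) for i in range(2500)])

CHUNK_SIZE = 1000  # the recommended default
cursor = conn.execute("SELECT id FROM events")

# Stream the result set one chunk at a time instead of loading it all at once
batch_sizes = []
while True:
    chunk = cursor.fetchmany(CHUNK_SIZE)
    if not chunk:
        break
    batch_sizes.append(len(chunk))

print(batch_sizes)  # [1000, 1000, 500]
```

Smaller chunks keep memory use predictable; larger chunks reduce round trips, which is why higher values suit one-off historical loads.
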

Response Parameters

Response parameters define key-value mappings for different status codes returned by the integration.

They control how responses are interpreted and modified.

SQL Statement

Enter the SQL query in the provided field and use the Execute button to preview results.

Data Operations

Operations are used to transform, clean, and structure data before sending it to the target.

Singleline to Multiline

Converts single-line JSON into multiline format for easier processing.

Value to be passed:

['bizdata_dataset_response']['data']

This converts the array stored under the "data" key into multiline format.
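The operation can be pictured in plain Python; the sample payload below is hypothetical, and the real structure depends on your source:

```python
import json

# Hypothetical single-line response: the whole dataset is one JSON object
singleline = {"bizdata_dataset_response": {"data": [
    {"order": 1, "customer": "Acme"},
    {"order": 2, "customer": "Globex"},
]}}

# Multiline form: one JSON record per line, which downstream
# operations can process row by row
records = singleline["bizdata_dataset_response"]["data"]
multiline = "\n".join(json.dumps(r) for r in records)

print(multiline)
```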

Append

Append is used to create new key-value pairs and map them to target fields.

Example:

"new_key": "new_value"
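In plain Python terms, the operation is equivalent to setting a key on each record (the record shown is hypothetical):

```python
# Hypothetical record flowing through the pipeline
record = {"order": 1, "customer": "Acme"}

# Append creates a new key-value pair on the record,
# e.g. a constant or mapped field required by the target system
record["new_key"] = "new_value"

print(record)
# {'order': 1, 'customer': 'Acme', 'new_key': 'new_value'}
```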

Data Aggregation

Data Aggregation groups records into arrays for structured processing.

Parameters

  • Agg Data Key: Empty for multiline data, populated for singleline data
  • Groupby Key: Unique identifier for grouping
  • Array Key: Key for storing grouped data
  • Array Key Nested Columns: Comma-separated nested keys

Example:

['bizdata_dataset_response']['items']
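A minimal Python sketch of the grouping behavior, with hypothetical records and keys ("order" plays the role of the Groupby Key, "items" the Array Key, and "item" an Array Key Nested Column):

```python
from collections import defaultdict

# Hypothetical multiline records sharing a Groupby Key ("order")
records = [
    {"order": 1, "item": "pen"},
    {"order": 1, "item": "pad"},
    {"order": 2, "item": "ink"},
]

# Collect the nested columns under the Array Key ("items") per group
grouped = defaultdict(list)
for r in records:
    grouped[r["order"]].append({"item": r["item"]})

aggregated = [{"order": k, "items": v} for k, v in grouped.items()]
print(aggregated)
```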

Singleline to Tuple

This operation creates comma-separated tuples for bulk database insertion.

Parameters

  • Singleline Key: Holds singleline dataset
  • Table Headers: Defines tuple column order
  • Tuple Key: Stores generated tuple data

Example:

"Order","Customer Name","Product"
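The same transformation expressed in plain Python (the records are hypothetical; the headers match the example above and fix the column order):

```python
# Hypothetical records plus the Table Headers that fix the column order
records = [
    {"Order": 1, "Customer Name": "Acme", "Product": "Pen"},
    {"Order": 2, "Customer Name": "Globex", "Product": "Ink"},
]
headers = ["Order", "Customer Name", "Product"]

# Build one tuple per record in header order, ready for bulk insertion
tuples = [tuple(r[h] for h in headers) for r in records]

print(tuples)  # [(1, 'Acme', 'Pen'), (2, 'Globex', 'Ink')]
```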

Data Target Configuration

Step 1: Select Target Type

Select Database as the target type.

Step 2: Select Storage Name

Choose the destination database.

Step 3: Configure Target Credentials

Provide Host, Port, Schema, Username, and Password.

Step 4: Configure Tuple Key

Provide the tuple key created in the Singleline to Tuple operation.

Example:

['Orderdetail']

Step 5: Configure Batch Size

Batch Size defines how many records are sent per transaction.

  • Recommended: 1000

Step 6: Configure SQL Statement

SQL queries must be enclosed in triple double quotes ("""..."""), followed by the comma-separated column names that supply the placeholder values.

Example:

"""Insert into table_name (column1,column2) values (?,?)""","column1","column2"
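The equivalent bulk insert in plain Python, shown with SQLite as a hypothetical target (eZintegrations executes the statement for you; this only illustrates how the placeholders and tuples line up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_name (column1 TEXT, column2 TEXT)")

# Tuples as produced by the Singleline to Tuple operation (hypothetical data)
tuples = [("a1", "b1"), ("a2", "b2")]

# Placeholder-style insert matching the statement above;
# each "?" is filled from the tuple fields in order
conn.executemany("INSERT INTO table_name (column1, column2) VALUES (?, ?)",
                 tuples)
conn.commit()

row_count = conn.execute("SELECT COUNT(*) FROM table_name").fetchone()[0]
print(row_count)  # 2
```

Sending the whole batch in one `executemany` call mirrors the Batch Size behavior: one transaction per batch rather than one per record.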

Final Submission

After configuring source, operations, and target:

  • Review all parameters
  • Click Save and Submit
  • Activate the Integration Bridge

Once submitted, the connection between the two databases is established.

Troubleshooting

  • Connection Failed: Verify credentials and network access.
  • Query Error: Validate SQL syntax.
  • No Records Sent: Check tuple generation.
  • Batch Failure: Reduce batch size.
  • Invalid Mapping: Review Append and Aggregation steps.

Frequently Asked Questions

Can I connect different database types?

Yes. eZintegrations supports heterogeneous database integration.

Is batching mandatory?

Batching is recommended for performance but can be adjusted.

Can I transform data before loading?

Yes. Operations allow extensive data transformation.

Can this integration be scheduled?

Yes. Scheduling is available from the Summary page.

Is preview available before execution?

Yes. The Execute button allows query validation.

Notes

  • Always test SQL queries before deployment.
  • Secure database credentials.
  • Optimize queries for large datasets.
  • Monitor execution logs.
  • Adjust batch sizes for performance.

This guide enables reliable Database-to-Database integration using eZintegrations with configurable operations and batch processing.

Updated on February 19, 2026
