Connecting any Database to Database

Information is frequently stored in databases under varying architectures, naming standards, and formats. Overcoming these discrepancies requires careful mapping and transformation to align data items across the source and target databases.

eZintegrations, an iPaaS integration tool, provides a range of operations to transform and map data across systems, making the integration process smoother and more efficient for users.

This user guide gives step-by-step instructions for configuring a database as both a source and a target. By following it, users can establish a connection, retrieve data from their chosen source database, and send it to the target.

The steps to configure the Database are:

Step 1: Select Source as Database.

Step 2: Select your Storage Name (database) from the list, e.g. Oracle Database, as per your requirement.

Step 3: Once the storage name is selected, add the required credentials and the SQL query. Users can test their query in the SQL Statement box below by clicking the Execute button.

Select Storage Name

In this section, users select the database required for the integration from the drop-down list.

E.g.: Oracle Database

Version

Once the storage (database) is selected, users need to select the version associated with their database.

E.g.: Latest

Database Credentials

To establish their database connection, users need to provide the Host IP Address, Port Number, Schema Name, Username, and Password of their database.

Chunk Size

Chunk Size is the number of records streamed in one batch. The recommended size for streaming records from Bizintel360 is 1,000, which can be raised to 10,000 or more records for one-time historical data loads.
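Conceptually, chunk-based streaming behaves like a database cursor fetching rows in fixed-size batches. The following minimal Python sketch illustrates the idea; the DB-API connection, query, and function names are assumptions for illustration, not the platform's internal code.

    # Sketch: stream query results in fixed-size chunks using a
    # DB-API cursor; chunk_size plays the role of Chunk Size.
    def stream_in_chunks(connection, sql, chunk_size=1000):
        cursor = connection.cursor()
        cursor.execute(sql)
        while True:
            rows = cursor.fetchmany(chunk_size)  # one chunk per fetch
            if not rows:
                break
            yield rows  # each chunk is forwarded downstream

    # Usage (hypothetical): iterate a large table 1,000 rows at a time.
    # for chunk in stream_in_chunks(conn, "SELECT * FROM orders"):
    #     send_to_target(chunk)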

Response Parameters

Response parameters are set by the user for every status code that the integration returns. They are represented as a key-value map: the key indicates which parameter to locate and modify, and the value specifies the new data for that parameter.
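As a purely illustrative example, such a map might pair each status code with the parameter to modify and its new value; the key names below are hypothetical, not a fixed schema.

    # Hypothetical response-parameter map: for each status code, the
    # key names the parameter to modify, the value gives its new data.
    response_parameters = {
        "200": {"result_key": "data"},
        "401": {"error_message": "Re-authenticate and retry"},
    }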

SQL Statement

Users provide their SQL query in the given area and can check the response in the response area by clicking the Execute button.

Step 4: Click Next and go to Operations. Operations are used to transform, wrangle, and clean the data before sending it to the target, and can be adjusted based on the requirements of the data.

The following operations are used to send records to the database:

  1. Singleline to Multiline

The Singleline to Multiline operation is the first step in converting records. It transforms single-line JSON data into a multiline format, making the data more organized and simpler to work with. This restructured format facilitates easier application of specific operations for individual entries, streamlining subsequent data tasks.

Value to be passed in the Singleline to Multiline operation:

['bizdata_dataset_response']['data']

Here, the value stored under ['data'] is converted from singleline to multiline.
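The following minimal Python sketch shows the effect of the operation on a hypothetical payload; it is an illustration, not the platform's internal implementation.

    # Sketch: a single-line JSON payload holding a list of records
    # is split so each record can be handled individually.
    single_line = {
        "bizdata_dataset_response": {
            "data": [
                {"Order": 101, "Product": "Widget"},
                {"Order": 102, "Product": "Gadget"},
            ]
        }
    }

    # Equivalent of pointing the operation at ['bizdata_dataset_response']['data']:
    multiline = single_line["bizdata_dataset_response"]["data"]
    for record in multiline:
        print(record)  # each entry is now its own line of data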

  2. Append

The Append operation is used to create a new, comma-separated key-value pair. It helps map the created key to the target key.

In database-to-database integration, Append is used to create a new key-value pair under which all subsequent keys are stored.

Example:

"new_key": "new_value"
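A minimal Python sketch of the same idea, with hypothetical record names (not the platform's implementation):

    # Sketch: append a new key-value pair to each record so later
    # operations can map or collect values under it.
    records = [
        {"Order": 101, "Product": "Widget"},
        {"Order": 102, "Product": "Gadget"},
    ]
    for record in records:
        record["new_key"] = "new_value"  # the appended pair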

 

  3. Data Aggregation

The Data Aggregation operation combines or groups data into an array. For database integration, it is used to combine the list of columns coming from the source into an array.

This operation consists of multiple parameters:

Parameter: Agg Data Key

Agg Data Key is left empty for multiline data and holds the data key in the case of single-line data.

Example:

['bizdata_dataset_response']['items']

Parameter: Groupby Key

Groupby Key provides the key name, or unique identifier inside the dataset, that the user intends to group by.

Example:

"Order"

Parameter: Array Key

With the Array Key, the user can assign any key name to serve as the key that holds the grouped keys.

Example:

"Order Details"

Parameter: Array Key Nested Columns

Provide, as comma-separated keys, the key names that should be available inside the Array Key.
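Putting the four parameters together, the following minimal Python sketch shows the grouping behavior on hypothetical data; it illustrates the concept rather than the platform's internal code.

    # Sketch: group records by the Groupby Key and collect the
    # Array Key Nested Columns under the Array Key.
    records = [
        {"Order": 101, "Customer Name": "Acme", "Product": "Widget"},
        {"Order": 101, "Customer Name": "Acme", "Product": "Gadget"},
        {"Order": 102, "Customer Name": "Globex", "Product": "Widget"},
    ]
    groupby_key = "Order"            # Groupby Key
    array_key = "Order Details"      # Array Key
    nested_columns = ["Product"]     # Array Key Nested Columns

    grouped = {}
    for record in records:
        key = record[groupby_key]
        if key not in grouped:
            # keep the non-nested fields once per group
            grouped[key] = {k: v for k, v in record.items() if k not in nested_columns}
            grouped[key][array_key] = []
        grouped[key][array_key].append({k: record[k] for k in nested_columns})

    aggregated = list(grouped.values())
    # aggregated[0] -> {"Order": 101, "Customer Name": "Acme",
    #                   "Order Details": [{"Product": "Widget"}, {"Product": "Gadget"}]}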

 

  4. Singleline to Tuple

To send bulk records to the database, a tuple is created; using this tuple, records are sent to the database in batches. The Singleline to Tuple operation combines all the singleline values into a comma-separated tuple.

This operation consists of 3 parameters:

Parameter: Singleline Key

The Singleline Key is used to read the dataset from a single line.

Example:

"Order"

Here, "Order" holds the singleline data.

Parameter: Table Headers

Table Headers specifies the column order of the converted tuple data.

Example:

"Order","Customer Name","Product"

Parameter: Tuple Key

This key will hold the Tuple data.

Example: 

 

Orderdetail
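As an illustration of the three parameters together (a sketch on hypothetical data, not the platform's internal code):

    # Sketch: convert singleline records into tuples ordered by the
    # Table Headers and collect them under the Tuple Key "Orderdetail".
    records = [
        {"Order": 101, "Customer Name": "Acme", "Product": "Widget"},
        {"Order": 102, "Customer Name": "Globex", "Product": "Gadget"},
    ]
    table_headers = ["Order", "Customer Name", "Product"]  # Table Headers
    payload = {
        "Orderdetail": [tuple(r[h] for h in table_headers) for r in records]
    }
    # payload["Orderdetail"] -> [(101, "Acme", "Widget"), (102, "Globex", "Gadget")]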

Step 5: Click Next and go to Data Target. Select Database as the target and add the database details along with the SQL query.

Select Storage Name

In this section, users select the database required for the integration from the drop-down list.

E.g.: Oracle Database

Version

Once the storage (database) is selected, users need to select the version associated with their database.

E.g.: Latest

Database Credentials

To establish their database connection, users need to provide the Host IP Address, Port Number, Schema Name, Username, and Password of their database.

Order Set of Values (Tuple Key)

Provide the key name that has been used for creating the tuple in the Singleline to Tuple operation. The Tuple key should be written in square brackets, enclosed in single quotes.

E.g.: 

 

['Orderdetail']

Batch Size

Batch Size defines how many of the streamed records from the source are sent to the target system in one batch, which maximizes transfer efficiency for real-time processing. A suggested value of 1,000 enables smoother data transfer across systems.

SQL Statement

Users provide their SQL query in the given area and can check the response in the response area by clicking the Execute button.

The SQL statement should be passed with three double quotes:

E.g.: 

 

"""Insert into table_name (column1,column2) values (?,?)""","column1","column2"
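Conceptually this corresponds to a parameterized batch insert, as in the minimal Python DB-API sketch below; the driver, table, and column names are assumptions for illustration.

    # Sketch: insert tuples in batches using qmark-style placeholders,
    # with batch_size playing the role of Batch Size.
    def insert_batches(connection, tuples, batch_size=1000):
        sql = "INSERT INTO table_name (column1, column2) VALUES (?, ?)"
        cursor = connection.cursor()
        for start in range(0, len(tuples), batch_size):
            cursor.executemany(sql, tuples[start:start + batch_size])
        connection.commit()

    # Usage (hypothetical): insert_batches(conn, [(101, "Widget"), (102, "Gadget")])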

Add all the details and save the bridge. Once the details are submitted, the connection between the two databases is established, completing the database-to-database integration.

Updated on December 29, 2025
