Data Orchestration

Database

Description : 

The Database operation allows users to perform essential SQL actions such as Insert, Update, and Delete, or to execute PL/SQL procedures. This functionality lets users interact with the database directly, enabling efficient data manipulation and retrieval.

Number of Parameters : 10

Parameter : Select Storage Name

In the Select Storage Name parameter, users have the flexibility to choose their preferred data storage solution from a diverse array of databases. The available options include:

  1. Oracle Database
  2. Microsoft SQL Server
  3. MySQL/MariaDB
  4. Snowflake
  5. Amazon Redshift
  6. Amazon Aurora (MySQL/MariaDB)
  7. Amazon Aurora (PostgreSQL)
  8. PostgreSQL
  9. Teradata DB
  10. IBM DB2
  11. SQLite

Parameter : Select Version

In the Select Version section, users are presented with a dropdown menu to designate the version they wish to employ for the previously selected storage name. This step allows for precise customization, ensuring compatibility and alignment with specific version requirements.

Parameter : Host IP

In the Host IP parameter, users are required to provide the Host IP Address for establishing a connection to the selected Storage Name. This information is essential for facilitating seamless communication and access to the specified storage solution.

Below is an example where we pass the Host IP address ‘10.0.0.100’ into the ‘Host IP’ parameter:

Host IP :

10.0.0.100

Parameter : Port Number

In the Port Number parameter, users are instructed to provide the specific port number for establishing the connection. This parameter is vital for ensuring the connection is directed to the correct port on the designated host.

By default, the Port Number is “3306” (the MySQL/MariaDB default).

Port Number :

3306

Parameter : Schema Name

In this field, please provide the Schema Name where the target table is located. This information is necessary to locate and access the desired database schema for further operations.

Below is an example where we pass the Schema Name ‘customer_info’ into the ‘Schema Name’ parameter:

Schema Name :

customer_info

Parameter : Username

In the ‘Username’ parameter, users are required to input the designated username for establishing the connection. This username serves as the authentication credential, allowing access to the specified database or system.

Below is an example where we pass the Username ‘john_peter’ into the ‘Username’ parameter :

Username :

john_peter

Parameter : Password

In the ‘Password’ parameter, users are required to provide the password associated with the specified username for establishing the connection. This password serves as a secure credential to authenticate and authorize access to the designated database or system.

Below is an example, where we pass the Password ‘SecurePassword@123’ into the ‘Password’ parameter :

Password :

SecurePassword@123

Parameter : Tuple Key

In the ‘Tuple Key’ parameter, if you wish to insert data in batches, specify the key name under which the batched data is organized. This key serves as a reference for the system to identify and process grouped data when performing batch operations.

Below is an example where we pass [‘table_data’] into the ‘Tuple Key’ parameter; here [‘table_data’] holds the batched data:

Tuple Key :

['table_data']
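To illustrate (with a hypothetical record shape, not taken from the platform), a record whose batched rows are grouped under the ‘table_data’ key might look like this; the Tuple Key tells the operation where to find the rows:

```python
# Hypothetical record carrying batched rows under the 'table_data' key
# referenced by Tuple Key ['table_data'].
record = {
    "table_data": [
        {"column1": "Alice", "column2": "alice@example.com"},
        {"column1": "Bob",   "column2": "bob@example.com"},
    ]
}

# The operation reads the batch from that key before inserting it.
rows = [(r["column1"], r["column2"]) for r in record["table_data"]]
print(rows)
```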

Parameter : Batch Size

In the ‘Batch Size’ parameter, users should input the desired batch size for the insertion of data. This value determines the number of records processed in each batch, optimizing the efficiency of the data insertion operation.

By default, the Batch Size is set to “1000”.

Batch Size :

1000

Parameter : SQL Statement

In the ‘SQL Statement’ parameter, users are required to pass the query they wish to execute, enclosing the query inside double quotes. This parameter allows users to specify the SQL statement for the intended database operation.

Below is an example where we pass an SQL statement that inserts data into a table named ‘customer_info’ with columns ‘column1’ and ‘column2’.

SQL Statement :

"""Insert into customer_info (column1,column2) values (?,?)""","column1","column2"

Note: The SQL statement must be enclosed in triple double quotes ("""...""").
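As an illustration of how these parameters fit together, here is a minimal sketch using SQLite (not the platform's internal engine): a parameterized statement plus a batch size drives repeated executemany calls over the batched rows.

```python
import sqlite3

# Minimal sketch: a parameterized insert like
# "Insert into customer_info (column1, column2) values (?,?)"
# executed in batches of batch_size, as the Batch Size parameter describes.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_info (column1 TEXT, column2 TEXT)")

rows = [("Alice", "a@example.com"), ("Bob", "b@example.com"), ("Cara", "c@example.com")]
sql = "INSERT INTO customer_info (column1, column2) VALUES (?, ?)"
batch_size = 2

# Insert the rows batch by batch.
for i in range(0, len(rows), batch_size):
    conn.executemany(sql, rows[i:i + batch_size])
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM customer_info").fetchone()[0])
```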

 

Email

Description :

The Email operation serves as a versatile tool for sending emails, making it suitable for a wide range of purposes, including sending standard messages, important alert notifications, or any other type of email communication required.

Number of Parameters : 14

Tab : Compose

Parameter : To

In the ‘To’ parameter, users are required to input the recipient’s email address.

Below is an example where we pass the recipient’s email address as  ‘user@email.com’ into the ‘To’ parameter :

To :

user@email.com

Additionally, string interpolation can be employed to dynamically pass values to the ‘To’ parameter :

To :

{%receiver%}

Parameter : Cc

In the ‘Cc’ parameter, users are required to input the email addresses of additional recipients who will receive a copy (Cc) of the email.

Below is an example where we pass the additional recipient email address as ‘additional@email.com’ into the ‘Cc’ parameter :

Cc :

additional@email.com

Additionally, string interpolation can be employed to dynamically pass values to the ‘Cc’ parameter :

Cc :

{%additionalreceiver%}

Parameter : Bcc

In the ‘Bcc’ parameter, users are required to input the email addresses of additional recipients who will receive a blind carbon copy (Bcc) of the email. This provides a discreet way to send the email without revealing the list of Bcc recipients to the other recipients.

Below is an example where we pass the additional recipient email address who will receive a blind carbon copy of the mail  as ‘additional2@email.com’ into the ‘Bcc’ parameter :

Bcc :

additional2@email.com

Additionally, string interpolation can be employed to dynamically pass values to the ‘Bcc’ parameter :

Bcc :

{%additionalreceiver2%}

Note : The three recipient parameters (To, Cc, Bcc) are all arrays; inside the array, each element can be either a string-interpolated variable or a plain email address.
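For example, using the placeholder values from the examples above, the three recipient parameters might be supplied as arrays mixing plain addresses and interpolated variables:

```python
# Array form for the recipient parameters; elements may be plain addresses
# or string-interpolated variables resolved at run time.
to = ["user@email.com"]
cc = ["{%additionalreceiver%}"]
bcc = ["additional2@email.com", "{%additionalreceiver2%}"]

print(to + cc + bcc)
```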

Parameter : Subject

In the ‘Subject’ parameter, users are expected to input a concise and informative subject line that succinctly conveys the purpose or content of the email.

Below is an example where we pass the subject of the mail, ‘Records Status’, into the ‘Subject’ parameter :

Subject :

Records Status

Parameter : Source Key

In the ‘Source Key’ parameter, the Email operation offers the capability to dynamically introduce new keys, assigning values that can be either static or dynamic. For each record, a key and its corresponding value are added, with the Source Key parameter identifying where they go. In the case of multiline records, the source key should be passed as an empty string, ensuring adaptability to varying data structures.

Below is an example where we pass source key  as ‘Items’ into the ‘Source Key’ parameter :

Source Key :

Items

Parameter : Body

In the ‘Body’ parameter, users are provided with a text area that offers an HTML editor. This lets users craft rich, customized email content using HTML elements, ensuring a dynamic and visually appealing presentation of the message.

Below is an example where we pass the following text into the ‘Body’ parameter :

Body :

This is a sample body text

Tab : Attachment

Parameter : Is Attachment ?

In the ‘Is Attachment?’ parameter, a toggle option is provided for users to specify whether an attachment is included with the email. Users can select ‘Yes’ to indicate the presence of an attachment, or ‘No’ if no attachment is included with the email.

Parameter : File Name

In the ‘File Name’ parameter, users can input the name of the attachment. This parameter is enabled only when the user selects ‘Yes’ in the ‘Is Attachment?’ parameter, signifying that an attachment is included with the email.

Below is an example where we pass the file name  as ‘order.tsv’ into the ‘File Name’ parameter :

File Name :

order.tsv

Tab : Settings

Parameter : From

In the ‘From’ parameter, users are required to input the sender’s email address.

Below is an example where we pass the sender’s email address  as ‘sender@email.com’ into the ‘From’ parameter :

From :

sender@email.com

Parameter : Password

In the ‘Password’ parameter, users need to input the password associated with the sender’s email address.

Parameter : Mail Server

In the ‘Mail Server’ parameter, users are prompted to specify the address or hostname of the email server that will be used to send the email.

Below is an example where we pass the value inside the mail server  as ‘mail.email.com’ into the ‘Mail Server’ parameter :

Mail Server :

mail.email.com

Parameter : Port Number

In the ‘Port Number’ parameter, users are required to input the specific port number that corresponds to the chosen mail server.

Below is an example where we pass port number as 123 into the ‘Port Number’ parameter :

Port Number :

123

Note : The ‘Test’ button in the UI serves as a valuable tool for evaluating the response of the configured Email Settings.
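Behind the scenes, settings like these map onto a standard SMTP send. Below is a hedged sketch using Python's smtplib (the addresses, server, and port are the placeholder values from the examples above, not real endpoints):

```python
import smtplib
from email.message import EmailMessage

# Build the message from the Compose-tab values.
msg = EmailMessage()
msg["From"] = "sender@email.com"
msg["To"] = "user@email.com"
msg["Subject"] = "Records Status"
msg.set_content("This is a sample body text")

def send(message: EmailMessage, server: str = "mail.email.com",
         port: int = 123, password: str = "changeme") -> None:
    """Send via the configured mail server (requires a reachable SMTP host)."""
    with smtplib.SMTP(server, port) as smtp:
        smtp.starttls()
        smtp.login(message["From"], password)
        smtp.send_message(message)
```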

                                

DL Ingestion

Description :

The Datalake Ingestion operation is a versatile tool for efficiently and securely transferring data into the Bizintel360 data lake. It lets users ingest data from various sources into a centralized data lake repository.

Number of Parameters : 5

Parameter : Data Lake Version

The ‘Data Lake Version’ parameter is presented as a dropdown list, offering users a choice between two distinct Data Lake versions within eZintegrations Pipeline:

  • Data Lake from Bizdata after 2023
  • Data Lake from Bizdata before 2023

Note : In both versions the parameters are the same; only the operation name differs.

This parameter allows users to specify the target Data Lake version, ensuring that data is ingested into the appropriate repository based on their selection.

Parameter : Index / Table Name

In the ‘Index / Table Name’ parameter, users are prompted to provide the specific name of the index or table where they intend to send their data in the Data Lake. This crucial parameter ensures that data is accurately directed to the intended destination.

Below is an example where we pass Index/Table Name as ‘table’ into the ‘Index/Table Name’ parameter :

Index/Table Name :

table

Parameter : Action Type

In the ‘Action Type’ parameter, users can select the action type from a dropdown list, determining how data is processed:

  1. Upsert: Choosing ‘Upsert’ means that the system will perform an ‘Upsert’ action, which updates existing data if found or creates it if it’s missing. (Primary key is required)
  2. Update: Selecting ‘Update’ signifies an ‘Update’ action, used to modify existing data. (Primary key is required)
  3. Delete: Opting for ‘Delete’ leads to a ‘Delete’ action, allowing data removal. (Primary key is required)
  4. Create: When ‘Create’ is selected, the system executes a ‘Create’ action. (Primary key is not required)
  5. Insert: Choosing ‘Insert’ means the system performs an ‘Insert’ action. (Primary key is not required)
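The difference between the key-based actions can be sketched with a toy in-memory “table” keyed by the primary key (illustrative only; the Data Lake's actual storage is not shown here):

```python
# Toy upsert-by-primary-key: update the record if the key exists, create it otherwise.
table = {"1": {"Id": "1", "name": "Alice"}}

def upsert(table: dict, record: dict, primary_key: str = "Id") -> None:
    key = record[primary_key]
    table[key] = {**table.get(key, {}), **record}

upsert(table, {"Id": "1", "name": "Alicia"})  # key exists -> update
upsert(table, {"Id": "2", "name": "Bob"})     # key missing -> create
print(sorted(table))
```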

Parameter : Primary Key (only applicable to Upsert, Update, Delete)

In the ‘Primary Key’ parameter, users are required to specify the primary key that they wish to assign to the provided Index or Table Name. This primary key is a critical element for uniquely identifying and organizing the data within the designated destination, ensuring efficient data management and retrieval.

Below is an example where we pass the primary key ‘Id’ into the ‘Primary Key’ parameter :

Primary Key :

Id

API

Description

This operation helps transmit data from one software product to another.
It is divided into ‘System’ and ‘Test’ sections.

System

Step 1:
Select the Target name from the catalog. Users can select a target by scrolling through the list of target names or by entering the target name in the designated area.
If the target name is not present in the catalog list, users can click on Create New and create a new target name.

Step 2:
Select the Business object. Users can select the business object from the catalog available for each target name.
If a user is creating a new target that is not present in the catalog, they can enter a relevant business object.

Test

The Test area is used to verify the configuration.
If the target and business object are selected from the catalog, all the defined fields are generated in the designated area automatically. In this case, users are only required to enter the authentication IDs and test.
If the target is not selected from the API catalog, users are required to enter the details in the specified areas and then test their API.

Response parameters:

This is an API request parameter used to get the response in the format you require. The response type can be sent as Text, XML, or JSON; the request is processed by the backend Python API, which returns the response based on the requested response type.

The following options are available for response parameters:
  • Text
  • XML
  • JSON

[Image: UI representation of Response Params in the eZintegrations platform.] This feature is available inside the IB (Integration Bridge) Postman view for API as a source, operation, and target.

Note: Text in response params is selected by default. Users can change it as per their requirements.
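A client consuming the three response types might dispatch on the selected format like this (a hedged sketch; the function and parameter names are illustrative, not the platform's API):

```python
import json
import xml.etree.ElementTree as ET

def parse_response(body: str, response_type: str = "Text"):
    """Dispatch on the requested response parameter; Text is the default."""
    if response_type == "JSON":
        return json.loads(body)
    if response_type == "XML":
        return ET.fromstring(body)
    return body  # Text: return the raw string

print(parse_response('{"status": "ok"}', "JSON"))
```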

 

Pre-Request Script of Python

A pre-request script is a piece of code that runs before the execution of a request. Pre-request scripts give users a chance to modify the request after variables have been resolved but before the request is made.

Example- Generating signatures for authentication


Pre-processing tasks, including setting parameters, variable values, body data, and headers, can be performed using the pre-request script. Pre-request scripts can also be used to debug code, for example by logging output to the console. Additionally, values such as the date, time, or timestamp can be computed in the pre-request script.

In Python, the code that runs prior to sending an HTTP request (for example, with a library like requests) is known as the pre-request script. Before submitting the actual request, it can be used to modify request fields or headers.

Below are the various examples of Pre-Request Scripts

Amazon SP API

Method – POST


import time
import datetime, hashlib, hmac
import json
access_key='{{access_key}}'                 # Provide Values     
secret_key='{{secret_key}}'                 # Provide Values
host = '{{host}}'                           # Provide Values
endpoint = '{{hostname}}'                   # Provide Values
canonical_uri = '{{canonical_uri}}'         # Provide Values
body = {{body}}                             # Provide Values
##########################################################################################################################################################
request_parameters =json.dumps(body)
t = datetime.datetime.utcnow()
amzdate = t.strftime('%Y%m%dT%H%M%SZ')
datestamp = t.strftime('%Y%m%d')
method = 'POST'
service = 'execute-api'
region = 'us-east-1'
canonical_querystring = ''
canonical_headers = 'host:' + host + '\n' + 'x-amz-date:' + amzdate + '\n'
signed_headers = 'host;x-amz-date'
payload_hash = hashlib.sha256((request_parameters).encode('utf-8')).hexdigest()
canonical_request = method + '\n' + canonical_uri + '\n' + canonical_querystring + '\n' + canonical_headers + '\n' + signed_headers + '\n' + payload_hash
algorithm = 'AWS4-HMAC-SHA256'
credential_scope = datestamp + '/' + region + '/'+ service + '/' + 'aws4_request'
string_to_sign = algorithm + '\n' + amzdate + '\n' + credential_scope + '\n' + hashlib.sha256(canonical_request.encode('utf-8')).hexdigest()
kDate = hmac.new(('AWS4' + secret_key).encode('utf-8'), datestamp.encode('utf-8'), hashlib.sha256).digest()
kRegion = hmac.new(kDate, region.encode('utf-8'), hashlib.sha256).digest()
kService = hmac.new(kRegion, service.encode('utf-8'), hashlib.sha256).digest()
kSigning = hmac.new(kService, 'aws4_request'.encode('utf-8'), hashlib.sha256).digest()
signing_key = kSigning
signature = hmac.new(signing_key, (string_to_sign).encode('utf-8'), hashlib.sha256).hexdigest()
authorization_header = algorithm + ' ' + 'Credential=' + access_key + '/' + credential_scope + ', ' +  'SignedHeaders=' + signed_headers + ', ' + 'Signature=' + signature

Method – POST (with Body)


import time
import datetime, hashlib, hmac
import json
access_key='XXXXXXXXXXXXXXXXXX'
secret_key='XXXXXXXXXXXXXXXXXX'
method = 'POST'
service = 'execute-api'
host = 'sellingpartnerapi-na.amazon.com'
region = 'us-east-1'
endpoint = 'https://sellingpartnerapi-na.amazon.com'
body = {'reportType': 'GET_FLAT_FILE_ALL_ORDERS_DATA_BY_ORDER_DATE_GENERAL','dataStartTime': '{%yesterday%}T00:00:01','dataEndTime': '{%yesterday%}T23:59:59','marketplaceIds': ['XXXXXXXXXX']}
request_parameters =json.dumps(body)
t = datetime.datetime.utcnow()
amzdate = t.strftime('%Y%m%dT%H%M%SZ')
datestamp = t.strftime('%Y%m%d')
canonical_uri = '/reports/2021-06-30/reports'
canonical_querystring = ''
canonical_headers = 'host:' + host + '\n' + 'x-amz-date:' + amzdate + '\n'
signed_headers = 'host;x-amz-date'
payload_hash = hashlib.sha256((request_parameters).encode('utf-8')).hexdigest()
canonical_request = method + '\n' + canonical_uri + '\n' + canonical_querystring + '\n' + canonical_headers + '\n' + signed_headers + '\n' + payload_hash
algorithm = 'AWS4-HMAC-SHA256'
credential_scope = datestamp + '/' + region + '/'+ service + '/' + 'aws4_request'
string_to_sign = algorithm + '\n' + amzdate + '\n' + credential_scope + '\n' + hashlib.sha256(canonical_request.encode('utf-8')).hexdigest()
kDate = hmac.new(('AWS4' + secret_key).encode('utf-8'), datestamp.encode('utf-8'), hashlib.sha256).digest()
kRegion = hmac.new(kDate, region.encode('utf-8'), hashlib.sha256).digest()
kService = hmac.new(kRegion, service.encode('utf-8'), hashlib.sha256).digest()
kSigning = hmac.new(kService, 'aws4_request'.encode('utf-8'), hashlib.sha256).digest()
signing_key = kSigning
signature = hmac.new(signing_key, (string_to_sign).encode('utf-8'), hashlib.sha256).hexdigest()
authorization_header = algorithm + ' ' + 'Credential=' + access_key + '/' + credential_scope + ', ' +  'SignedHeaders=' + signed_headers + ', ' + 'Signature=' + signature

Method – GET


import datetime, hashlib, hmac
host = '{{host}}'                      # Provide Values
endpoint = '{{hostname}}'              # Provide Values
access_key = '{{access_key}}'          # Provide Values
secret_key = '{{secret_key}}'          # Provide Values
canonical_uri = '{{canonical_uri}}'    # Provide Values
##########################################################################################################################################################
t = datetime.datetime.utcnow()
amzdate = t.strftime('%Y%m%dT%H%M%SZ')
datestamp = t.strftime('%Y%m%d')
method = 'GET'
service = 'execute-api'
region = 'us-east-1'
canonical_querystring = ''
canonical_headers = 'host:' + host + '\n' + 'x-amz-date:' + amzdate + '\n'
signed_headers = 'host;x-amz-date'
payload_hash = hashlib.sha256(('').encode('utf-8')).hexdigest()
canonical_request = method + '\n' + canonical_uri + '\n' + canonical_querystring + '\n' + canonical_headers + '\n' + signed_headers + '\n' + payload_hash
algorithm = 'AWS4-HMAC-SHA256'
credential_scope = datestamp + '/' + region + '/' + service + '/' + 'aws4_request'
string_to_sign = algorithm + '\n' + amzdate + '\n' + credential_scope + '\n' + hashlib.sha256(canonical_request.encode('utf-8')).hexdigest()
kDate = hmac.new(('AWS4' + secret_key).encode('utf-8'), datestamp.encode('utf-8'), hashlib.sha256).digest()
kRegion = hmac.new(kDate, region.encode('utf-8'), hashlib.sha256).digest()
kService = hmac.new(kRegion, service.encode('utf-8'), hashlib.sha256).digest()
kSigning = hmac.new(kService, 'aws4_request'.encode('utf-8'), hashlib.sha256).digest()
signing_key = kSigning
signature = hmac.new(signing_key, (string_to_sign).encode('utf-8'), hashlib.sha256).hexdigest()
authorization_header = algorithm + ' ' + 'Credential=' + access_key + '/' + credential_scope + ', ' +  'SignedHeaders=' + signed_headers + ', ' + 'Signature=' + signature

Oracle Netsuite

Source


import datetime
import random
import string
import hashlib
import base64
import hmac
import urllib.parse
oauth_consumer_id = '{{consumer_id}}'               # Provide Values
oauth_consumer_key = '{{consumer_key}}'             # Provide Values
oauth_consumer_secret ='{{consumer_secret}}'        # Provide Values
oauth_token ='{{token}}'                            # Provide Values
oauth_token_secret ='{{token_secret}}'              # Provide Values
###########################################################################################################################################################
request_method = 'POST'
url = 'https://{{hostname}}.suitetalk.api.netsuite.com/services/rest/query/v1/suiteql'
oauth_signature_method = 'HMAC-SHA256'
oauth_timestamp = str(int(datetime.datetime.now().timestamp()))
oauth_nonce = ''.join(random.choices(string.ascii_letters + string.digits, k = 11))
oauth_version = '1.0'
normalized_request_method = request_method.replace(' ', '')
normalized_string_url = urllib.parse.quote(url, safe = '')
normalized_params = {'oauth_consumer_key': oauth_consumer_key,'oauth_token': oauth_token,'oauth_signature_method': oauth_signature_method,'oauth_timestamp': oauth_timestamp,'oauth_nonce': oauth_nonce,'oauth_version': oauth_version,'limit':max_num,'offset':min_num}   # min_num / max_num must be defined by the surrounding pagination context
sorted_params = dict(sorted(normalized_params.items()))
normalized_string_params = [k + '=' + v for k, v in sorted_params.items()]
normalized_string_params = '&'.join([str(elem) for elem in normalized_string_params])
normalized_string_params = normalized_string_params.replace(' ', '')
normalized_string_params = urllib.parse.quote(normalized_string_params, safe = '')
base_string = request_method + '&' + normalized_string_url + '&' + normalized_string_params
base_string = str.encode(base_string)
signature_key = oauth_consumer_secret + '&' + oauth_token_secret
signature_key = str.encode(signature_key)
oauth_signature = hmac.new(signature_key, base_string, hashlib.sha256)
oauth_signature = base64.b64encode(oauth_signature.digest())
oauth_signature = oauth_signature.decode('UTF-8')
oauth_signature = urllib.parse.quote(oauth_signature, safe = '')
signature = (f'OAuth realm="{oauth_consumer_id}",oauth_consumer_key="{oauth_consumer_key}",'
             f'oauth_token="{oauth_token}",oauth_signature_method="{oauth_signature_method}",'
             f'oauth_timestamp="{oauth_timestamp}",oauth_nonce="{oauth_nonce}",'
             f'oauth_version="{oauth_version}",oauth_signature="{oauth_signature}"')

Target / Operations


import datetime
import random
import string
import hashlib
import base64
import hmac
import urllib.parse
oauth_consumer_id = '{{consumer_id}}'               # Provide Values
oauth_consumer_key = '{{consumer_key}}'             # Provide Values
oauth_consumer_secret ='{{consumer_secret}}'        # Provide Values
oauth_token ='{{token}}'                            # Provide Values
oauth_token_secret ='{{token_secret}}'              # Provide Values
###########################################################################################################################################################
request_method = 'POST'
url = 'https://{{account_id}}.suitetalk.api.netsuite.com/services/rest/query/v1/suiteql'
oauth_signature_method = 'HMAC-SHA256'
oauth_timestamp = str(int(datetime.datetime.now().timestamp()))
oauth_nonce = ''.join(random.choices(string.ascii_letters + string.digits, k = 11))
oauth_version = '1.0'
normalized_request_method = request_method.replace(' ', '')
normalized_string_url = urllib.parse.quote(url, safe = '')
normalized_params = {'oauth_consumer_key': oauth_consumer_key, 'oauth_token': oauth_token, 'oauth_signature_method': oauth_signature_method, 'oauth_timestamp': oauth_timestamp, 'oauth_nonce': oauth_nonce, 'oauth_version': oauth_version}
sorted_params = dict(sorted(normalized_params.items()))
normalized_string_params = '&'.join(k + '=' + str(v) for k, v in sorted_params.items())
normalized_string_params = normalized_string_params.replace(' ', '')
normalized_string_params = urllib.parse.quote(normalized_string_params, safe='')
base_string = str.encode(request_method + '&' + normalized_string_url + '&' + normalized_string_params)
signature_key = str.encode(oauth_consumer_secret + '&' + oauth_token_secret)
oauth_signature = hmac.new(signature_key, base_string, hashlib.sha256)
oauth_signature = base64.b64encode(oauth_signature.digest()).decode('UTF-8')
oauth_signature = urllib.parse.quote(oauth_signature, safe='')
signature = (f'OAuth realm="{oauth_consumer_id}",oauth_consumer_key="{oauth_consumer_key}",'
             f'oauth_token="{oauth_token}",oauth_signature_method="{oauth_signature_method}",'
             f'oauth_timestamp="{oauth_timestamp}",oauth_nonce="{oauth_nonce}",'
             f'oauth_version="{oauth_version}",oauth_signature="{oauth_signature}"')

 

Note: In the Oracle NetSuite Target/Operations pre-request script, you can use either the "POST" or "PATCH" method, depending on your requirements. Specify the chosen method in the request_method variable within the pre-request script.
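As a quick illustration of how the computed `signature` value is used, the sketch below assembles a SuiteQL request with it as the Authorization header. The helper name `build_suiteql_request` and the example query are illustrative, not part of eZintegrations; `Prefer: transient` is the header NetSuite expects on SuiteQL requests.


```python
# Illustrative sketch (not eZintegrations code): the `signature` string
# produced by the pre-request script is the complete Authorization header
# value for the SuiteQL request.

def build_suiteql_request(signature: str, query: str) -> dict:
    """Assemble the pieces of a SuiteQL POST request; no network call is made."""
    return {
        "method": "POST",
        "url": "https://{{account_id}}.suitetalk.api.netsuite.com/services/rest/query/v1/suiteql",
        "headers": {
            "Authorization": signature,           # 'OAuth realm="...",oauth_consumer_key="...",...'
            "Content-Type": "application/json",
            "Prefer": "transient",                # required by NetSuite for SuiteQL
        },
        "json": {"q": query},                     # the SuiteQL statement goes in the "q" field
    }

req = build_suiteql_request('OAuth realm="123456"', "SELECT id FROM customer")
```

The resulting dict maps directly onto an HTTP client call, e.g. `requests.post(req["url"], headers=req["headers"], json=req["json"])`.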

Target / Operations for SOAP body


import time
import hashlib
import hmac
import base64
import secrets

# Set your environment variables (replace with your actual credentials)
account = '{{account}}'
consumerKey = '{{consumerKey}}'
consumerSecret = '{{consumerSecret}}'
tokenId = '{{tokenId}}'
tokenSecret = '{{tokenSecret}}'

# Generate timestamp and nonce
timestamp = str(int(time.time()))
nonce = secrets.token_hex(11)

# Create the base string: account&consumerKey&tokenId&nonce&timestamp
baseString = f"{account}&{consumerKey}&{tokenId}&{nonce}&{timestamp}"

# Create the signing key: consumerSecret&tokenSecret
key = f"{consumerSecret}&{tokenSecret}"

# Create the HMAC-SHA256 signature, Base64-encoded
signature = base64.b64encode(hmac.new(key.encode('utf-8'), baseString.encode('utf-8'), hashlib.sha256).digest()).decode('utf-8')
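To show where this signature ends up, the sketch below builds a simplified NetSuite TokenPassport SOAP header fragment. The element layout follows the common token-based-authentication pattern, but namespaces are omitted and the exact shape should be verified against your WSDL version; the helper name is illustrative.


```python
# Illustrative sketch (verify element names against your NetSuite WSDL):
# the HMAC-SHA256 signature computed above is carried in a tokenPassport
# SOAP header alongside the nonce and timestamp it was computed from.

def token_passport_xml(account, consumer_key, token_id, nonce, timestamp, signature):
    """Return a simplified tokenPassport fragment (namespaces omitted)."""
    return (
        "<tokenPassport>"
        f"<account>{account}</account>"
        f"<consumerKey>{consumer_key}</consumerKey>"
        f"<token>{token_id}</token>"
        f"<nonce>{nonce}</nonce>"
        f"<timestamp>{timestamp}</timestamp>"
        f'<signature algorithm="HMAC-SHA256">{signature}</signature>'
        "</tokenPassport>"
    )
```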

Azure Cosmos DB

Source & Target / Operations


from wsgiref.handlers import format_date_time
from datetime import datetime
from time import mktime
import base64
from urllib.parse import quote
import hmac
from hashlib import sha256
endpoint_url='{{hostname}}'                     # Provide Values
master_key = '{{master_key}}'                   # Provide Values
resource_type = '{{resource_type}}'             # Provide Values
resource_id = '{{resource_id}}'                 # Provide Values
################################################################################################################################################################################
key = base64.b64decode(master_key)
endpoint_method = 'post'
now = datetime.now()
stamp = mktime(now.timetuple())
date = format_date_time(stamp)
text = '{endpoint_method}\n{resource_type}\n{resource_id}\n{date}\n{other}\n'.format(endpoint_method=(endpoint_method.lower() or ''), resource_type=(resource_type.lower() or ''), resource_id=(resource_id or ''), date=date.lower(), other='')
body = text.encode('utf-8')
digest = hmac.new(key, body, sha256).digest()
signature = base64.encodebytes(digest).decode('utf-8')
key_type = 'master'
version = '1.0'
uri = f'type={key_type}&ver={version}&sig={signature[:-1]}'
authorization = quote(uri)
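The `authorization` value and the `date` string computed above travel as HTTP headers on the Cosmos DB REST call. The sketch below assembles them; the header names follow the Cosmos DB REST API, while the API version string and the query content type are assumptions to check against your account.


```python
# Illustrative sketch: headers for a Cosmos DB SQL query over REST.
# `authorization` is the URL-encoded 'type=master&ver=1.0&sig=...' value
# and `date` is the same RFC 1123 date string that was signed above.

def cosmos_query_headers(authorization: str, date: str) -> dict:
    """Build headers for a Cosmos DB REST query; no network call is made here."""
    return {
        "Authorization": authorization,
        "x-ms-date": date,                          # must match the signed date exactly
        "x-ms-version": "2018-12-31",               # assumed API version; verify for your account
        "Content-Type": "application/query+json",   # used for SQL query bodies
        "x-ms-documentdb-isquery": "True",
    }
```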

Amazon S3

Method : PUT

Note: This pre_request_script is for loading data to Amazon S3 Bucket.


import hashlib
import hmac
import datetime
access_key = '{{access_key}}'         # Provide Values
secret_key = '{{secret_key}}'         # Provide Values
bucket = '{{bucket_name}}'            # Provide Values
region = '{{region}}'                 # Provide Values
payload = '''{{payload}}'''           # Provide Values
host = '{{host}}'                     # Provide Values
canonical_uri = '/{{canonical_uri}}'  # Provide Values
################################################################################################################################################################################
method = 'PUT'
amzdate = datetime.datetime.utcnow().strftime('%Y%m%dT%H%M%SZ')
datestamp = datetime.datetime.utcnow().strftime('%Y%m%d')
canonical_querystring = ''
payload_hash = hashlib.sha256(payload.encode()).hexdigest()
canonical_headers = 'host:' + host + '\n' + 'x-amz-content-sha256:' + payload_hash + '\n' + 'x-amz-date:' + amzdate + '\n'
signed_headers = 'host;x-amz-content-sha256;x-amz-date'
canonical_request = method + '\n' + canonical_uri + '\n' + canonical_querystring + '\n' + canonical_headers + '\n' + signed_headers + '\n' + payload_hash
algorithm = 'AWS4-HMAC-SHA256'
credential_scope = datestamp + '/' + region + '/s3/aws4_request'
string_to_sign = algorithm + '\n' + amzdate + '\n' + credential_scope + '\n' + hashlib.sha256(canonical_request.encode()).hexdigest()
date_key = hmac.new(('AWS4' + secret_key).encode(), datestamp.encode(), hashlib.sha256).digest()
region_key = hmac.new(date_key, region.encode(), hashlib.sha256).digest()
service_key = hmac.new(region_key, 's3'.encode(), hashlib.sha256).digest()
signing_key = hmac.new(service_key, 'aws4_request'.encode(), hashlib.sha256).digest()
signature = hmac.new(signing_key, string_to_sign.encode(), hashlib.sha256).hexdigest()
authorization_header = algorithm + ' Credential=' + access_key + '/' + credential_scope + ', SignedHeaders=' + signed_headers + ', Signature=' + signature
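To make the header layout explicit, the sketch below assembles the PUT request from the values computed above. The helper name is illustrative; the header names are the standard Signature Version 4 ones, and the actual upload call is left as a comment.


```python
# Illustrative sketch: assemble the SigV4 PUT request pieces (no network call).

def s3_put_request(authorization_header, amzdate, payload_hash, host, canonical_uri):
    """Bundle the URL and headers for the signed S3 PUT."""
    return {
        "url": f"https://{host}{canonical_uri}",
        "headers": {
            "Authorization": authorization_header,
            "x-amz-date": amzdate,                  # same timestamp used in the signature
            "x-amz-content-sha256": payload_hash,   # must match the hash that was signed
        },
    }

# e.g. req = s3_put_request(authorization_header, amzdate, payload_hash, host, canonical_uri)
#      requests.put(req["url"], headers=req["headers"], data=payload)
```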

Method : GET

Note: This pre_request_script is for retrieving data from Amazon S3 Bucket.


import hashlib
import hmac
import datetime
access_key = '{{access_key}}'         # Provide Values
secret_key = '{{secret_key}}'         # Provide Values
bucket = '{{bucket_name}}'            # Provide Values
region = '{{region}}'                 # Provide Values
host = '{{host}}'                     # Provide Values
canonical_uri = '/{{canonical_uri}}'  # Provide Values
##################################################################################################################################################################################
method = 'GET'
service = 's3'
t = datetime.datetime.utcnow()
amzdate = t.strftime('%Y%m%dT%H%M%SZ')
datestamp = t.strftime('%Y%m%d')
canonical_querystring = ''
canonical_headers = 'host:' + host + '\n' + 'x-amz-date:' + amzdate + '\n'
signed_headers = 'host;x-amz-date'
payload_hash = hashlib.sha256(('').encode('utf-8')).hexdigest()
canonical_request = method + '\n' + canonical_uri + '\n' + canonical_querystring + '\n' + canonical_headers + '\n' + signed_headers + '\n' + payload_hash
algorithm = 'AWS4-HMAC-SHA256'
credential_scope = datestamp + '/' + region + '/'+ service + '/' + 'aws4_request'
string_to_sign = algorithm + '\n' +  amzdate + '\n' +  credential_scope + '\n' +  hashlib.sha256(canonical_request.encode('utf-8')).hexdigest()
date_key = hmac.new(("AWS4" + secret_key).encode(), datestamp.encode(), hashlib.sha256).digest()
region_key = hmac.new(date_key, region.encode(), hashlib.sha256).digest()
service_key = hmac.new(region_key, "s3".encode(), hashlib.sha256).digest()
signing_key = hmac.new(service_key, "aws4_request".encode(), hashlib.sha256).digest()
signature = hmac.new(signing_key, (string_to_sign).encode('utf-8'), hashlib.sha256).hexdigest()
authorization_header = algorithm + ' ' + 'Credential=' + access_key + '/' + credential_scope + ', ' +  'SignedHeaders=' + signed_headers + ', ' + 'Signature=' + signature

Datalake Search

Description

DataLake Search searches for records in the Bizintel360 Data Lake under a given index name.

Number of parameters: 5

DataLake Version
There are two types of Datalake Version:
Neptune Datalake
Pluto Datalake

Index/Table Name
Enter the index or table name from which data should be retrieved from the Data Lake.

Pagination Wait Time:
By default this is set to `2m`, where m stands for minutes. Pagination is a standard API capability: the Bizintel360 Data Lake source retrieves data page by page, and this parameter sets how long to wait for the next page. Retrieving data in small chunks keeps waiting times low.

Timeout:
By default this is `2m`, where m stands for minutes. In general `2m` is enough to get a response from the Bizintel360 Data Lake; increase it when responses are slow, which can happen when the Data Lake cluster is small. Reach out to the Bizdata support team to increase the cluster size of your Bizintel360 Data Lake.

 

Updated on December 29, 2025

© Copyright 2026 Bizdata Inc. | All Rights Reserved | Terms of Use | Privacy Policy