Available on plans: Startup, Business, Enterprise, On-Premise, Add-on
Overview
The Etlworks Azure Data Lake Storage connector provides the fastest and easiest way to connect to Gen1 and Gen2 Azure Data Lake Storage. The connector works just like any other database connector.
Etlworks partnered with CData to provide access to Azure Data Lake Storage using the industry-standard JDBC protocol. Read about the CData Azure Data Lake Storage JDBC connector.
When to use the Azure Data Lake Storage connector
Use this connector to create Flows that download files from and upload files to Azure Data Lake Storage.
Prerequisites
Enable the Azure Data Lake Storage connector for your Etlworks account. Contact support@etlworks.com to enable the connector.
Create a Connection
You can create a Connection in two steps:
Step 1. In the Connections window, click +, and type in azure data lake.
Step 2. Enter the Connection parameters:
- Account: enter the storage account name.
- Storage Type: select the storage type (Gen1 or Gen2).
When connecting to Gen1 storage (default):
- Azure Tenant: enter the Azure tenant ID.
- Permissions: select the permissions that will be used to access the Azure Data Lake Storage account.
- OAuth Token: click Sign in with Microsoft.
Read about Authentication when connecting to Gen1 storage.
When connecting to Gen2 storage:
- Access Key: enter the access key for the account.
- File System: enter the file system name that will be used for this account, for example, the name of an Azure Blob Container.
Read about Authentication when connecting to Gen2 storage.
Use Other Parameters to specify the Connection string options. Read about available Connection string options.
Work with Azure Data Lake Storage
Azure Data Lake Data Model
- The connector models Azure Data Lake Storage entities like documents, folders, and groups as relational views, allowing you to write SQL to query Azure Data Lake Storage data.
- Stored procedures allow you to execute operations on Azure Data Lake Storage.
- Live connectivity to these objects means any changes to your Azure Data Lake Storage account are immediately reflected when using the driver.
Stored procedures
Stored procedures are function-like interfaces to Azure Data Lake Storage. They allow you to execute operations on Azure Data Lake Storage, such as creating folders and downloading and uploading files.
To call a stored procedure from the SQL Flow or Before/After SQL, use EXEC sp_name params=value.
Example:
EXECUTE my_proc @second = 2, @first = 1, @third = 3;
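The EXEC convention above (a procedure name followed by comma-separated param='value' pairs) can be sketched as a small formatting helper. This is a minimal Python sketch for illustration only; format_exec is a hypothetical name, not part of Etlworks or the CData driver:

```python
# Hypothetical helper (not an Etlworks or CData API) that formats an
# EXEC statement following the EXEC sp_name param='value' convention
# used by the stored-procedure examples in this article.

def format_exec(proc_name, **params):
    """Build a statement like: EXEC DownloadFile Path='/a/b'."""
    parts = []
    for name, value in params.items():
        # Double single quotes inside values, as in standard SQL literals.
        escaped = str(value).replace("'", "''")
        parts.append(f"{name}='{escaped}'")
    return f"EXEC {proc_name} " + ", ".join(parts)

print(format_exec("MakeDirectory", Path="/staging/2024"))
# EXEC MakeDirectory Path='/staging/2024'
```

The generated string is what you would enter on the Parameters tab of a SQL Flow, as shown in the sections below.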
SQL Compliance
Read about SQL Compliance.
Create a folder in the data lake
To create a folder in the data lake:
Step 1. Create an Azure Data Lake Storage Connection.
Step 2. Create a new SQL Flow.
Step 3. Select the Connection created in step 1.
Step 4. Select the Parameters tab and enter the following SQL:
EXEC MakeDirectory Path='/path'
Where /path is the location of the new folder in the data lake storage.
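The MakeDirectory call in step 4 can be sketched as a helper that normalizes the folder path to the absolute /folder form used in the example. A minimal sketch; make_directory_sql is a hypothetical name, not an Etlworks API:

```python
# Hypothetical helper that builds the MakeDirectory statement from
# step 4. The leading slash is added if missing, since the examples
# in this article use absolute data lake paths.

def make_directory_sql(path):
    path = "/" + path.strip("/")  # normalize to /folder/subfolder
    return f"EXEC MakeDirectory Path='{path}'"

print(make_directory_sql("staging/2024/"))
# EXEC MakeDirectory Path='/staging/2024'
```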
Download a file from the data lake
To download a file from the data lake:
Step 1. Create an Azure Data Lake Storage Connection.
Step 2. Create a new SQL Flow.
Step 3. Select the Connection created in step 1.
Step 4. Select the Parameters tab and enter the following SQL:
EXEC DownloadFile WriteToFile='localpath/filename', Path='/path/filename'
Where localpath is the location of the file in the local storage, for example, {app.data}, filename is the name of the file to download, and /path is an optional location of the file in the data lake storage.
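The DownloadFile call in step 4 can be sketched as a helper that combines the local directory (such as the {app.data} token) and the data lake path with the file name. The token is passed through literally here; how Etlworks actually resolves {app.data} is not shown. download_file_sql is a hypothetical name:

```python
# Hypothetical helper that builds the DownloadFile statement from
# step 4. local_dir may be a literal token such as {app.data}; this
# sketch does not resolve it, it only composes the statement.

def download_file_sql(local_dir, filename, lake_path):
    write_to = f"{local_dir}/{filename}"
    return (f"EXEC DownloadFile WriteToFile='{write_to}', "
            f"Path='{lake_path}/{filename}'")

print(download_file_sql("{app.data}", "orders.csv", "/staging"))
# EXEC DownloadFile WriteToFile='{app.data}/orders.csv', Path='/staging/orders.csv'
```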
Upload a file to the data lake
To upload a file to the data lake:
Step 1. Create an Azure Data Lake Storage Connection.
Step 2. Create a new SQL Flow.
Step 3. Select the Connection created in step 1.
Step 4. Select the Parameters tab and enter the following SQL:
EXEC UploadFile FilePath='localpath/filename', Path='/path/filename', Overwrite='true'
Where localpath is the location of the file in the local storage, for example, {app.data}, filename is the name of the file to upload, and /path is an optional location of the file in the data lake storage.
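The UploadFile call in step 4 can be sketched the same way, with the Overwrite flag rendered as the quoted 'true'/'false' string used in the example above. upload_file_sql is a hypothetical name, not an Etlworks API:

```python
# Hypothetical helper that builds the UploadFile statement from
# step 4. Overwrite is emitted as the string 'true' or 'false',
# matching the quoted parameter style in the example above.

def upload_file_sql(local_path, lake_path, overwrite=True):
    flag = "true" if overwrite else "false"
    return (f"EXEC UploadFile FilePath='{local_path}', "
            f"Path='{lake_path}', Overwrite='{flag}'")

print(upload_file_sql("{app.data}/orders.csv", "/staging/orders.csv"))
# EXEC UploadFile FilePath='{app.data}/orders.csv', Path='/staging/orders.csv', Overwrite='true'
```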