
Intro

This article explains how to use Domo’s Databricks AWS Writeback Connector to export data from a Domo DataSet to a Databricks table, configure connection credentials, and specify the destination table for the writeback.

Prerequisites

To configure this connector, you must have the following:
  • The username and password you use to log into your Databricks host
  • The host name for the database
  • The port number for the database
  • The database name or schema name
  • The HTTP path
  • Your AWS S3 access key and secret key
  • The name of your S3 bucket
  • The AWS region where your S3 bucket is located
Note: The owner of a writeback DataSet must also be an owner or co-owner of the input DataSet.

Databricks is a cloud-based collaborative data science, data engineering, and data analytics platform that combines the best of data warehouses and data lakes into a lakehouse architecture. With the Databricks AWS Writeback Connector, you can export data from a Domo DataSet to a specified Databricks table. For more information about the Databricks API, see the Databricks API documentation. This article covers the fields and menus specific to the Databricks AWS Writeback Connector user interface. For general information about adding DataSets, setting update schedules, and editing DataSet information, see Add a DataSet Using a Data Connector.

Configure the Connection

This section describes the options in the Credentials and Details panes on the Databricks AWS Writeback Connector page. The components of the Scheduling and Name & Describe Your DataSet panes are universal across most connector types and are discussed in Add a DataSet Using a Data Connector.

Enter Your Credentials

The Credentials pane contains fields for entering credentials to connect to the Databricks account where you want your data to be copied. The following table describes what is needed for each field.
  • Host: Enter the host name for the Databricks database. For example: db.company.com.
  • Port: Enter the port number for the Databricks database.
  • Database Name: Enter the name of the Databricks database.
  • Username: Enter your Databricks username.
  • Password: Enter your Databricks password.
  • HTTP Path: Enter the HTTP path.
  • AWS S3 Access Key: Enter your AWS access key.
  • AWS S3 Secret Key: Enter your AWS secret key.
  • AWS S3 Bucket: Enter the name of your S3 bucket.
  • AWS S3 Region: Select the region where your S3 bucket is located.
For more information about obtaining these credentials, see Prerequisites above. After you have entered valid credentials, you can use the same account any time you set up a new Databricks-Domo connection. You can manage connector accounts in the Accounts tab in the Data Center. For more information about this tab, see Manage User Accounts for Connectors.
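To illustrate how the Credentials fields relate to one another, the sketch below assembles them into a Databricks JDBC-style connection string. This is an illustration under assumptions only: the URL shape follows the common Databricks JDBC pattern, and the sample port and HTTP path are placeholders. The connector builds its connection internally from the fields you enter; you never construct this URL yourself.

```python
# Sketch: combine the Credentials-pane fields into a JDBC-style URL.
# The URL shape follows the common Databricks JDBC pattern and is an
# illustration only -- the sample values below are placeholders.

def build_jdbc_url(host: str, port: int, database: str, http_path: str) -> str:
    """Combine the connection fields into a single JDBC-style string."""
    for name, value in [("host", host), ("http_path", http_path)]:
        if not value:
            raise ValueError(f"Missing required field: {name}")
    return (
        f"jdbc:databricks://{host}:{port}/{database}"
        f";transportMode=http;ssl=1;httpPath={http_path}"
    )

# Placeholder values, mirroring the example host used in the table above.
url = build_jdbc_url("db.company.com", 443, "default", "sql/protocolv1/o/0/cluster-id")
print(url)
```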

Configure the Details Pane

In the Details pane, you specify the input DataSet and the destination Databricks table.
  • Input DataSet ID: Enter your Domo DataSet ID (GUID), located in the DataSet URL. For example: https://customer.domo.com/datasources/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/details/settings.
  • Enter Databricks Table Name: Enter the Databricks table name where you want to copy the data from your Domo DataSet.
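The DataSet ID is the GUID segment between /datasources/ and /details in the URL. As a quick way to confirm you copied the right piece (a sketch for checking your input, not part of the connector), a regular expression can pull it out:

```python
import re

# Extract the DataSet GUID from a Domo DataSet URL.
# The example URL below uses the same placeholder GUID as the documentation.
GUID_RE = re.compile(
    r"/datasources/([0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}"
    r"-[0-9a-fA-F]{4}-[0-9a-fA-F]{12})"
)

def extract_dataset_id(url: str) -> str:
    """Return the GUID segment of a DataSet URL, or raise if none is found."""
    match = GUID_RE.search(url)
    if match is None:
        raise ValueError("No DataSet GUID found in URL")
    return match.group(1)

url = "https://customer.domo.com/datasources/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/details/settings"
print(extract_dataset_id(url))  # aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee
```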

Configure Other Options

For information about the remaining sections of the connector interface, including how to configure scheduling, retry, and update options, see Add a DataSet Using a Data Connector.

FAQ

What credentials do I need to use this connector?
You need the username, password, host name, port number, and database name of your Databricks database. You also need the HTTP path, your AWS S3 access key and secret key, your S3 bucket name, and the AWS region where your S3 bucket is located.

Where do I find my Databricks connection details?
You can find the host name, database name, port number, and HTTP path by going to your cluster in Databricks and viewing the JDBC/ODBC tab in the Advanced section of the cluster details.
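The JDBC URL shown on the cluster's JDBC/ODBC tab contains several of the fields the connector asks for. As a minimal sketch, assuming the typical jdbc:databricks:// URL shape (the sample URL and cluster path below are placeholders), the host, port, and HTTP path can be split out like this:

```python
# Sketch: split a JDBC URL, as shown on a cluster's JDBC/ODBC tab, into
# the fields the connector asks for. Assumes the typical Databricks JDBC
# URL shape; the sample URL below is a made-up placeholder.

def parse_jdbc_url(url: str) -> dict:
    """Pull host, port, and HTTP path out of a Databricks JDBC-style URL."""
    body = url.split("jdbc:databricks://", 1)[1]
    hostport, _, params = body.partition("/")
    host, _, port = hostport.partition(":")
    http_path = ""
    for part in params.split(";")[1:]:  # skip the leading database segment
        key, _, value = part.partition("=")
        if key == "httpPath":
            http_path = value
    return {"host": host, "port": int(port), "http_path": http_path}

sample = ("jdbc:databricks://db.company.com:443/default"
          ";transportMode=http;ssl=1;httpPath=sql/protocolv1/o/0/cluster-id")
print(parse_jdbc_url(sample))
```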
Are there limits on the amount of data I can export?
Limits depend on your server configuration.

How often can the writeback run?
As often as needed.

Where do I find my Domo input DataSet ID?
Your Domo input DataSet ID is in the URL of the DataSet you are exporting data from. For example: https://customer.domo.com/datasources/aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee/details/settings.

What is the Databricks table name?
The Databricks table name is the name of the table where you want to copy the data from your Domo DataSet.