Create a destination for Microsoft Azure + Databricks

Last updated: Apr 13, 2026

To populate your Microsoft Azure repository with healthcare data from an EHR system via Redox (and then to feed that data into Databricks for analytics), you must configure a specific Redox cloud destination. A Redox destination represents where a message is delivered (like the address in the “To” line of an email). Learn more about connecting Redox to your cloud repository.

You’ll need to perform some steps in your cloud product(s) and some in Redox. You can perform Redox setup in our dashboard or with the Redox Platform API.

Prerequisites

  • Establish a connection with your preferred EHR system. Learn how to request a connection.
  • Complete your Azure (with Data Lake) and Databricks configuration before creating your Redox destination. Save any downloads with secret values, since you’ll need to enter some of these details into the Redox dashboard.
  • Grant access to Redox from Azure (and any other cloud product) to authorize Redox to push data to your cloud repository.

Configure in Microsoft Azure

  1. Navigate to the Microsoft Azure dashboard and log in. Review Azure’s quickstart guide to get started.
  2. Create an application through Microsoft Entra ID. Review Azure’s help article. This is where you’ll get a client ID and tenant ID, which you’ll need for Redox setup later.
  3. Create a new secret for your application. This is where you’ll get the client secret value, which you’ll need for Redox setup later.
  4. Create a new storage account. Set the primary service to the Data Lake Storage option.
  5. Add a new container. You’ll need the name of the container for Redox setup later.
  6. Assign an appropriate Blob Data role with at least write permissions (e.g., Storage Blob Data Contributor) to the application you created in step #2.
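
As a sanity check after finishing the Azure steps, the values you collected can be grouped in one place. The sketch below is purely illustrative (not part of any Azure or Redox tooling); it also derives the token endpoint URL that Redox setup needs from the tenant ID:

```python
from dataclasses import dataclass

@dataclass
class AzureSetupValues:
    """Values collected during Azure configuration (field names are illustrative)."""
    tenant_id: str             # from the Entra ID application (step 2)
    client_id: str             # from the Entra ID application (step 2)
    client_secret: str         # from the application secret (step 3)
    storage_account_name: str  # from the storage account (step 4)
    container_name: str        # from the container (step 5)

    @property
    def token_endpoint_url(self) -> str:
        # Structure Redox expects for the OAuth 2.0 two-legged credential.
        return f"https://login.microsoftonline.com/{self.tenant_id}/oauth2/v2.0/token"
```

Keeping these five values together makes the later Redox dashboard or API setup a straight copy-paste exercise.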

Create a cloud destination in Redox

Next, create a cloud destination in your Redox organization. When the EHR system sends healthcare data to Redox, we push it on to your configured Azure + Databricks cloud destination.

In the dashboard

  1. From the Product type field, select Databricks.
  2. For the configure destination step, populate these fields. Then click the Next button.
    • Storage account name: Enter the name of the storage account you created in Azure. Locate this value in the Azure dashboard.
    • Container name: Enter the name of the container you created in Azure. Locate this value in the Azure container configuration.
    • File name prefix (optional): Enter any prefix you want prepended to new files when they’re created in the Data Lake container. Add / to put the files in a subdirectory. For example, redox/ puts all the files in the redox directory.
  3. For the auth credential step, either a drop-down list of existing auth credentials displays or a new auth credential form opens. Learn how to create an auth credential for OAuth 2.0 Two-legged.
    Token endpoint URL

    To avoid running into validation errors when sending data to your Azure cloud destination, your auth credential should have a token endpoint URL that matches this structure:

    https://login.microsoftonline.com/<tenant id from azure console step 2>/oauth2/v2.0/token

    Existing or new auth credential

    Your existing auth credentials will only display if they’re supported for the cloud product type you selected. If you don’t have any supported auth credentials for the cloud type in the current Redox environment, you’ll have to create a new auth credential.
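
Because a mismatched token endpoint URL leads to validation errors when sending data, it can help to check the URL before saving the credential. A minimal check against the structure shown above (this helper is an illustration, not a Redox feature):

```python
import re

# Matches https://login.microsoftonline.com/<tenant id>/oauth2/v2.0/token
TOKEN_URL_PATTERN = re.compile(
    r"^https://login\.microsoftonline\.com/[^/]+/oauth2/v2\.0/token$"
)

def is_valid_token_endpoint(url: str) -> bool:
    """Return True if the URL matches the structure Redox expects."""
    return TOKEN_URL_PATTERN.fullmatch(url) is not None
```
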

With the Redox Platform API

  1. In your terminal, prepare the /v1/authcredentials request.
  2. Specify these values in the request.
    • Locate the clientId and clientSecret values in the Microsoft Azure dashboard.
      Example: Create auth credential for Azure + Databricks

      curl 'https://api.redoxengine.com/platform/v1/authcredentials' \
        --request POST \
        --header "Authorization: Bearer $API_TOKEN" \
        --header 'accept: application/json' \
        --header 'content-type: application/json' \
        --data '{
          "organization": "<Redox_organization_id>",
          "name": "<human_readable_name_for_auth_credential>",
          "environmentId": "<Redox_environment_ID>",
          "authStrategy": "OAuth_2.0_2-legged",
          "url": "https://login.microsoftonline.com/<tenant id from Azure console step 2>/oauth2/v2.0/token",
          "grantType": "client_credentials",
          "clientId": "<client_id_from_Azure>",
          "keyId": "<client_secret_from_Azure>",
          "scope": "https://storage.azure.com/.default"
        }'
  3. You should get a successful 200 response and a payload populated with the details of the new auth credential.
  4. In your terminal, prepare the /v1/environments/{environmentId}/destinations request.
  5. Specify these values in the request.
    • Set authCredential to the auth credential ID from the response you received in step #3.
    • Populate cloudProviderSettings with the settings below (adjust values based on the storage account and container setup in Azure configuration).
      • Enter databricks as the productId.
      • The fileNamePrefix is optional. If specified, each filename will be the prefix you define followed by the file path in the Data Lake container. You can append / after the prefix name to indicate a directory path.
        Example: Values for Azure + Databricks cloudProviderSettings

        {
          "cloudProviderSettings": {
            "typeId": "azure",
            "productId": "databricks",
            "settings": {
              "storageAccountName": "<storage_account_name_from_Azure>",
              "containerName": "<container_name_from_Azure>",
              "fileNamePrefix": "<optional_file_name_prefix>"
            }
          }
        }
  6. You should get a successful 200 response with a payload populated with the details of the new Microsoft Azure + Databricks cloud destination. Specifically, the verified status of the destination should be set to true.
  7. Your new destination will now be able to receive messages. Redox pushes data to the Data Lake storage account as a JSON file, which is ingested into Microsoft Azure.
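
If you script the two requests above, the request bodies can be assembled as plain dictionaries first. This is a hedged sketch: the field names and fixed values come from the examples in this section, while the helper functions themselves are hypothetical, not a Redox SDK:

```python
def auth_credential_payload(organization, name, environment_id,
                            tenant_id, client_id, client_secret):
    """Body for POST /platform/v1/authcredentials (fields per the example above)."""
    return {
        "organization": organization,
        "name": name,
        "environmentId": environment_id,
        "authStrategy": "OAuth_2.0_2-legged",
        "url": f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        "grantType": "client_credentials",
        "clientId": client_id,
        "keyId": client_secret,   # per the example, the client secret goes in keyId
        "scope": "https://storage.azure.com/.default",
    }

def cloud_provider_settings(storage_account_name, container_name,
                            file_name_prefix=None):
    """cloudProviderSettings for the destination request (fields per the example above)."""
    settings = {
        "storageAccountName": storage_account_name,
        "containerName": container_name,
    }
    if file_name_prefix is not None:
        settings["fileNamePrefix"] = file_name_prefix
    return {
        "cloudProviderSettings": {
            "typeId": "azure",
            "productId": "databricks",
            "settings": settings,
        }
    }
```

Building the payloads this way keeps the tenant ID, client ID, and client secret in one place, so the auth credential and destination requests stay consistent with each other.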

FHIR® is a registered trademark of Health Level Seven International (HL7) and is used with the permission of HL7. Use of this trademark does not constitute an endorsement of products/services by HL7®.