Azure DataForge Workspace Setup Requirements

If you do not already have an Azure Databricks Workspace, sign up for a DataForge trial for easy setup.

To set up a new DataForge Workspace using your existing Azure Databricks Workspace, you will need:

  • An Azure Data Lake Storage Gen2 account with at least one container
  • A mount set up in the Databricks workspace that has access to the container in the ADLS Gen2 account
  • An App Registration in your Azure subscription that has the Contributor role and the following Microsoft Graph API permissions. These permissions are added under App Registration -> API Permissions -> Add a Permission -> Microsoft Graph.
    • Application.ReadWrite.All
    • Directory.ReadWrite.All
  • Open the Quotas page in the Azure Portal, filter Region to the region you will use for your Databricks and DataForge environment, and request an increase for the following quotas. Quota increases do not increase your cost, but lower quotas mean fewer jobs can run concurrently in DataForge and Databricks and may result in job failures.
    • Total Regional vCPUs - increase to 100 (15 is the bare minimum)
    • Standard DSv3 Family vCPUs - increase to 100 (15 is the bare minimum)
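
The mount in the second bullet is typically created from a Databricks notebook using OAuth credentials from the App Registration (service principal). A minimal sketch, assuming a hypothetical storage account `mystorageacct`, container `dataforge`, and placeholder client credentials; substitute your own values:

```python
# OAuth configuration for mounting an ADLS Gen2 container through the
# App Registration (service principal). All names below are examples.
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<application-client-id>",
    # In practice, read the secret from a Databricks secret scope instead:
    # dbutils.secrets.get(scope="<scope>", key="<key>")
    "fs.azure.account.oauth2.client.secret": "<client-secret>",
    "fs.azure.account.oauth2.client.endpoint":
        "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
}

# Run inside a Databricks notebook, where `dbutils` is defined:
# dbutils.fs.mount(
#     source="abfss://dataforge@mystorageacct.dfs.core.windows.net/",
#     mount_point="/mnt/dataforge",
#     extra_configs=configs,
# )
```

Once mounted, the container is reachable from any cluster in the workspace at the chosen mount point (here `/mnt/dataforge`).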
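
The App Registration and quota checks can also be done with the Azure CLI instead of the Portal. A sketch, assuming you are logged in with `az login` and have rights to create app registrations; the display name and region are examples, and the two GUIDs are the published Microsoft Graph app-role IDs for `Application.ReadWrite.All` and `Directory.ReadWrite.All` (verify them against your tenant before use):

```shell
LOCATION="eastus"  # example region; use your DataForge/Databricks region

# Create the App Registration and capture its application (client) ID.
APP_ID=$(az ad app create --display-name "dataforge-app" --query appId -o tsv)

# Add the two Microsoft Graph application permissions.
# 00000003-0000-0000-c000-000000000000 is the well-known Microsoft Graph appId.
az ad app permission add --id "$APP_ID" \
  --api 00000003-0000-0000-c000-000000000000 \
  --api-permissions \
    1bfefb4e-e0b5-418b-a88f-73c46d2cc8e9=Role \
    19dbc75e-c2e2-444c-a770-ec69d8559fc7=Role

# Application permissions require admin consent to take effect.
az ad app permission admin-consent --id "$APP_ID"

# Check current vCPU usage against the regional limits mentioned above.
az vm list-usage --location "$LOCATION" -o table | grep -E "Total Regional|DSv3"
```

The Contributor role assignment and the quota increase request themselves still go through the Portal (or `az role assignment create` and a support request) and are not shown here.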
