# Storing compute logs in Azure Blob Storage/Azure Data Lake Storage
In this guide, we'll walk through how to store compute logs in Azure Blob Storage or Azure Data Lake Storage. This guide assumes you have already set up an Azure Kubernetes Service (AKS) agent and deployed user code in Azure Container Registry (ACR).
This guide focuses on Azure Blob Storage, but the same steps should also work for Azure Data Lake Storage.
## Prerequisites

To complete the steps in this guide, you'll need:

- The Azure CLI installed on your machine. You can download it here.
- An Azure account with the ability to create resources in Azure Blob Storage or Azure Data Lake Storage.
- An Azure container in Azure Blob Storage or Azure Data Lake Storage where you want to store logs.
- Either the `quickstart_etl` module from the hybrid quickstart repo, or any other successfully imported code location containing at least one asset or job that will generate logs for you to test against.
## Step 1: Give AKS agent access to blob storage account
We need to ensure that the AKS agent has the necessary permissions to write logs to Azure Blob Storage or Azure Data Lake Storage. We'll do this with a few Azure CLI commands.
First, we'll enable the cluster's OIDC issuer and workload identity. This will allow the AKS agent to use a managed identity to access Azure resources.

```shell
az aks update --resource-group <resource-group> --name <cluster-name> --enable-oidc-issuer --enable-workload-identity
```
Then, we'll create a new managed identity for the AKS agent and a new service account in our AKS cluster, as sketched below.
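The exact commands depend on your setup; the following is a rough sketch using placeholder names (`agent-identity`, `dagster-agent-sa`, and the `dagster-agent` namespace are assumptions — substitute your own). It creates the managed identity, a Kubernetes service account annotated with the identity's client ID, a federated credential tying the two together, and a role assignment granting the identity write access to your storage account:

```shell
# Look up the cluster's OIDC issuer URL (needed for the federated credential)
OIDC_ISSUER=$(az aks show --resource-group <resource-group> --name <cluster-name> \
  --query "oidcIssuerProfile.issuerUrl" -o tsv)

# Create a managed identity for the agent
az identity create --resource-group <resource-group> --name agent-identity

# Create a Kubernetes service account annotated with the identity's client ID
CLIENT_ID=$(az identity show --resource-group <resource-group> --name agent-identity \
  --query clientId -o tsv)
kubectl create serviceaccount dagster-agent-sa --namespace dagster-agent
kubectl annotate serviceaccount dagster-agent-sa --namespace dagster-agent \
  azure.workload.identity/client-id="$CLIENT_ID"

# Federate the managed identity with the service account
az identity federated-credential create --name dagster-agent-federated-id \
  --identity-name agent-identity --resource-group <resource-group> \
  --issuer "$OIDC_ISSUER" \
  --subject system:serviceaccount:dagster-agent:dagster-agent-sa

# Grant the identity write access to the storage account
PRINCIPAL_ID=$(az identity show --resource-group <resource-group> --name agent-identity \
  --query principalId -o tsv)
az role assignment create --assignee "$PRINCIPAL_ID" \
  --role "Storage Blob Data Contributor" \
  --scope <storage-account-resource-id>
```

Note that for the token to be injected, the agent pods must run as this service account and carry the `azure.workload.identity/use: "true"` label; how you set that depends on your agent's Helm values.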
If everything is set up correctly, you should be able to run the following command and see an access token returned:
```shell
kubectl exec -n dagster-agent -it <pod-in-cluster> -- bash

# in the pod
curl -H "Metadata:true" "http://169.254.169.254/metadata/identity/oauth2/token?resource=https://storage.azure.com/"
```
## Step 2: Configure Dagster to use Azure Blob Storage
Now, you need to update the Helm values to use Azure Blob Storage for logs. You can do this by editing the `values.yaml` file for your user-cloud deployment.
Pull down the current values for your deployment:
```shell
helm get values user-cloud > current-values.yaml
```
Then, edit the `current-values.yaml` file to configure a compute log manager that writes to your Azure Blob Storage container.
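The exact values block isn't reproduced here, but a minimal sketch follows, assuming the `AzureBlobComputeLogManager` from the `dagster-azure` package; the `computeLogs` key layout, `mystorageaccount`, and `mycontainer` are assumptions to check against your chart's documented values:

```yaml
# Sketch: custom compute log manager for the agent's user-code deployments.
# The computeLogs key layout is an assumption; check your chart's values schema.
computeLogs:
  enabled: true
  custom:
    module: dagster_azure.blob.compute_log_manager
    class: AzureBlobComputeLogManager
    config:
      storage_account: mystorageaccount # assumption: your storage account name
      container: mycontainer            # assumption: your container name
      default_azure_credential:
        exclude_environment_credential: false
      prefix: dagster-logs
      local_dir: "/tmp/dagster-logs"
      upload_interval: 30
```

After saving, apply the updated values with `helm upgrade` (the chart name `dagster-cloud/dagster-cloud-agent` is an assumption; use whichever chart your release was installed from):

```shell
helm upgrade user-cloud dagster-cloud/dagster-cloud-agent -f current-values.yaml
```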
## Step 3: Verify logs are being written to Azure Blob Storage
It's time to kick off a run in Dagster to test your new configuration. If you're following along with the quickstart repo, you can kick off a run of the `all_assets_job`, which will generate logs for you to test against; otherwise, use any job that emits logs. When you open the stdout/stderr view on the run page, you should see a log file that directs you to the Azure Blob Storage container.
Whether the URL is clickable depends on whether your logs are public or private. If they're private, clicking the link directly won't work; instead, use the Azure CLI or the Azure Portal to access the logs at that URL.
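For example, you could fetch a log file with the Azure CLI like this (a sketch; the container name and blob path are placeholders taken from the URL shown on the run page):

```shell
az storage blob download \
  --account-name <storage-account> \
  --container-name <container> \
  --name <path/to/log/blob> \
  --file ./compute.log \
  --auth-mode login
```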