Quick Start Guide
This guide gets you up and running with the DASL client in just a few minutes. All examples are designed to run in Databricks notebooks and use the automatic authentication available in that environment. We’ll show you how to authenticate to DASL and how to list, create, and update datasources and detection rules.
Prerequisites
DASL client installed (see Installation)
Access to a Databricks workspace with DASL enabled
A Databricks notebook environment (all examples assume you’re running in Databricks)
Basic familiarity with Python
Authentication
The DASL client uses automatic authentication in Databricks notebooks:
from dasl_client import Client
# Automatically detects Databricks context and authenticates
client = Client.for_workspace()
print("Connected to DASL!")
This method:
Works in any Databricks notebook
Uses your current workspace’s authentication context
Requires no additional configuration
Is the most secure option for notebook environments
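Because Client.for_workspace() reads its credentials from the notebook’s context, it can fail when that context is missing (for example, if the cell is run outside Databricks). A minimal sketch of a guarded connection, assuming the constructor raises an exception on failure:
from dasl_client import Client

# Guarded connection: surfaces a clear message if the notebook context
# is unavailable (assumes for_workspace() raises on failure)
try:
    client = Client.for_workspace()
    print("Connected to DASL!")
except Exception as e:
    print(f"Could not authenticate to DASL: {e}")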
Explore Your Workspace
Let’s start by exploring what’s already in your DASL workspace.
Note
Run the following code in a Databricks notebook cell:
# List all datasources in your workspace
print("Existing datasources:")
for datasource in client.list_datasources():
    print(f" - {datasource.metadata.name}")
# List all rules
print("\nExisting rules:")
for rule in client.list_rules():
    print(f" - {rule.metadata.name}")
# Check workspace configuration
config = client.get_config()
print(f"\nWorkspace: {config.metadata.workspace}")
print(f"Default SQL Warehouse: {config.default_sql_warehouse}")
Create Your First Datasource
Now let’s create a simple datasource using a preset:
from dasl_client import DataSource, Schedule, BronzeSpec
# Define a datasource for AWS Security Lake data
datasource = DataSource(
    source="aws_sec_lake",
    source_type="s3_data",
    schedule=Schedule(at_least_every="24h", enabled=True),
    use_preset="aws_sec_lake_s3_data",
    bronze=BronzeSpec(
        bronze_table="my_aws_data_bronze",
        skip_bronze_loading=False
    ),
    autoloader=DataSource.Autoloader(
        location="s3://your-bucket/path/to/data/"
    )
)
# Create the datasource
try:
    created_ds = client.create_datasource("My AWS Data", datasource)
    print(f"Successfully created datasource: {created_ds.metadata.name}")
except Exception as e:
    print(f"Error creating datasource: {e}")
Retrieve and Update a Datasource
You can retrieve and modify datasources after creation:
# Get the datasource we just created
my_datasource = client.get_datasource("My AWS Data")
print(f"Retrieved: {my_datasource.metadata.name}")
print(f"Status: {my_datasource.status}")
# Change the schedule, then replace the datasource
# (replace_datasource sends back the entire definition)
my_datasource.schedule.at_least_every = "12h"
updated_ds = client.replace_datasource("My AWS Data", my_datasource)
print("Updated schedule to run every 12 hours")
Create a Detection Rule
Next, create a rule that flags blocked HTTP activity:
from datetime import datetime
from dasl_client import Rule, Schedule

# Create a simple rule to detect blocked HTTP activity
rule = Rule(
    schedule=Schedule(
        at_least_every="2h",
        enabled=True,
    ),
    input=Rule.Input(
        stream=Rule.Input.Stream(
            tables=[
                Rule.Input.Stream.Table(name="http_activity"),
            ],
            filter="disposition = 'Blocked'",
            starting_timestamp=datetime(2025, 7, 8, 16, 47, 30),
        ),
    ),
    output=Rule.Output(
        summary="record was blocked",
    ),
)
try:
    created_rule = client.create_rule("Detect Blocked HTTP Activity", rule)
    print(f"Successfully created rule: {created_rule.metadata.name}")
except Exception as e:
    print(f"Error creating rule: {e}")
Retrieve and Update a Detection Rule
You can retrieve and modify rules after creation:
# Get the rule we just created
my_rule = client.get_rule("Detect Blocked HTTP Activity")
print(f"Retrieved: {my_rule.metadata.name}")
print(f"Status: {my_rule.status}")
# Change the filter, then replace the rule
# (replace_rule sends back the entire definition)
my_rule.input.stream.filter = "disposition = 'Denied'"
updated_rule = client.replace_rule("Detect Blocked HTTP Activity", my_rule)
print("Updated rule filter")
Next Steps
Now you’re ready to dive deeper:
Master datasource management: Datasources
Explore rule creation: Rules
Browse the full API: API Reference