Lakecode reads conventions files to understand your team’s standards and preferences. There are two levels:
- `~/.lakecode/conventions.md` applies to all projects on your machine.
- `.lakecode/conventions.md` in your project root applies only to that project.

Project conventions override global ones. Conventions are written in plain Markdown. Lakecode reads them before every command to tailor its behavior. Example:
```markdown
# Team Conventions

## Default catalog and schema
Always use `analytics.prod` as the default catalog.schema unless specified.

## Naming patterns
- Tables: snake_case, prefixed with source (e.g., stripe_charges)
- Jobs: kebab-case with team prefix (e.g., data-eng-daily-refresh)

## Preferred warehouse
Use the `analytics-medium` SQL warehouse for ad-hoc queries.

## Standards
- All tables must have a `_loaded_at` timestamp column
- Partition by date for tables over 1B rows
- Tag all jobs with team name and SLA tier
```
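The two-level lookup described above can be sketched as a small loader that concatenates the global file and the project file, with the project file last so its guidance wins on conflict. This is an illustrative sketch, assuming those two paths; Lakecode's actual merge logic may differ:

```python
from pathlib import Path


def load_conventions(project_root: Path, home: Path) -> str:
    """Concatenate global and project conventions files.

    The project file is appended last so that, when the two disagree,
    its guidance takes precedence. (Hypothetical sketch, not Lakecode's
    actual implementation.)
    """
    parts = []
    for path in (
        home / ".lakecode" / "conventions.md",       # global, all projects
        project_root / ".lakecode" / "conventions.md",  # project-specific
    ):
        if path.exists():
            parts.append(path.read_text())
    return "\n\n".join(parts)
```

Missing files are simply skipped, so the loader works whether you have one level configured, both, or neither.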
Lakecode ships with a library of platform-specific skills (Unity Catalog, Delta Lake, Snowflake Streams, Tasks, Cortex, and more). When you run a command, Lakecode detects relevant keywords and injects the appropriate skills into the AI context. This means the AI always has deep, current knowledge about the platform features involved in your query — without you configuring anything.
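Keyword-driven skill injection might look something like the sketch below. The skill names and keyword lists here are purely illustrative; Lakecode's real skill library and matching rules are not documented in this section:

```python
# Hypothetical skill registry: maps a skill name to trigger keywords.
# These names and keywords are made up for illustration.
SKILLS = {
    "unity-catalog": ["catalog", "schema", "grant", "unity"],
    "delta-lake": ["delta", "merge", "optimize", "vacuum"],
    "snowflake-streams": ["stream", "task", "cdc"],
}


def select_skills(prompt: str) -> list[str]:
    """Return the skills whose keywords appear in the user's prompt."""
    text = prompt.lower()
    return [
        skill
        for skill, keywords in SKILLS.items()
        if any(keyword in text for keyword in keywords)
    ]
```

A prompt like "OPTIMIZE my Delta table weekly" would match the `delta-lake` skill, whose documentation would then be injected into the AI context alongside the request.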
Lakecode uses YAML config files at two levels:
- `~/.lakecode/config.yml` applies to all projects.
- `./lakecode.yml` in your project root applies only to that project.

Project config overrides global. You can also set `LAKECODE_CONFIG_PATH` to point to an external config file. Example:
```yaml
# lakecode.yml
platform: databricks # or "snowflake"

databricks:
  profile: DEFAULT # Databricks CLI profile name
  default_catalog: analytics
  default_schema: prod

snowflake:
  connection: default # connection name from ~/.snowflake/config.toml
  warehouse: ANALYTICS_WH
  database: PROD
  role: DATA_ENG
```
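The "project overrides global" rule amounts to a recursive dictionary merge in which project keys win. Here is a minimal sketch of that precedence, assuming both files have been parsed into dicts; this is not Lakecode's source code:

```python
def merge_config(global_cfg: dict, project_cfg: dict) -> dict:
    """Merge two parsed config dicts; project keys win over global ones.

    Nested sections (e.g. the `databricks:` block) are merged key by key
    rather than replaced wholesale. (Sketch of the documented precedence,
    not Lakecode's actual implementation.)
    """
    merged = dict(global_cfg)
    for key, value in project_cfg.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = merge_config(merged[key], value)
        else:
            merged[key] = value
    return merged
```

So a project file that sets only `databricks.default_catalog` would override that one key while inheriting `profile` and everything else from the global config.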
Lakecode uses standard environment variables for workspace connectivity. Environment variables take precedence over config files.
- `DATABRICKS_HOST` — Your workspace URL (e.g., https://dbc-abc123.cloud.databricks.com)
- `DATABRICKS_TOKEN` — A personal access token for your workspace
- `LAKECODE_PROFILE` — Databricks CLI profile name (default: `DEFAULT`)
- `LAKECODE_TARGET` — Target workspace for bundle operations
- `LAKECODE_SNOWFLAKE_CONNECTION` — Connection name from `~/.snowflake/config.toml` (default: `default`)
- `LAKECODE_SNOWFLAKE_ACCOUNT` — Your Snowflake account identifier
- `LAKECODE_PLATFORM` — Platform: `databricks` or `snowflake`
- `ANTHROPIC_API_KEY` — (Optional) Your own Anthropic key for BYOK mode

If you have a Databricks CLI profile (`~/.databrickscfg`) or a Snowflake connection (`~/.snowflake/config.toml`), Lakecode will use it automatically.
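The env-over-config precedence can be expressed as a tiny lookup helper: check the environment first, then fall back to the config file. The function name and signature below are illustrative, not part of Lakecode:

```python
import os
from typing import Optional


def resolve(setting: str, env_var: str, config: dict) -> Optional[str]:
    """Resolve a setting with the documented precedence:
    environment variable first, then the config-file value.
    (Illustrative helper, not Lakecode's API.)
    """
    return os.environ.get(env_var) or config.get(setting)
```

For example, with `platform: databricks` in `lakecode.yml`, exporting `LAKECODE_PLATFORM=snowflake` would switch the effective platform without touching the file.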