backnotprop and tmccoy14 committed
Commit 809c27b (1 parent: b82e778)

Update frameworks/dasf/controls.json (#2)


- Update frameworks/dasf/controls.json (8662843116c16af299b78039531f74b4c50daf08)


Co-authored-by: Tucker McCoy <tmccoy14@users.noreply.huggingface.co>

Files changed (1)
  1. frameworks/dasf/controls.json +31 -31
frameworks/dasf/controls.json CHANGED
@@ -5,7 +5,7 @@
   "description": "Implementing single sign-on with an identity provider\u2019s (IdP) multi-factor authentication is critical for secure authentication. It adds an extra layer of security, ensuring that only authorized users access the Databricks Platform.",
   "controlCategory": "Configuration",
   "readableControlId": "DASF 1",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -14,7 +14,7 @@
   "description": "Synchronizing users and groups from your identity provider (IdP) with Databricks using the SCIM standard facilitates consistent and automated user provisioning for enhancing security.",
   "controlCategory": "Configuration",
   "readableControlId": "DASF 2",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -32,7 +32,7 @@
   "description": "Use AWS PrivateLink, Azure Private Link or GCP Private Service Connect to create a private network route between the customer and the Databricks control plane or the control plane and the customer\u2019s compute plane environments to enhance data security by avoiding public internet exposure.",
   "controlCategory": "Configuration",
   "readableControlId": "DASF 4",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -59,7 +59,7 @@
   "description": "Databricks Delta Live Tables (DLT) simplifies ETL development with declarative pipelines that integrate quality control checks and performance monitoring.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 7",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -68,7 +68,7 @@
   "description": "Databricks supports customer-managed encryption keys to strengthen data at rest protection and greater access control.",
   "controlCategory": "Configuration",
   "readableControlId": "DASF 8",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -77,7 +77,7 @@
   "description": "Databricks supports TLS 1.2+ encryption to protect customer data during transit. This applies to data transfer between the customer and the Databricks control plane and within the compute plane. Customers can also secure inter-cluster communications within the compute plane per their security requirements.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 9",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -86,7 +86,7 @@
   "description": "Store data in a lakehouse architecture using Delta tables. Delta tables can be versioned to revert any user\u2019s or malicious actor\u2019s poisoning of data. Data can be stored in a lakehouse architecture in the customer\u2019s cloud account. Both raw data and feature tables are stored as Delta tables with access controls to determine who can read and modify them. Data lineage with UC helps track and audit changes and the origin of ML data sources. Each operation that modifies a Delta Lake table creates a new table version. User actions are tracked and audited, and lineage of transformations is available all in the same platform. You can use history information to audit operations, roll back a table or query a table at a specific point in time using time travel.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 10",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -95,7 +95,7 @@
   "description": "Unity Catalog tracks and visualizes real-time data lineage across all languages to the column level, providing a traceable history of an object from notebooks, workflows, models and dashboards. This enhances transparency and compliance, with accessibility provided through the Catalog Explorer.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 11",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -122,7 +122,7 @@
   "description": "Databricks auditing, enhanced by Unity Catalog\u2019s events, delivers fine-grained visibility into data access and user activities. This is vital for robust data governance and security, especially in regulated industries. It enables organizations to proactively identify and manage overentitled users, enhancing data security and ensuring compliance.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 14",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -131,7 +131,7 @@
   "description": "Iteratively explore, share and prep data for the machine learning lifecycle by creating reproducible, editable and shareable datasets, tables and visualizations. Within Databricks this EDA process can be accelerated with Mosaic AI AutoML. AutoML not only generates baseline models given a dataset, but also provides the underlying model training code in the form of a Python notebook. Notably for EDA, AutoML calculates summary statistics on the provided dataset, creating a notebook for the data scientist to review and adapt.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 15",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -140,7 +140,7 @@
   "description": "Databricks Feature Store is a centralized repository that enables data scientists to find and share features and also ensures that the same code used to compute the feature values is used for model training and inference. Unity Catalog\u2019s capabilities, such as security, lineage, table history, tagging and cross-workspace access, are automatically available to the feature table to reduce the risk of malicious actors manipulating the features that feed into ML training.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 16",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -167,7 +167,7 @@
   "description": "Databricks includes a managed version of MLflow featuring enterprise security controls and high availability. It supports functionalities like experiments, run management and notebook revision capture. MLflow on Databricks allows tracking and measuring machine learning model training runs, logging model training artifacts and securing machine learning projects.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 19",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -185,7 +185,7 @@
   "description": "Databricks Lakehouse Monitoring offers a single pane of glass to centrally track tables\u2019 data quality and statistical properties and automatically classifies data. It can also track the performance of machine learning models and model serving endpoints by monitoring inference tables containing model inputs and predictions through a single pane of glass.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 21",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -194,7 +194,7 @@
   "description": "Harnessing internal data and intellectual property to customize large AI models can offer a significant competitive edge. However, this process can be complex, involving coordination across various parts of the organization. The Data Intelligence Platform addresses this challenge by integrating data across traditionally isolated departments and systems. This integration facilitates a more cohesive data and AI strategy, enabling the effective training, testing and evaluation of models using a comprehensive dataset. Use caution when preparing data for traditional models and GenAI training to ensure that you are not unintentionally including data that causes legal conflicts, such as copyright violations, privacy violations or HIPAA violations.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 22",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -203,7 +203,7 @@
   "description": "MLflow Model Registry supports managing the machine learning model lifecycle with capabilities for lineage tracking, versioning, staging and model serving.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 23",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -212,7 +212,7 @@
   "description": "Organizations commonly encounter challenges in tracking and controlling access to ML models, auditing their usage, and understanding their evolution in complex machine learning workflows. Unity Catalog integrates with the MLflow Model Registry across model lifecycles. This approach simplifies the management and oversight of ML models, proving particularly valuable in environments with multiple teams and diverse projects.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 24",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -239,7 +239,7 @@
   "description": "Data is your competitive advantage. Use it to customize large AI models to beat your competition by pretraining models with your data, imbuing the model with domain-specific knowledge, vocabulary and semantics. Pretrain your own LLM with MosaicML to own your IP.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 27",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -248,7 +248,7 @@
   "description": "Model aliases in machine learning workflows allow you to assign a mutable, named reference to a specific version of a registered model. This functionality is beneficial for tracking and managing different stages of a model\u2019s lifecycle, indicating the current deployment status of any given model version.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 28",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -257,7 +257,7 @@
   "description": "The lakehouse forms the foundation of a data-centric AI platform. Key to this is the ability to manage both data and AI assets from a unified governance solution on the lakehouse. Databricks Unity Catalog enables this by providing centralized access control, auditing, approvals, model workflow, lineage, and data discovery capabilities across Databricks workspaces. These benefits are now extended to MLflow Models with the introduction of Models in Unity Catalog. Through providing a hosted version of the MLflow Model Registry in Unity Catalog, the full lifecycle of an ML model can be managed while leveraging Unity Catalog\u2019s capability to share assets across Databricks workspaces and trace lineage across both data and models.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 29",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -284,7 +284,7 @@
   "description": "External models are third-party models hosted outside of Databricks. Supported by Model Serving AI Gateway, Databricks external models via the AI Gateway allow you to streamline the usage and management of various large language model (LLM) providers, such as OpenAI and Anthropic, within an organization. You can also use Mosaic AI Model Serving as a provider to serve predictive ML models, which offers rate limits for those endpoints. As part of this support, Model Serving offers a high-level interface that simplifies the interaction with these services by providing a unified endpoint to handle specific LLM-related requests. In addition, Databricks support for external models provides centralized credential management. By storing API keys in one secure location, organizations can enhance their security posture by minimizing the exposure of sensitive API keys throughout the system. It also helps to prevent exposing these keys within code or requiring end users to manage keys safely.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 32",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -293,7 +293,7 @@
   "description": "Databricks Secrets stores your credentials and references them in notebooks, scripts, configuration properties and jobs. Integrating with heterogeneous systems requires managing a potentially large set of credentials and safely distributing them across an organization. Instead of directly entering your credentials into a notebook, use Databricks Secrets to store your credentials and reference them in notebooks and jobs to prevent credential leaks through models. Databricks secret management allows users to use and share credentials within Databricks securely. You can also choose to use a third-party secret management service, such as AWS Secrets Manager or a third-party secret manager.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 33",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -302,7 +302,7 @@
   "description": "Databricks Serverless Compute provides a secure-by-design model serving service featuring defense-in-depth controls like dedicated VMs, network segmentation, and encryption for data in transit and at rest. It adheres to the principle of least privilege for enhanced security.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 34",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -311,7 +311,7 @@
   "description": "Databricks Lakehouse Monitoring provides performance metrics and data quality statistics across all account tables. It tracks the performance of machine learning models and model serving endpoints by observing inference tables with model inputs and predictions.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 35",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -347,7 +347,7 @@
   "description": "Databricks has established a formal incident response plan that outlines key elements such as roles, responsibilities, escalation paths and external communication protocols. The platform handles over 9TB of audit logs daily, aiding customer and Databricks security investigations. A dedicated security incident response team operates an internal Databricks instance, consolidating essential log sources for thorough security analysis. Databricks ensures continual operational readiness with a 24/7/365 on-call rotation. Additionally, a proactive hunting program and a specialized detection team support the incident response program.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 39",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -383,7 +383,7 @@
   "description": "Databricks access control lists (ACLs) enable you to configure permissions for accessing and interacting with workspace objects, including folders, notebooks, experiments, models, clusters, pools, jobs, Delta Live Tables pipelines, alerts, dashboards, queries and SQL warehouses.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 43",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -392,7 +392,7 @@
   "description": "Webhooks in the MLflow Model Registry enable you to automate machine learning workflow by triggering actions in response to specific events. These webhooks facilitate seamless integrations, allowing for the automatic execution of various processes. For example, webhooks are used for: CI workflow trigger (Validate your model automatically when creating a new version), Team notifications (Send alerts through a messaging app when a model stage transition request is received), Model fairness evaluation (Invoke a workflow to assess model fairness and bias upon a production transition request), and Automated deployment (Trigger a deployment pipeline when a new tag is created on a model).",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 44",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -401,7 +401,7 @@
   "description": "Model evaluation is a critical component of the machine learning lifecycle. It provides data scientists with the tools to measure, interpret and explain the performance of their models. MLflow plays a critical role in accelerating model development by offering insights into the reasons behind a model's performance and guiding improvements and iterations. MLflow offers many industry-standard native evaluation metrics for classical machine learning algorithms and LLMs, and also facilitates the use of custom evaluation metrics.",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 45",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -410,7 +410,7 @@
   "description": "Mosaic AI Vector Search is a vector database that is built into the Databricks Data Intelligence Platform and integrated with its governance and productivity tools. A vector database is a database that is optimized to store and retrieve embeddings. Embeddings are mathematical representations of the semantic content of data, typically text or image data. Embeddings are usually generated by feature extraction models for text, image, audio or multi-modal data, and are a key component of many GenAI applications that depend on finding documents or images that are similar to each other. Examples are RAG systems, recommender systems, and image and video recognition. Databricks implements the following security controls to protect your data: Every customer request to Vector Search is logically isolated, authenticated and authorized, and Mosaic AI Vector Search encrypts all data at rest (AES-256) and in transit (TLS 1.2+).",
   "controlCategory": "Implementation",
   "readableControlId": "DASF 46",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -446,7 +446,7 @@
   "description": "Develop your solutions on a platform created using some of the most rigorous security and compliance standards in the world. Get independent audit reports verifying that Databricks adheres to security controls for ISO 27001, ISO 27018, SOC 1, SOC 2, FedRAMP, HITRUST, IRAP, etc.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 50",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -455,7 +455,7 @@
   "description": "Databricks Delta Sharing lets you share data and AI assets securely in Databricks with users outside your organization, whether those users use Databricks or not.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 51",
- "severity": "medium",
+ "severity": "low",
   "automationPlatforms": ["azure_databricks"]
   },
   {
@@ -464,7 +464,7 @@
   "description": "Databricks' Git Repository integration supports effective code and third-party libraries management, enhancing customer control over their development environment.",
   "controlCategory": "Out-of-the-box",
   "readableControlId": "DASF 52",
- "severity": "medium",
+ "severity": "high",
   "automationPlatforms": ["azure_databricks"]
   },
   {
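For reviewers who want to sanity-check the result, below is a minimal validation sketch. It is not part of this commit, and it makes two assumptions the diff does not confirm: that controls.json is a top-level JSON array of control objects, and that the severity vocabulary is exactly the values visible in these hunks ("low", "medium", "high").

# Hypothetical sketch, not part of this commit: verify that every control in
# frameworks/dasf/controls.json has a severity drawn from the values seen in
# this diff, and that each control carries a readableControlId.
import json

# Assumed vocabulary, inferred only from the "-"/"+" lines above.
ALLOWED_SEVERITIES = {"low", "medium", "high"}

with open("frameworks/dasf/controls.json") as f:
    controls = json.load(f)  # assumes the root is a JSON array of objects

for control in controls:
    control_id = control.get("readableControlId")
    if not control_id:
        raise ValueError(f"control missing readableControlId: {control!r}")
    severity = control.get("severity")
    if severity not in ALLOWED_SEVERITIES:
        raise ValueError(f"{control_id}: unexpected severity {severity!r}")

print(f"{len(controls)} controls validated")

If the file's root is instead an object that wraps the control list under a key, the json.load line would need to index into that key first.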