
Commit bb3984a

buraizu and joepeeples authored
Docs12187/storage management (#32612)
* [DOCS-12187] Remove storage management guide
* [DOCS-12187] Move storage management images
* [DOCS-12187] Add storage management section
* [DOCS-12187] Add new storage management paths
* [DOCS-12187] Update storage management link
* [DOCS-12187] Add storage management pages
* [DOCS-12187] Remove unnecessary aliases
* [DOCS-12187] Remove preview callout
* [DOCS-12187] Change setup steps to headings
* [DOCS-12187] Convert numbered steps to collapsible sections
* [DOCS-12187] Remove redundant sentence

  Removed redundant introductory text from the setup section.
* Fix link
* Apply suggestions from code review

  Co-authored-by: Joe Peeples <joe.peeples@datadoghq.com>
* Update content/en/infrastructure/storage_management/_index.md

  Co-authored-by: Joe Peeples <joe.peeples@datadoghq.com>
* [DOCS-12187] Re-delete storage monitoring guide

---------

Co-authored-by: Joe Peeples <joe.peeples@datadoghq.com>
1 parent 87b029f commit bb3984a

File tree

16 files changed: +774 −687 lines changed


config/_default/menus/main.en.yaml

Lines changed: 21 additions & 0 deletions

@@ -3692,6 +3692,27 @@ menu:
       parent: network_path
       identifier: network_path_guides
       weight: 504
+    - name: Storage Management
+      url: /infrastructure/storage_management/
+      pre: file-wui
+      identifier: storage_management
+      parent: infrastructure_heading
+      weight: 90000
+    - name: Amazon S3
+      url: /infrastructure/storage_management/amazon_s3/
+      parent: storage_management
+      identifier: storage_management_amazon_s3
+      weight: 1
+    - name: Google Cloud Storage
+      url: /infrastructure/storage_management/google_cloud_storage/
+      parent: storage_management
+      identifier: storage_management_google_cloud_storage
+      weight: 2
+    - name: Azure Blob Storage
+      url: /infrastructure/storage_management/azure_blob_storage/
+      parent: storage_management
+      identifier: storage_management_azure_blob_storage
+      weight: 3
     - name: Cloud Cost
       url: cloud_cost_management/
       identifier: cloud_cost
content/en/infrastructure/storage_management/_index.md

Lines changed: 28 additions & 0 deletions

@@ -0,0 +1,28 @@
---
title: Storage Management
further_reading:
- link: "https://www.datadoghq.com/blog/datadog-storage-monitoring/"
  tag: "Blog"
  text: "Optimize and troubleshoot cloud storage at scale with Storage Monitoring"
- link: "https://www.datadoghq.com/blog/storage-monitoring-recommendations/"
  tag: "Blog"
  text: "Reduce cloud storage costs and improve operational efficiency with Datadog Storage Monitoring"
aliases:
- /integrations/guide/storage-monitoring-setup
---

## Overview

Storage Management for Amazon S3, Google Cloud Storage, and Azure Blob Storage provides deep, prefix-level analytics to help you understand exactly how your storage is being used. With Storage Management you can:
- **Pinpoint where spend is coming from in your bucket**: Break down storage costs to the prefix level so you know which workloads, teams, or environments drive growth.
- **Identify cold data**: Spot buckets with rarely accessed prefixes, and move cold data to lower-cost tiers.
- **Tune retention and lifecycle rules with data**: Read/write and age metrics show when objects were last used, so you can shift unused prefixes to Glacier, Intelligent-Tiering, and other low-cost classes.
- **Monitor data freshness**: Age metrics show how recently each prefix was updated, so you can confirm that backups and other time-sensitive data are landing in their prefixes when they should.

You can access Storage Management in Datadog by navigating to **Infrastructure** > [**Storage Management**][1].

Use the guides below to set up Storage Management in Datadog for your cloud storage service.

{{< partial name="cloud_storage_monitoring/storage-monitoring-setup.html" >}}

[1]: https://app.datadoghq.com/storage-management

content/en/infrastructure/storage_management/amazon_s3.md

Lines changed: 421 additions & 0 deletions
Large diffs are not rendered by default.
content/en/infrastructure/storage_management/azure_blob_storage.md

Lines changed: 89 additions & 0 deletions

@@ -0,0 +1,89 @@
---
title: Storage Management for Microsoft Azure Blob Storage
further_reading:
- link: "https://www.datadoghq.com/blog/datadog-storage-monitoring/"
  tag: "Blog"
  text: "Optimize and troubleshoot cloud storage at scale with Storage Monitoring"
- link: "https://www.datadoghq.com/blog/storage-monitoring-recommendations/"
  tag: "Blog"
  text: "Reduce cloud storage costs and improve operational efficiency with Datadog Storage Monitoring"
---

{{< callout url="https://www.datadoghq.com/product-preview/storage-monitoring/" >}}
Storage Management is in Preview. Request access to start monitoring your object storage.
{{< /callout >}}

## Setup

{{< tabs >}}
{{% tab "Azure CLI" %}}

Enable inventories for the selected storage accounts in each subscription by running the following script in your [Azure Cloud Shell][301]:

```shell
curl https://datadogstoragemonitoring.blob.core.windows.net/scripts/install.sh \
  | bash -s -- <CLIENT_ID> <SUBSCRIPTION_ID> <COMMA_SEPARATED_STORAGE_ACCOUNT_NAMES>
```

Before running the script, set your [shell environment][303] to Bash and replace the placeholder inputs with the correct values:
- `<CLIENT_ID>`: The client ID of an App Registration already set up using the [Datadog Azure integration][302]
- `<SUBSCRIPTION_ID>`: The ID of the Azure subscription containing the storage accounts
- `<COMMA_SEPARATED_STORAGE_ACCOUNT_NAMES>`: A comma-separated list of the storage accounts you want to monitor (for example, `storageaccount1,storageaccount2`)

[301]: https://shell.azure.com
[302]: /integrations/azure/#setup
[303]: https://learn.microsoft.com/en-us/azure/cloud-shell/get-started/classic?tabs=azurecli#select-your-shell-environment

{{% /tab %}}
{{% tab "Azure Portal" %}}

For each storage account you want to monitor, complete the following steps:

### Create a blob inventory policy

1. In the Azure portal, navigate to your storage account.
2. Go to **Data management** > **Blob inventory**.
3. Click **Add**.
4. Configure the policy:
   - Name: `datadog-storage-monitoring`
   - Destination container: Click **Create new**, and enter the name `datadog-storage-monitoring`.
   - Object type to inventory: **Blob**
   - Schedule: **Daily**
   - Blob types: Select **Block blobs**, **Append blobs**, and **Page blobs**.
   - Subtypes: Select **Include blob versions**.
   - Schema fields: Select **All**, or ensure that at least the following are selected:
     - **Name**
     - **Access tier**
     - **Last modified**
     - **Content length**
     - **Server encrypted**
     - **Current version status**
     - **Version ID**
   - Exclude prefix: `datadog-storage-monitoring`
5. Click **Add**.

### Add the role assignment

1. In the Azure portal, navigate to your storage account.
2. Go to **Data storage** > **Containers**.
3. Click the **datadog-storage-monitoring** container.
4. Click **Access control (IAM)** in the left-hand menu.
5. Click **Add** > **Add role assignment**.
6. Fill out the role assignment:
   - Role: Select **Storage Blob Data Reader**, then click **Next**.
   - Assign access to: **User, group, or service principal**.
   - Members: Click **+ Select members**, search for your App Registration by name, and select it.
     - **Note**: This should be an App Registration set up in the Datadog Azure integration. Note its client ID for later.
7. Click **Review + assign**.

{{% /tab %}}
{{< /tabs >}}

### Post-Installation

After you finish the above steps, fill out the [post-setup form][1].

## Further reading

{{< partial name="whats-next/whats-next.html" >}}

[1]: https://forms.gle/WXFbGyBwWfEo3gbM7
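The Azure Portal role-assignment steps above can also be scripted with the Azure CLI. The following is an unofficial sketch, not part of the Datadog installer: the subscription, resource group, storage account, and App Registration object ID are hypothetical placeholders, and the script only prints the `az` commands it would run (remove the `echo`s to apply them).

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical placeholder values -- replace with your own.
SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP="my-resource-group"
STORAGE_ACCOUNT="storageaccount1"
APP_OBJECT_ID="11111111-1111-1111-1111-111111111111"  # Object ID of the App Registration's service principal
CONTAINER="datadog-storage-monitoring"

# Scope the Storage Blob Data Reader role to the inventory destination container only.
SCOPE="/subscriptions/${SUBSCRIPTION_ID}/resourceGroups/${RESOURCE_GROUP}/providers/Microsoft.Storage/storageAccounts/${STORAGE_ACCOUNT}/blobServices/default/containers/${CONTAINER}"

CREATE_CONTAINER_CMD="az storage container create --account-name ${STORAGE_ACCOUNT} --name ${CONTAINER} --auth-mode login"
ASSIGN_ROLE_CMD="az role assignment create --assignee ${APP_OBJECT_ID} --role 'Storage Blob Data Reader' --scope ${SCOPE}"

# Dry run: print the commands instead of executing them.
echo "${CREATE_CONTAINER_CMD}"
echo "${ASSIGN_ROLE_CMD}"
```

Scoping the assignment to the container (rather than the whole storage account) keeps the grant to the minimum the inventory export needs.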
content/en/infrastructure/storage_management/google_cloud_storage.md

Lines changed: 210 additions & 0 deletions

@@ -0,0 +1,210 @@
---
title: Storage Management for Google Cloud Storage
further_reading:
- link: "https://www.datadoghq.com/blog/datadog-storage-monitoring/"
  tag: "Blog"
  text: "Optimize and troubleshoot cloud storage at scale with Storage Monitoring"
- link: "https://www.datadoghq.com/blog/storage-monitoring-recommendations/"
  tag: "Blog"
  text: "Reduce cloud storage costs and improve operational efficiency with Datadog Storage Monitoring"
---

{{< callout url="https://www.datadoghq.com/product-preview/storage-monitoring/" >}}
Storage Management is in Preview. Request access to start monitoring your object storage.
{{< /callout >}}

## Setup

### Step 1: Install the Google Cloud integration and enable resource collection

To collect Google Cloud Storage metrics from your Google Cloud project, install the Google Cloud integration in Datadog. Enable Resource Collection for the project containing the buckets you want to monitor. Resource Collection allows Datadog to associate your buckets' labels with the metrics collected through Storage Management.

**Note**: While you can disable specific metric namespaces, keep the Cloud Storage namespace (`gcp.storage`) enabled.

### Step 2: Enable the Storage Insights API

Enable the [Storage Insights][2] API in your Google Cloud project.

### Step 3: Grant service agent permissions

After you enable the Storage Insights API, a project-level service agent is created automatically with the following format: `service-PROJECT_NUMBER@gcp-sa-storageinsights.iam.gserviceaccount.com`

The service agent requires these IAM roles:

1. `roles/storage.insightsCollectorService` on the source bucket (includes the `storage.buckets.getObjectInsights` and `storage.buckets.get` permissions)
2. `roles/storage.objectCreator` on the destination bucket (includes the `storage.objects.create` permission)
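The two grants above can be sketched with the gcloud CLI. In this unofficial sketch, the project number and bucket names are hypothetical placeholders, and the script only prints the commands it would run (remove the `echo`s to apply them); it assumes your gcloud version provides `gcloud storage buckets add-iam-policy-binding`.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical placeholder values -- replace with your own.
PROJECT_NUMBER="123456789012"
SOURCE_BUCKET="my-monitored-bucket"
DEST_BUCKET="my-inventory-reports-bucket"

# Project-level service agent created when the Storage Insights API is enabled.
SERVICE_AGENT="service-${PROJECT_NUMBER}@gcp-sa-storageinsights.iam.gserviceaccount.com"

# Role 1: insightsCollectorService on the source bucket.
GRANT_SOURCE="gcloud storage buckets add-iam-policy-binding gs://${SOURCE_BUCKET} --member=serviceAccount:${SERVICE_AGENT} --role=roles/storage.insightsCollectorService"

# Role 2: objectCreator on the destination bucket.
GRANT_DEST="gcloud storage buckets add-iam-policy-binding gs://${DEST_BUCKET} --member=serviceAccount:${SERVICE_AGENT} --role=roles/storage.objectCreator"

# Dry run: print the commands instead of executing them.
echo "${GRANT_SOURCE}"
echo "${GRANT_DEST}"
```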
### Step 4: Create an inventory report configuration

You can create an inventory report configuration in multiple ways. The quickest methods use the Google Cloud CLI or Terraform templates. Regardless of the method, ensure the configuration:

1. Includes these metadata fields: `"bucket", "name", "project", "size", "updated", "storageClass"`
2. Generates CSV reports with `'\n'` as the record separator and `','` as the delimiter
3. Uses this destination path format: `<BUCKET>/{{date}}`, where `<BUCKET>` is the name of the monitored bucket

{{< tabs >}}
{{% tab "Google Cloud CLI" %}}

Use the [Google Cloud CLI][301] to run the following command:

```shell
gcloud storage insights inventory-reports create <SOURCE_BUCKET_URL> \
  --no-csv-header \
  --display-name=datadog-storage-monitoring \
  --destination=gs://<DESTINATION_BUCKET>/<SOURCE_BUCKET>/{{date}} \
  --metadata-fields=project,bucket,name,size,updated,storageClass \
  --schedule-starts=<YYYY-MM-DD> \
  --schedule-repeats=<DAILY|WEEKLY> \
  --schedule-repeats-until=<YYYY-MM-DD>
```

[301]: https://cloud.google.com/storage/docs/insights/using-inventory-reports#create-config-cli

{{% /tab %}}
{{% tab "Terraform" %}}

Copy the following Terraform template, substitute the necessary arguments, and apply it in the Google Cloud project that contains your bucket.

<!-- vale off -->
{{% collapse-content title="Terraform configuration for inventory reports" level="h4" expanded=true %}}

```hcl
locals {
  source_bucket      = "" # The name of the bucket you want to monitor
  destination_bucket = "" # The bucket where inventory reports are written
  frequency          = "" # Possible values: Daily, Weekly (report generation frequency)
  location           = "" # The location of your source and destination buckets
}

data "google_project" "project" {
}

resource "google_storage_insights_report_config" "config" {
  display_name = "datadog-storage-monitoring"
  location     = local.location
  frequency_options {
    frequency = local.frequency
    start_date {
      day   = "" # Fill in the day
      month = "" # Fill in the month
      year  = "" # Fill in the year
    }
    end_date {
      day   = "" # Fill in the day
      month = "" # Fill in the month
      year  = "" # Fill in the year
    }
  }
  csv_options {
    record_separator = "\n"
    delimiter        = ","
    header_required  = false
  }
  object_metadata_report_options {
    metadata_fields = ["bucket", "name", "project", "size", "updated", "storageClass"]
    storage_filters {
      bucket = local.source_bucket
    }
    storage_destination_options {
      bucket           = google_storage_bucket.report_bucket.name
      destination_path = "${local.source_bucket}/{{date}}"
    }
  }

  depends_on = [
    google_storage_bucket_iam_member.admin
  ]
}

resource "google_storage_bucket" "report_bucket" {
  name                        = local.destination_bucket
  location                    = local.location
  force_destroy               = true
  uniform_bucket_level_access = true
}

resource "google_storage_bucket_iam_member" "admin" {
  bucket = google_storage_bucket.report_bucket.name
  role   = "roles/storage.admin"
  member = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-storageinsights.iam.gserviceaccount.com"
}
```

{{% /collapse-content %}}
<!-- vale on -->

{{% /tab %}}
{{% tab "Allow Datadog to create the configuration on your behalf" %}}

You can allow Datadog to handle the inventory report configuration by granting the proper permissions to your service account:

1. Navigate to **IAM & Admin** > **Service Accounts**.
2. Find your Datadog service account and add the `roles/storageinsights.Admin` role.
3. Navigate to the source bucket you want to monitor and grant these roles:
   - `roles/storage.insightsCollectorService`
   - `roles/storage.objectViewer`
4. Navigate to the destination bucket and grant these roles:
   - `roles/storage.objectCreator`
   - `roles/storage.insightsCollectorService`

Alternatively, you can create a custom role specifically for Datadog with these required permissions:

```text
storage.buckets.get
storage.objects.list
storage.buckets.getObjectInsights
storage.objects.create
storageinsights.reportConfigs.get
storageinsights.reportConfigs.create
storageinsights.reportConfigs.list
storageinsights.reportConfigs.update
storage.objects.get
storageinsights.reportDetails.get
storageinsights.reportDetails.list
```

After granting the necessary permissions, Datadog can create the inventory report configuration with your setup details.
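The custom-role alternative can be sketched with `gcloud iam roles create`. In this unofficial sketch, the project ID and role ID are hypothetical placeholders, and the script only prints the command it would run (remove the `echo` to apply it).

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical placeholder values -- replace with your own.
PROJECT_ID="my-gcp-project"
ROLE_ID="datadogStorageMonitoring"

# Permissions from the list above, comma-separated for the --permissions flag.
PERMISSIONS="storage.buckets.get,storage.objects.list,storage.buckets.getObjectInsights,storage.objects.create,storageinsights.reportConfigs.get,storageinsights.reportConfigs.create,storageinsights.reportConfigs.list,storageinsights.reportConfigs.update,storage.objects.get,storageinsights.reportDetails.get,storageinsights.reportDetails.list"

CREATE_ROLE_CMD="gcloud iam roles create ${ROLE_ID} --project=${PROJECT_ID} --title='Datadog Storage Monitoring' --permissions=${PERMISSIONS}"

# Dry run: print the command instead of executing it.
echo "${CREATE_ROLE_CMD}"
```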
168+
169+
{{% /tab %}}
170+
{{< /tabs >}}
171+
172+
### Step 5: Add the Storage Object Viewer role to your Datadog service account
173+
174+
Grant Datadog permission to access and extract the generated inventory reports from Google. This permission should be on the destination bucket where the inventory reports are stored.
175+
176+
1. Select the destination bucket for your inventory reports
177+
2. In the bucket details page, click the Permissions tab
178+
3. Under Permissions, click Grant Access to add a new principal
179+
4. Principal: Enter the Datadog Service Account email
180+
5. Role: Select Storage Object Viewer (`roles/storage.objectViewer`)
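This grant can also be sketched on the command line. The bucket and service account names below are hypothetical placeholders, and the script only prints the command it would run (remove the `echo` to apply it).

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical placeholder values -- replace with your own.
DEST_BUCKET="my-inventory-reports-bucket"
DATADOG_SA="datadog-integration@my-gcp-project.iam.gserviceaccount.com"

# Grant Storage Object Viewer on the destination bucket to the Datadog service account.
GRANT_VIEWER="gcloud storage buckets add-iam-policy-binding gs://${DEST_BUCKET} --member=serviceAccount:${DATADOG_SA} --role=roles/storage.objectViewer"

# Dry run: print the command instead of executing it.
echo "${GRANT_VIEWER}"
```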
181+
182+
### Post-setup steps
183+
184+
After completing the setup steps, fill out the [post-setup][3] form with the following required information:
185+
1. Name of the destination bucket holding the inventory files
186+
2. Name of the service account with the granted permissions
187+
3. Prefix where the files are stored in the destination bucket (if any)
188+
4. Name of the source bucket you want to monitor (the bucket producing inventory files)
189+
5. Google Cloud location of the destination bucket holding the inventory files
190+
6. Google Cloud ProjectID containing the buckets
191+
7. Datadog org name
192+
193+
### Validation
194+
195+
To verify your setup:
196+
1. Wait for the first inventory report to generate (up to 24 hours for daily reports or 7 days for weekly reports).
197+
2. Check the destination bucket for inventory files.
198+
3. Confirm the Datadog integration can access the files.
199+
4. Navigate to **Infrastructure** > **Storage Management** > **Installation Recommendations** to see if your configured bucket appears in the list.
200+
201+
### Troubleshooting
202+
203+
If you encounter any issues or need assistance:
204+
- Use only one destination bucket for all inventory files per Google Cloud project.
205+
- Verify all permissions are correctly configured.
206+
- If issues persist, [contact Datadog][1] with your bucket details, Google Cloud Project ID, and Datadog org name.
207+
208+
[1]: mailto:storage-monitoring@datadoghq.com
209+
[2]: https://cloud.google.com/storage/docs/insights/using-inventory-reports#enable_the_api
210+
[3]: https://forms.gle/c7b8JiLENDaUEqGk8

content/en/integrations/guide/azure-native-integration.md

Lines changed: 2 additions & 2 deletions

@@ -54,7 +54,7 @@ Some features cannot be managed through the Datadog resource in Azure. These include:
 - Metric filtering at the **resource** level
 - [Cloud Cost Management][8] (CCM)
 - [Log Archiving][9]
-- [Storage Monitoring][7]
+- [Storage Management][7]

## Setup

@@ -434,7 +434,7 @@
 [3]: https://docs.microsoft.com/cli/azure/datadog?view=azure-cli-latest
 [5]: https://app.datadoghq.com/dash/integration/71/azure-overview
 [6]: https://app.datadoghq.com/monitors/templates?q=azure
-[7]: /integrations/guide/storage-monitoring-setup/#setup-for-azure-blob-storage
+[7]: /infrastructure/storage_management/azure_blob_storage
 [8]: /cloud_cost_management/setup/azure/
 [9]: /logs/guide/azure-automated-log-forwarding/#log-archiving
 [10]: https://docs.microsoft.com/azure/azure-resource-manager/management/control-plane-and-data-plane
