diff --git a/tools/content_manager/README.md b/tools/content_manager/README.md index 6cb1ae5..48d2980 100644 --- a/tools/content_manager/README.md +++ b/tools/content_manager/README.md @@ -4,9 +4,9 @@ Content Manager is a command-line tool that can be used to manage content in [Google SecOps](https://cloud.google.com/security/products/security-operations) -such as rules, data, tables, reference lists, and rule exclusions. Content -Manager can be utilized in a CI/CD pipeline to implement Detection-as-Code with -Google SecOps or ran locally using +such as rules, data tables, reference lists, rule exclusions, and saved +searches. Content Manager can be utilized in a CI/CD pipeline to implement +Detection-as-Code with Google SecOps or run locally using [Application Default Credentials (ADC)](https://cloud.google.com/docs/authentication/application-default-credentials) for authentication. If you're new to the concept of managing detection rules and other content using @@ -31,6 +31,7 @@ in a CI/CD pipeline (in GitHub, GitLab, CircleCI, etc) to do the following: * Retrieve the latest version of all reference lists from Google SecOps and write them to local files along with their current state/configuration * Create or update reference lists in Google SecOps based on local files * Manage [rule exclusions](https://cloud.google.com/chronicle/docs/detection/rule-exclusions) in Google SecOps based on a local config file +* Manage [saved searches](https://docs.cloud.google.com/chronicle/docs/investigation/udm-search#search-manager) in Google SecOps based on a local config file Sample detection rules can be found in the [Google SecOps Detection Rules](https://github.com/chronicle/detection-rules/tree/main) repo. @@ -136,6 +137,11 @@ chronicle.findingsRefinements.create chronicle.findingsRefinements.get chronicle.findingsRefinements.list chronicle.findingsRefinements.update +# Permissions required to manage saved searches +chronicle.searchQueries.get +chronicle.searchQueries.list +chronicle.searchQueries.create +chronicle.searchQueries.update ``` If you're unable to configure your CI/CD pipeline to authenticate using Workload @@ -210,6 +216,7 @@ Commands: reference-lists Manage reference lists. rule-exclusions Manage rule exclusions. rules Manage rules. + saved-searches Manage saved searches. ``` A logical first step after reading the contents of this readme file and @@ -657,6 +664,98 @@ Example output from update remote rule exclusions command. 01-May-25 12:15:36 MDT | INFO | dump_rule_exclusion_config | Writing rule exclusion config to /Users/x/Documents/projects/detection-rules/tools/content_manager/rule_exclusions_config.yaml +## Managing saved searches in Google SecOps + +### Retrieve saved searches from Google SecOps + +The `saved-searches get` command retrieves the latest version of all saved +searches from Google SecOps and writes them to a `saved_search_config.yaml` +file. + +The saved search content, configuration, and metadata are written to the +`saved_search_config.yaml` file.
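+
+For reference, each entry written to `saved_search_config.yaml` by the
+`saved-searches get` command has a shape like the following. This is a
+hypothetical entry with illustrative values, shown only to sketch the
+expected fields:
+
+```
+Example Saved Search:
+  resource_name: projects/1234567891234/locations/us/instances/<instance-uuid>/users/me/searchQueries/<query-uuid>
+  query_id: <query-id>
+  user_id: analyst@example.com
+  create_time: '2025-11-07T16:17:40.197090Z'
+  update_time: '2025-11-07T16:17:40.197090Z'
+  description: Example saved search description
+  query: |-
+    metadata.event_type = "USER_LOGIN"
+  sharing_mode:
+  query_type:
+  placeholder_names:
+  placeholder_descriptions:
+```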
+ +Example output from `saved-searches get` command: + +``` +(venv) $ python -m content_manager saved-searches get +10-Nov-25 14:11:37 MST | INFO | | Content Manager started +10-Nov-25 14:11:37 MST | INFO | get_saved_searches | Attempting to pull latest version of all saved searches from Google SecOps and update the local config file +10-Nov-25 14:11:38 MST | INFO | get_remote_saved_searches | Attempting to retrieve all saved searches from Google SecOps +10-Nov-25 14:11:38 MST | INFO | get_remote_saved_searches | Retrieved 11 saved searches +10-Nov-25 14:11:38 MST | INFO | get_remote_saved_searches | Retrieved a total of 11 saved searches +10-Nov-25 14:11:38 MST | INFO | dump_saved_search_config | Writing saved search config to /Users/x/Documents/projects/detection-rules/tools/content_manager/saved_search_config.yaml +``` + +### Update saved searches in Google SecOps + +The `saved-searches update` command updates saved searches in Google +SecOps based on the local config file (`saved_search_config.yaml`). + +Saved search updates include: + +* Create a new saved search +* Update the display name (title) for a saved search +* Update the query for a saved search +* Update the description for a saved search +* Update the sharing settings for a saved search +* Update placeholder variable names and placeholder variable descriptions for a +saved search + +Please refer to the example saved searches in the `saved_search_config.yaml` +file to understand the expected format for these entries. + +To create a new saved search, add a new entry to the +`saved_search_config.yaml` file and execute the `saved-searches update` +command. Please see the example below. + +``` +Top 10 Suricata Rules: + description: Statistical Search Workshop + query: |- + metadata.vendor_name = "Suricata" nocase + $rule_name = security_result.rule_name + match: + $rule_name + outcome: + $event_count = count_distinct(metadata.id) + order: + $event_count desc + limit: + 10 + sharing_mode: MODE_SHARED_WITH_CUSTOMER +``` + +Existing saved searches can be updated by modifying the +`saved_search_config.yaml` file and executing the `saved-searches update` +command. + +Example output from `saved-searches update` command:
+ +``` +(venv) $ python -m content_manager saved-searches update +10-Nov-25 14:25:46 MST | INFO | | Content Manager started +10-Nov-25 14:25:46 MST | INFO | update_saved_searches | Attempting to update saved searches in Google SecOps based on the local config file +10-Nov-25 14:25:46 MST | INFO | update_remote_saved_searches | Attempting to update saved searches in Google SecOps based on local config file /Users/x/Documents/projects/detection-rules/tools/content_manager/saved_search_config.yaml +10-Nov-25 14:25:46 MST | INFO | load_saved_search_config | Loading saved search config from /Users/x/Documents/projects/detection-rules/tools/content_manager/saved_search_config.yaml +10-Nov-25 14:25:46 MST | INFO | load_saved_search_config | Loaded 12 saved search config entries from file /Users/x/Documents/projects/detection-rules/tools/content_manager/saved_search_config.yaml +10-Nov-25 14:25:46 MST | INFO | update_remote_saved_searches | Attempting to retrieve latest version of all saved searches from Google SecOps +10-Nov-25 14:25:46 MST | INFO | get_remote_saved_searches | Attempting to retrieve all saved searches from Google SecOps +10-Nov-25 14:25:47 MST | INFO | get_remote_saved_searches | Retrieved 12 saved searches +10-Nov-25 14:25:47 MST | INFO | get_remote_saved_searches | Retrieved a total of 12 saved searches +10-Nov-25 14:25:47 MST | INFO | update_remote_saved_searches | Checking if any saved search updates are required +10-Nov-25 14:25:47 MST | INFO | update_remote_saved_searches | Saved search Top 10 Suricata Rules - Description for local and remote saved search is different. Remote saved search will be updated +10-Nov-25 14:25:47 MST | INFO | update_remote_saved_searches | Saved search Top 10 Suricata Rules - Updating remote saved search +10-Nov-25 14:25:48 MST | INFO | update | Logging summary of saved search changes... +10-Nov-25 14:25:48 MST | INFO | update | Saved searches created: 0 +10-Nov-25 14:25:48 MST | INFO | update | Saved searches updated: 1 +10-Nov-25 14:25:48 MST | INFO | update | updated saved search ('Top 10 Suricata Rules', 'projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-12cff0c0c29c') +10-Nov-25 14:25:48 MST | INFO | get_remote_saved_searches | Attempting to retrieve all saved searches from Google SecOps +10-Nov-25 14:25:49 MST | INFO | get_remote_saved_searches | Retrieved 12 saved searches +10-Nov-25 14:25:49 MST | INFO | get_remote_saved_searches | Retrieved a total of 12 saved searches +10-Nov-25 14:25:49 MST | INFO | dump_saved_search_config | Writing saved search config to /Users/x/Documents/projects/detection-rules/tools/content_manager/saved_search_config.yaml +``` + ## Need help? Please open an issue in this repo or reach out in the Google Cloud Security [community](https://secopscommunity.com). 
diff --git a/tools/content_manager/content_manager/__main__.py b/tools/content_manager/content_manager/__main__.py index 29a50cc..5c8c7c3 100644 --- a/tools/content_manager/content_manager/__main__.py +++ b/tools/content_manager/content_manager/__main__.py @@ -30,6 +30,7 @@ from content_manager.reference_lists import ReferenceLists from content_manager.rule_exclusions import RuleExclusions from content_manager.rules import Rules +from content_manager.saved_searches import SavedSearches import dotenv import google.auth.transport.requests from google_secops_api import auth @@ -48,6 +49,7 @@ DATA_TABLES_DIR = ROOT_DIR / "data_tables" DATA_TABLE_CONFIG_FILE = ROOT_DIR / "data_table_config.yaml" RULE_EXCLUSIONS_CONFIG_FILE = ROOT_DIR / "rule_exclusions_config.yaml" +SAVED_SEARCH_CONFIG_FILE = ROOT_DIR / "saved_search_config.yaml" dotenv.load_dotenv() @@ -454,6 +456,48 @@ def update(cls): RuleExclusionOperations.get() +class SavedSearchOperations: + """Manage saved searches in Google SecOps.""" + + @classmethod + def get(cls): + """Retrieves the latest version of saved searches from Google SecOps and updates the local config file.""" + http_session = initialize_http_session() + + remote_saved_searches = SavedSearches.get_remote_saved_searches( + http_session=http_session + ) + + if not remote_saved_searches.saved_searches: + LOGGER.info("No saved searches retrieved") + return + + remote_saved_searches.dump_saved_search_config() + + @classmethod + def update(cls): + """Update saved searches in Google SecOps based on local config file.""" + http_session = initialize_http_session() + + saved_search_updates = SavedSearches.update_remote_saved_searches( + http_session=http_session + ) + + if not saved_search_updates: + return + + # Log summary of saved search updates that occurred. + LOGGER.info("Logging summary of saved search changes...") + for update_type, saved_search_names in saved_search_updates.items(): + LOGGER.info("Saved searches %s: %s", update_type, len(saved_search_names)) + for saved_search_name in saved_search_names: + LOGGER.info("%s saved search %s", update_type, saved_search_name) + + # Retrieve the latest version of all saved searches after any changes + # were made. + SavedSearchOperations.get() + + @click.group() def cli(): """Content Manager - Manage content in Google SecOps such as rules, data tables, reference lists, and exclusions.""" @@ -727,6 +771,39 @@ def update(): RuleExclusionOperations.update() +@click.group() +def saved_searches(): + """Manage saved searches.""" + + +@saved_searches.command( + "get", + short_help="""Retrieve the latest version of all saved searches from Google SecOps and updates the local config file.""", +) +def get_saved_searches(): + """Retrieve the latest version of all saved searches from Google SecOps and update the local config file.""" + LOGGER.info( + "Attempting to pull latest version of all saved searches from Google " + "SecOps and update the local config file" + ) + SavedSearchOperations.get() + + +@saved_searches.command( + "update", + short_help=( + "Update saved searches in Google SecOps based on the local config file." 
+ ), +) +def update_saved_searches(): + """Update saved searches in Google SecOps based on the local config file.""" + LOGGER.info( + "Attempting to update saved searches in Google SecOps based on the local" + " config file" + ) + SavedSearchOperations.update() + + if __name__ == "__main__": LOGGER.info("Content Manager started") @@ -740,10 +817,12 @@ def update(): REF_LIST_CONFIG_FILE.touch(exist_ok=True) DATA_TABLE_CONFIG_FILE.touch(exist_ok=True) RULE_EXCLUSIONS_CONFIG_FILE.touch(exist_ok=True) + SAVED_SEARCH_CONFIG_FILE.touch(exist_ok=True) cli.add_command(rules) cli.add_command(data_tables) cli.add_command(reference_lists) cli.add_command(rule_exclusions) + cli.add_command(saved_searches) cli() diff --git a/tools/content_manager/content_manager/common/custom_exceptions.py b/tools/content_manager/content_manager/common/custom_exceptions.py index 6057465..9ff57b5 100644 --- a/tools/content_manager/content_manager/common/custom_exceptions.py +++ b/tools/content_manager/content_manager/common/custom_exceptions.py @@ -45,3 +45,7 @@ class ReferenceListConfigError(Exception): class RuleExclusionConfigError(Exception): """Raised when an issue with the rule exclusion config file is found.""" + + +class SavedSearchConfigError(Exception): + """Raised when an issue with the saved search config file is found.""" diff --git a/tools/content_manager/content_manager/saved_searches.py b/tools/content_manager/content_manager/saved_searches.py new file mode 100644 index 0000000..201b19e --- /dev/null +++ b/tools/content_manager/content_manager/saved_searches.py @@ -0,0 +1,502 @@ +# Copyright 2025 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# +"""Manage saved searches in Google SecOps.""" + +import json +import logging +import pathlib +from typing import Any, Literal + +from content_manager.common.custom_exceptions import SavedSearchConfigError +from google.auth.transport import requests +from google_secops_api.saved_searches.create_saved_search import create_saved_search +from google_secops_api.saved_searches.list_saved_searches import list_saved_searches +from google_secops_api.saved_searches.update_saved_search import update_saved_search +import pydantic +import ruamel.yaml +import ruamel.yaml.scalarstring.LiteralScalarString + + +LOGGER = logging.getLogger() + +ROOT_DIR = pathlib.Path(__file__).parent.parent +SAVED_SEARCH_CONFIG_FILE = ROOT_DIR / "saved_search_config.yaml" +SHARING_MODES = Literal["MODE_SHARED_WITH_CUSTOMER"] # pylint: disable="invalid-name" + +# Use ruamel.yaml to raise an exception if a YAML file contains duplicate keys +ruamel_yaml = ruamel.yaml.YAML() +ruamel_yaml.default_flow_style = False + + +class SavedSearch(pydantic.BaseModel): + """Class for a saved search.""" + + name: str + resource_name: str | None + query_id: str | None + user_id: str | None + create_time: str | None + update_time: str | None + description: str | None + query: str + sharing_mode: SHARING_MODES | None + query_type: str | None + placeholder_names: list[str] | None + placeholder_descriptions: list[str] | None + + +class SavedSearchConfigEntry(pydantic.BaseModel): + """Class for a saved search config file entry.""" + + name: str + resource_name: str | None + query_id: str | None + user_id: str | None + create_time: str | None + update_time: str | None + description: str | None + query: str + sharing_mode: SHARING_MODES | None + query_type: str | None + placeholder_names: list[str] | None + placeholder_descriptions: list[str] | None + + +class SavedSearches: + """Class used to manage saved searches in Google SecOps.""" + + def __init__(self, saved_searches: list[SavedSearch]): + self.saved_searches: list[SavedSearch] = saved_searches + + @classmethod + def parse_saved_search(cls, saved_search: dict[str, Any]) -> SavedSearch: + """Parse a saved search into a SavedSearch object.""" + try: + parsed_saved_search = SavedSearch( + name=saved_search["displayName"], + resource_name=saved_search.get("name"), + query_id=saved_search.get("queryId"), + user_id=saved_search.get("userId"), + create_time=saved_search["metadata"]["createTime"], + update_time=saved_search["metadata"]["updateTime"], + description=saved_search.get("description"), + query=saved_search["query"], + sharing_mode=saved_search["metadata"].get("sharingMode"), + query_type=saved_search.get("queryType"), + placeholder_names=saved_search.get("placeholderNames"), + placeholder_descriptions=saved_search.get("placeholderDescriptions"), + ) + except pydantic.ValidationError as e: + LOGGER.error( + "ValidationError occurred for saved search %s\n%s", + saved_search, + json.dumps(e.errors(), indent=4), + ) + raise + + return parsed_saved_search + + @classmethod + def parse_saved_searches( + cls, saved_searches: list[dict[str, Any]] + ) -> list[SavedSearch]: + """Parse a list of saved searches into a list of SavedSearch objects.""" + parsed_saved_searches = [] + + for saved_search in saved_searches: + parsed_saved_searches.append( + SavedSearches.parse_saved_search(saved_search) + ) + + return parsed_saved_searches + + @classmethod + def load_saved_search_config( + cls, saved_search_config_file: pathlib.Path = SAVED_SEARCH_CONFIG_FILE + ) -> "SavedSearches": + """Load saved search 
config from file.""" + LOGGER.info( + "Loading saved search config from %s", + saved_search_config_file, + ) + with open(saved_search_config_file, "r", encoding="utf-8") as f: + saved_search_config = ruamel_yaml.load(f) + + if not saved_search_config: + LOGGER.info("Saved search config file is empty.") + return SavedSearches(saved_searches=[]) + + SavedSearches.check_saved_search_config(saved_search_config) + + saved_searches_parsed = [] + + for ( + saved_search_name, + saved_search_config_entry, + ) in saved_search_config.items(): + try: + saved_searches_parsed.append( + SavedSearch( + name=saved_search_name, + resource_name=saved_search_config_entry.get("resource_name"), + query_id=saved_search_config_entry.get("query_id"), + user_id=saved_search_config_entry.get("user_id"), + create_time=saved_search_config_entry.get("create_time"), + update_time=saved_search_config_entry.get("update_time"), + description=saved_search_config_entry.get("description"), + query=saved_search_config_entry["query"], + sharing_mode=saved_search_config_entry.get("sharing_mode"), + query_type=saved_search_config_entry.get("query_type"), + placeholder_names=saved_search_config_entry.get( + "placeholder_names" + ), + placeholder_descriptions=saved_search_config_entry.get( + "placeholder_descriptions" + ), + ) + ) + except pydantic.ValidationError as e: + LOGGER.error( + "ValidationError occurred for saved search config entry %s\n%s", + saved_search_name, + json.dumps(e.errors(), indent=4), + ) + raise + + LOGGER.info( + "Loaded %s saved search config entries from file %s", + len(saved_searches_parsed), + saved_search_config_file, + ) + + return SavedSearches(saved_searches=saved_searches_parsed) + + @classmethod + def check_saved_search_config(cls, config: dict[str, Any]): + """Check saved search config file for invalid keys.""" + required_keys = ["query"] + allowed_keys = [ + "create_time", + "description", + "placeholder_names", + "placeholder_descriptions", + "query", + "query_id", + "query_type", + "resource_name", + "sharing_mode", + "update_time", + "user_id", + ] + invalid_keys = [] + + for saved_search_name, saved_search_config in config.items(): + for key in list(saved_search_config.keys()): + if key not in allowed_keys: + invalid_keys.append(key) + + if invalid_keys: + raise SavedSearchConfigError( + f"Invalid keys ({invalid_keys}) found for saved search -" + f" {saved_search_name}" + ) + + for key in required_keys: + if key not in list(saved_search_config.keys()): + raise SavedSearchConfigError( + f"Required key ({key}) not found for saved search -" + f" {saved_search_name}" + ) + + def dump_saved_search_config(self): + """Dump the configuration and metadata for a collection of saved searches.""" + saved_search_config = {} + + for saved_search in self.saved_searches: + try: + saved_search_config_entry = SavedSearchConfigEntry( + name=saved_search.name, + resource_name=saved_search.resource_name, + query_id=saved_search.query_id, + user_id=saved_search.user_id, + create_time=saved_search.create_time, + update_time=saved_search.update_time, + description=saved_search.description, + query=saved_search.query, + sharing_mode=saved_search.sharing_mode, + query_type=saved_search.query_type, + placeholder_names=saved_search.placeholder_names, + placeholder_descriptions=saved_search.placeholder_descriptions, + ) + except pydantic.ValidationError as e: + LOGGER.error( + "ValidationError occurred for saved search config entry %s\n%s", + saved_search, + json.dumps(e.errors(), indent=4), + ) + raise + + 
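+      # Key each config entry by the saved search's display name; the name
+      # itself is excluded from the dumped field values below.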
saved_search_config[saved_search.name] = ( + saved_search_config_entry.model_dump(exclude={"name"}) + ) + + # Use ruamel.yaml.scalarstring import LiteralScalarString on the Saved + # Search query field to force the YAML dumper to use the "literal block + # style" (denoted by the | character) when writing multi-line strings to + # a file. + if "\n" in saved_search_config[saved_search.name]["query"]: + saved_search_config[saved_search.name]["query"] = ( + ruamel.yaml.scalarstring.LiteralScalarString(saved_search.query) + ) + + LOGGER.info("Writing saved search config to %s", SAVED_SEARCH_CONFIG_FILE) + with open( + SAVED_SEARCH_CONFIG_FILE, "w", encoding="utf-8" + ) as saved_search_config_file: + ruamel_yaml.dump( + saved_search_config, + saved_search_config_file, + ) + + @classmethod + def get_remote_saved_searches( + cls, http_session: requests.AuthorizedSession + ) -> "SavedSearches": + """Retrieve the latest version of all saved searches from Google SecOps.""" + raw_saved_searches = [] + next_page_token = None + + LOGGER.info("Attempting to retrieve all saved searches from Google SecOps") + while True: + ( + retrieved_saved_searches, + next_page_token, + ) = list_saved_searches( + http_session=http_session, + page_size=None, + page_token=next_page_token, + ) + + if retrieved_saved_searches is not None: + LOGGER.info( + "Retrieved %s saved searches", + len(retrieved_saved_searches), + ) + raw_saved_searches.extend(retrieved_saved_searches) + + if next_page_token: + LOGGER.info( + "Attempting to retrieve saved searches with page token %s", + next_page_token, + ) + else: + # Break if there are no more pages of saved searches to retrieve + break + + raw_saved_searches_count = len(raw_saved_searches) + + LOGGER.info( + "Retrieved a total of %s saved searches", raw_saved_searches_count + ) + + if not raw_saved_searches: + return SavedSearches(saved_searches=[]) + + parsed_saved_searches = SavedSearches.parse_saved_searches( + saved_searches=raw_saved_searches + ) + + return SavedSearches(saved_searches=parsed_saved_searches) + + @classmethod + def update_remote_saved_searches( + cls, + http_session: requests.AuthorizedSession, + saved_searches_config_file: pathlib.Path = SAVED_SEARCH_CONFIG_FILE, + ) -> dict[str, list[tuple[str, str]]] | None: + """Update saved searches in Google SecOps based on a local config file.""" + LOGGER.info( + "Attempting to update saved searches in Google SecOps based on local" + " config file %s", + saved_searches_config_file, + ) + local_saved_searches = SavedSearches.load_saved_search_config() + + if not local_saved_searches.saved_searches: + return None + + LOGGER.info( + "Attempting to retrieve latest version of all saved searches from" + " Google SecOps" + ) + remote_saved_searches = SavedSearches.get_remote_saved_searches( + http_session=http_session + ) + + # Create a dictionary containing the remote saved searches using the saved + # search's Google Cloud resource name as the key for each item. + remote_saved_searches_dict = {} + + if remote_saved_searches.saved_searches: + for remote_saved_search in remote_saved_searches.saved_searches: + remote_saved_searches_dict[remote_saved_search.resource_name] = ( + remote_saved_search + ) + + # Keep track of saved search updates to log a final summary of changes + # made. 
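+    # Each update type ("created", "updated") maps to a list of
+    # (display name, resource name) tuples.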
+ update_summary = { + "created": [], + "updated": [], + } + + LOGGER.info("Checking if any saved search updates are required") + for local_saved_search in local_saved_searches.saved_searches: + saved_search_name = local_saved_search.name + saved_search_resource_name = local_saved_search.resource_name + update_remote_saved_search = False + + # If the local saved search doesn't have a Google Cloud resource name, + # create a new saved search in Google SecOps + if not saved_search_resource_name: + new_saved_search = create_saved_search( + http_session=http_session, + name=local_saved_search.name, + query=local_saved_search.query, + description=local_saved_search.description, + sharing_mode=local_saved_search.sharing_mode, + placeholder_names=local_saved_search.placeholder_names, + placeholder_descriptions=local_saved_search.placeholder_descriptions, + ) + saved_search_resource_name = new_saved_search["name"] + local_saved_search.resource_name = new_saved_search["name"] + remote_saved_search = SavedSearches.parse_saved_search(new_saved_search) + LOGGER.info("Created new saved search %s", remote_saved_search.name) + update_summary["created"].append( + (remote_saved_search.name, saved_search_resource_name) + ) + + else: + # Saved search exists in Google SecOps with same Google Cloud resource + # name as local saved search. + remote_saved_search = remote_saved_searches_dict[ + saved_search_resource_name + ] + + # Check if the saved search's name should be updated + LOGGER.debug( + "Saved search %s - Comparing the name of the local and remote" + " saved search", + saved_search_name, + ) + if local_saved_search.name != remote_saved_search.name: + LOGGER.info( + "Saved search %s - Name for local and remote saved search is" + " different. Remote saved search will be updated", + saved_search_name, + ) + update_remote_saved_search = True + + # Check if the saved search's description should be updated + LOGGER.debug( + "Saved search %s - Comparing the description of the local and" + " remote saved search", + saved_search_name, + ) + if local_saved_search.description != remote_saved_search.description: + LOGGER.info( + "Saved search %s - Description for local and remote saved search" + " is different. Remote saved search will be updated", + saved_search_name, + ) + update_remote_saved_search = True + + # Check if the saved search's query should be updated + LOGGER.debug( + "Saved search %s - Comparing the query for the local and remote" + " saved search", + saved_search_name, + ) + if local_saved_search.query != remote_saved_search.query: + LOGGER.info( + "Saved search %s - Query is different in local and remote saved" + " search. Remote saved search will be updated", + saved_search_name, + ) + update_remote_saved_search = True + + if local_saved_search.sharing_mode != remote_saved_search.sharing_mode: + LOGGER.info( + "Saved search %s - Sharing mode is different in local and remote" + " saved search. Remote saved search will be updated", + saved_search_name, + ) + update_remote_saved_search = True + + if ( + local_saved_search.placeholder_names + != remote_saved_search.placeholder_names + ): + LOGGER.info( + "Saved search %s - Placeholder names are different in local and " + "remote saved search. 
Remote saved search will be updated", + saved_search_name, + ) + update_remote_saved_search = True + + if ( + local_saved_search.placeholder_descriptions + != remote_saved_search.placeholder_descriptions + ): + LOGGER.info( + "Saved search %s - Placeholder descriptions are different in" + " local and remote saved search. Remote saved search will be" + " updated", + saved_search_name, + ) + update_remote_saved_search = True + + if update_remote_saved_search: + LOGGER.info( + "Saved search %s - Updating remote saved search", + saved_search_name, + ) + # Note on November 7, 2025. There is a bug with the API method used + # to update (PATCH) saved searches. A bug has been filed for this. + # Providing an update_mask and a value for "description" (as an + # example) will update the description for the saved search, but + # delete the values from the other fields. This breaks the saved + # search in Google SecOps (e.g. the display name (Title) for the saved + # search is deleted). + update_saved_search( + http_session=http_session, + resource_name=local_saved_search.resource_name, + updates={ + "displayName": local_saved_search.name, + "description": local_saved_search.description, + "query": local_saved_search.query, + "placeholder_names": local_saved_search.placeholder_names, + "placeholder_descriptions": ( + local_saved_search.placeholder_descriptions + ), + "metadata": {"sharing_mode": local_saved_search.sharing_mode}, + }, + ) + + update_summary["updated"].append( + (saved_search_name, saved_search_resource_name) + ) + + return update_summary diff --git a/tools/content_manager/content_manager/test_data/test_saved_search_config.yaml b/tools/content_manager/content_manager/test_data/test_saved_search_config.yaml new file mode 100644 index 0000000..baca7dd --- /dev/null +++ b/tools/content_manager/content_manager/test_data/test_saved_search_config.yaml @@ -0,0 +1,43 @@ +Blocked Windows Logins by Host: + resource_name: + projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c + query_id: Ab9e5XWvQTu46prKxvQx6Q== + user_id: player1@example.com + create_time: '2025-11-07T16:17:40.197090Z' + update_time: '2025-11-07T16:17:40.197090Z' + description: Statistical Search Workshop + query: |- + metadata.vendor_name = "Microsoft" AND metadata.product_name = /Windows/ AND metadata.event_type = "USER_LOGIN" AND security_result.action = "BLOCK" AND principal.hostname != "" + $host = principal.hostname + $user = target.user.userid + match: + $host + outcome: + $user_distinct_count = count_distinct($user) + $user_count = count($user) + $users_uniq_list = array_distinct($user) + order: + $user_count desc, $user_distinct_count desc + sharing_mode: + query_type: + placeholder_names: + placeholder_descriptions: +Rule Changes: + resource_name: + projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/1ff8cafa-4a34-47eb-b26b-c71abda0ef8b + query_id: Jy9qEghrTcAAA3vBxoymPQ== + user_id: player1@example.com + create_time: '2025-04-16T16:19:56.609898Z' + update_time: '2025-11-10T17:05:55.914085Z' + description: Searches for rule changes in Google SecOps + query: | + ( + metadata.product_event_type = "google.cloud.chronicle.v1alpha.RuleService.CreateRule" OR + metadata.product_event_type = "google.cloud.chronicle.v1alpha.RuleService.UpdateRule" OR + metadata.product_event_type = "google.cloud.chronicle.v1alpha.RuleService.UpdateRuleDeployment" OR + metadata.product_event_type = 
"google.cloud.chronicle.v1alpha.RuleService.DeleteRule" + ) + sharing_mode: + query_type: + placeholder_names: + placeholder_descriptions: diff --git a/tools/content_manager/content_manager/test_data/test_saved_search_config_duplicate_keys.yaml b/tools/content_manager/content_manager/test_data/test_saved_search_config_duplicate_keys.yaml new file mode 100644 index 0000000..c37b3cf --- /dev/null +++ b/tools/content_manager/content_manager/test_data/test_saved_search_config_duplicate_keys.yaml @@ -0,0 +1,48 @@ +Blocked Windows Logins by Host: + resource_name: + projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c + query_id: Ab9e5XWvQTu46prKxvQx6Q== + user_id: player1@example.com + create_time: '2025-11-07T16:17:40.197090Z' + update_time: '2025-11-07T16:17:40.197090Z' + description: Statistical Search Workshop + query: |- + metadata.vendor_name = "Microsoft" AND metadata.product_name = /Windows/ AND metadata.event_type = "USER_LOGIN" AND security_result.action = "BLOCK" AND principal.hostname != "" + $host = principal.hostname + $user = target.user.userid + match: + $host + outcome: + $user_distinct_count = count_distinct($user) + $user_count = count($user) + $users_uniq_list = array_distinct($user) + order: + $user_count desc, $user_distinct_count desc + sharing_mode: + query_type: + placeholder_names: + placeholder_descriptions: +Blocked Windows Logins by Host: + resource_name: + projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c + query_id: Ab9e5XWvQTu46prKxvQx6Q== + user_id: player1@example.com + create_time: '2025-11-07T16:17:40.197090Z' + update_time: '2025-11-07T16:17:40.197090Z' + description: Statistical Search Workshop + query: |- + metadata.vendor_name = "Microsoft" AND metadata.product_name = /Windows/ AND metadata.event_type = "USER_LOGIN" AND security_result.action = "BLOCK" AND principal.hostname != "" + $host = principal.hostname + $user = target.user.userid + match: + $host + outcome: + $user_distinct_count = count_distinct($user) + $user_count = count($user) + $users_uniq_list = array_distinct($user) + order: + $user_count desc, $user_distinct_count desc + sharing_mode: + query_type: + placeholder_names: + placeholder_descriptions: diff --git a/tools/content_manager/content_manager/test_data/test_saved_searches.json b/tools/content_manager/content_manager/test_data/test_saved_searches.json new file mode 100644 index 0000000..2a6960f --- /dev/null +++ b/tools/content_manager/content_manager/test_data/test_saved_searches.json @@ -0,0 +1,26 @@ +[ + { + "name": "projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c", + "metadata": { + "createTime": "2025-11-07T16:17:40.197090Z", + "updateTime": "2025-11-07T16:17:40.197090Z" + }, + "displayName": "Blocked Windows Logins by Host", + "query": "metadata.vendor_name = \"Microsoft\" AND metadata.product_name = /Windows/ AND metadata.event_type = \"USER_LOGIN\" AND security_result.action = \"BLOCK\" AND principal.hostname != \"\"\n$host = principal.hostname\n$user = target.user.userid\nmatch:\n $host\noutcome:\n $user_distinct_count = count_distinct($user)\n $user_count = count($user)\n $users_uniq_list = array_distinct($user)\norder:\n $user_count desc, $user_distinct_count desc", + "queryId": "Ab9e5XWvQTu46prKxvQx6Q==", + "userId": "player1@example.com", + 
"description": "Statistical Search Workshop" + }, + { + "name": "projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/1ff8cafa-4a34-47eb-b26b-c71abda0ef8b", + "metadata": { + "createTime": "2025-04-16T16:19:56.609898Z", + "updateTime": "2025-11-10T17:05:55.914085Z" + }, + "displayName": "Rule Changes", + "query": "(\nmetadata.product_event_type = \"google.cloud.chronicle.v1alpha.RuleService.CreateRule\" OR\nmetadata.product_event_type = \"google.cloud.chronicle.v1alpha.RuleService.UpdateRule\" OR\nmetadata.product_event_type = \"google.cloud.chronicle.v1alpha.RuleService.UpdateRuleDeployment\" OR\nmetadata.product_event_type = \"google.cloud.chronicle.v1alpha.RuleService.DeleteRule\"\n)\n", + "queryId": "Jy9qEghrTcAAA3vBxoymPQ==", + "userId": "player1@example.com", + "description": "Searches for rule changes in Google SecOps" + } +] diff --git a/tools/content_manager/content_manager/test_saved_searches.py b/tools/content_manager/content_manager/test_saved_searches.py new file mode 100644 index 0000000..c047d8f --- /dev/null +++ b/tools/content_manager/content_manager/test_saved_searches.py @@ -0,0 +1,222 @@ +# Copyright 2025 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +"""Tests for content_manager.saved_searches.""" + +import copy +import json +import pathlib +from typing import Any + +from content_manager.common.custom_exceptions import SavedSearchConfigError +from content_manager.saved_searches import SavedSearch +from content_manager.saved_searches import SavedSearchConfigEntry +from content_manager.saved_searches import SavedSearches +import pydantic +import pytest +import ruamel.yaml.constructor + + +ROOT_DIR = pathlib.Path(__file__).parent.parent +SAVED_SEARCHES_DIR = ROOT_DIR / "saved_searches" +SAVED_SEARCH_CONFIG_FILE = ROOT_DIR / "saved_search_config.yaml" +TEST_DATA_DIR = pathlib.Path(__file__).parent / "test_data" +TEST_SAVED_SEARCH_CONFIG_FILE = TEST_DATA_DIR / "test_saved_search_config.yaml" + +# Use ruamel.yaml to raise an exception if a YAML file contains duplicate keys +ruamel_yaml = ruamel.yaml.YAML(typ="safe") + + +@pytest.fixture(name="parsed_test_saved_searches") +def parsed_test_saved_searches_fixture() -> SavedSearches: + """Load and parse test saved searches.""" + return SavedSearches.load_saved_search_config( + saved_search_config_file=TEST_SAVED_SEARCH_CONFIG_FILE, + ) + + +@pytest.fixture(name="raw_test_saved_searches") +def raw_test_saved_searches_fixture() -> list[dict[str, Any]]: + """Return a list of raw (unparsed) saved searches.""" + test_saved_searches_file = TEST_DATA_DIR / "test_saved_searches.json" + with open(test_saved_searches_file, "r", encoding="utf-8") as f: + return json.load(f) + + +def test_load_saved_searches_config(): + """Tests for saved_searches.SavedSearches.load_saved_search_config.""" + SAVED_SEARCH_CONFIG_FILE.touch(exist_ok=True) + + +def test_parse_saved_searches( + raw_test_saved_searches: list[dict[str, Any]], +): + """Tests for 
saved_searches.SavedSearches.parse_saved_searches.""" + raw_saved_searches = copy.deepcopy(raw_test_saved_searches) + + # Ensure an exception occurs when attempting to parse a saved search that's + # missing a required value + del raw_saved_searches[0]["query"] + + with pytest.raises(expected_exception=KeyError, match=r"query"): + SavedSearches.parse_saved_searches(raw_saved_searches) + + +def test_saved_search(): + """Tests for saved_searches.SavedSearch.""" + # Ensure an exception occurs when attempting to create a SavedSearch object + # that's missing a required value + with pytest.raises( + expected_exception=pydantic.ValidationError, + match=r"Field required \[type=missing", + ): + SavedSearch( + name="Blocked Windows Logins by Host", + resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c", + query_id="Ab9e5XWvQTu46prKxvQx6Q==", + user_id="player1@example.com", + create_time="2025-11-07T16:17:40.197090Z", + update_time="2025-11-07T16:17:40.197090Z", + description="Statistical Search Workshop", + sharing_mode=None, + query_type=None, + placeholder_names=None, + placeholder_descriptions=None, + ) + + # Ensure an exception occurs when attempting to create a SavedSearch object + # with an invalid value + with pytest.raises( + expected_exception=pydantic.ValidationError, + match=( + r"validation error for SavedSearch\nname\n Input should be a valid" + r" string" + ), + ): + SavedSearch( + name=4, + resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c", + query_id="Ab9e5XWvQTu46prKxvQx6Q==", + user_id="player1@example.com", + create_time="2025-11-07T16:17:40.197090Z", + update_time="2025-11-07T16:17:40.197090Z", + description="Statistical Search Workshop", + query=( + 'metadata.vendor_name = "Microsoft" AND metadata.product_name =' + ' /Windows/ AND metadata.event_type = "USER_LOGIN" AND' + ' security_result.action = "BLOCK" AND principal.hostname !=' + ' ""\n$host = principal.hostname\n$user =' + " target.user.userid\nmatch:\n $host\noutcome:\n " + " $user_distinct_count = count_distinct($user)\n $user_count =" + " count($user)\n $users_uniq_list =" + " array_distinct($user)\norder:\n $user_count desc," + " $user_distinct_count desc" + ), + sharing_mode=None, + query_type=None, + placeholder_names=None, + placeholder_descriptions=None, + ) + + +def test_check_saved_search_config(): + """Tests for saved_searches.SavedSearches.check_saved_search_config.""" + # Ensure an exception occurs when a saved search config file contains + # duplicate keys (saved search names). 
+ with pytest.raises(ruamel.yaml.constructor.DuplicateKeyError): + SavedSearches.load_saved_search_config( + saved_search_config_file=TEST_DATA_DIR + / "test_saved_search_config_duplicate_keys.yaml" + ) + + with open(TEST_SAVED_SEARCH_CONFIG_FILE, "r", encoding="utf-8") as f: + saved_search_config = ruamel_yaml.load(f) + + # Ensure an exception occurs when a saved search config file contains an + # invalid key + saved_search_config["Blocked Windows Logins by Host"][ + "invalid_key" + ] = "invalid" + with pytest.raises( + SavedSearchConfigError, + match=r"Invalid keys .* found for saved search - ", + ): + SavedSearches.check_saved_search_config(config=saved_search_config) + + # Ensure an exception occurs when a saved search config file is missing a + # required key + del saved_search_config["Blocked Windows Logins by Host"]["invalid_key"] + del saved_search_config["Blocked Windows Logins by Host"]["query"] + with pytest.raises( + SavedSearchConfigError, + match=r"Required key \(query\) not found for saved search - ", + ): + SavedSearches.check_saved_search_config(config=saved_search_config) + + +def test_saved_search_config_config_entry(): + """Tests for saved_searches.SavedSearchConfigEntry.""" + # Ensure an exception occurs when attempting to create a + # SavedSearchConfigEntry object that's missing a required value + with pytest.raises( + expected_exception=pydantic.ValidationError, + match=r"Field required \[type=missing", + ): + SavedSearchConfigEntry( + name="Blocked Windows Logins by Host", + resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c", + query_id="Ab9e5XWvQTu46prKxvQx6Q==", + user_id="player1@example.com", + create_time="2025-11-07T16:17:40.197090Z", + update_time="2025-11-07T16:17:40.197090Z", + description="Statistical Search Workshop", + sharing_mode=None, + query_type=None, + placeholder_names=None, + placeholder_descriptions=None, + ) + + # Ensure an exception occurs when attempting to create a + # SavedSearchConfigEntry object with an invalid value + with pytest.raises( + expected_exception=pydantic.ValidationError, + match=( + r"validation error for SavedSearchConfigEntry\nsharing_mode\n " + r" Input should be " + ), + ): + SavedSearchConfigEntry( + name="Blocked Windows Logins by Host", + resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/74257fc4-cb92-497d-84be-c9b5bfcd287c", + query_id="Ab9e5XWvQTu46prKxvQx6Q==", + user_id="player1@example.com", + create_time="2025-11-07T16:17:40.197090Z", + update_time="2025-11-07T16:17:40.197090Z", + description="Statistical Search Workshop", + query=( + 'metadata.vendor_name = "Microsoft" AND metadata.product_name =' + ' /Windows/ AND metadata.event_type = "USER_LOGIN" AND' + ' security_result.action = "BLOCK" AND principal.hostname !=' + ' ""\n$host = principal.hostname\n$user =' + " target.user.userid\nmatch:\n $host\noutcome:\n " + " $user_distinct_count = count_distinct($user)\n $user_count =" + " count($user)\n $users_uniq_list =" + " array_distinct($user)\norder:\n $user_count desc," + " $user_distinct_count desc" + ), + sharing_mode="INVALID", + query_type=None, + placeholder_names=None, + placeholder_descriptions=None, + ) diff --git a/tools/content_manager/google_secops_api/saved_searches/__init__.py b/tools/content_manager/google_secops_api/saved_searches/__init__.py new file mode 100644 index 0000000..3d87ee0 --- /dev/null +++ 
b/tools/content_manager/google_secops_api/saved_searches/__init__.py @@ -0,0 +1,14 @@ +# Copyright 2025 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# diff --git a/tools/content_manager/google_secops_api/saved_searches/create_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/create_saved_search.py new file mode 100644 index 0000000..1345496 --- /dev/null +++ b/tools/content_manager/google_secops_api/saved_searches/create_saved_search.py @@ -0,0 +1,97 @@ +# Copyright 2025 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +"""Create a new saved search. + +API reference: +https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries/create +""" + +import logging +import os +import time +from typing import Any + +from google.auth.transport import requests + +LOGGER = logging.getLogger() + + +def create_saved_search( + http_session: requests.AuthorizedSession, + name: str, + query: str, + description: str | None = None, + sharing_mode: str | None = None, + placeholder_names: list[str] | None = None, + placeholder_descriptions: list[str] | None = None, + max_retries: int = 3, +) -> dict[str, Any]: + """Creates a new saved search. + + Args: + http_session: Authorized session for HTTP requests. + name: The unique display name for the new saved search. + query: The query for the saved search. Reference - + https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries#SearchQuery + description (optional): A user-provided description of the saved search. + sharing_mode (optional): The sharing mode for the saved search. Reference - + https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries#SharingMode + placeholder_names (optional): A list of names for the query placeholders to + be shown in the UI. Each element's position corresponds to the + description in the placeholder_descriptions field. + placeholder_descriptions (optional): A list of descriptions for the query + placeholders to be shown in the UI. Each element's position corresponds + to the name in the placeholder_names field. + max_retries (optional): Maximum number of times to retry HTTP request if + certain response codes are returned. For example: HTTP response status + code 429 (Too Many Requests) + + Returns: + New saved search. + + Raises: + requests.exceptions.HTTPError: HTTP request resulted in an error + (response.status_code >= 400). 
+ requests.exceptions.JSONDecodeError: If the server response is not valid + JSON. + """ + url = f"{os.environ['GOOGLE_SECOPS_API_BASE_URL']}/{os.environ['GOOGLE_SECOPS_INSTANCE']}/users/me/searchQueries" + body = { + "display_name": name, + "description": description, + "query": query, + "placeholder_names": placeholder_names, + "placeholder_descriptions": placeholder_descriptions, + "metadata": {"sharing_mode": sharing_mode}, + } + response = None + + for _ in range(max(max_retries, 0) + 1): + response = http_session.request(method="POST", url=url, json=body) + + if response.status_code >= 400: + LOGGER.warning(response.text) + + if response.status_code == 429: + LOGGER.warning( + "API rate limit exceeded. Sleeping for 60s before retrying" + ) + time.sleep(60) + else: + break + + response.raise_for_status() + + return response.json() diff --git a/tools/content_manager/google_secops_api/saved_searches/delete_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/delete_saved_search.py new file mode 100644 index 0000000..4f6a207 --- /dev/null +++ b/tools/content_manager/google_secops_api/saved_searches/delete_saved_search.py @@ -0,0 +1,67 @@ +# Copyright 2025 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# +"""Delete a saved search. + +API reference: +https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries/delete +""" + +import os +import time +from typing import Any + +from google.auth.transport import requests + + +def delete_saved_search( + http_session: requests.AuthorizedSession, + resource_name: str, + max_retries: int = 3, +) -> dict[str, Any]: + """Deletes a saved search. + + Args: + http_session: Authorized session for HTTP requests. + resource_name: The resource name of the saved search to delete. Format: + projects/{project}/locations/{location}/instances/{instance}/users/me/searchQueries/{saved_search_id} + max_retries (optional): Maximum number of times to retry HTTP request if + certain response codes are returned. For example: HTTP response status + code 429 (Too Many Requests) + + Returns: + An empty JSON object. + + Raises: + requests.exceptions.HTTPError: HTTP request resulted in an error + (response.status_code >= 400). + """ + url = f"{os.environ['GOOGLE_SECOPS_API_BASE_URL']}/{resource_name}" + response = None + + for _ in range(max(max_retries, 0) + 1): + response = http_session.request(method="DELETE", url=url) + + if response.status_code >= 400: + print(response.text) + + if response.status_code == 429: + print("API rate limit exceeded. 
Sleeping for 60s before retrying")
+      time.sleep(60)
+    else:
+      break
+
+  response.raise_for_status()
+
+  return response.json()
diff --git a/tools/content_manager/google_secops_api/saved_searches/get_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/get_saved_search.py
new file mode 100644
index 0000000..ff1da5b
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/get_saved_search.py
@@ -0,0 +1,74 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Retrieve a saved search.
+
+API reference:
+https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries/get
+"""
+
+import logging
+import os
+import time
+from typing import Any
+
+from google.auth.transport import requests
+
+LOGGER = logging.getLogger()
+
+
+def get_saved_search(
+    http_session: requests.AuthorizedSession,
+    resource_name: str,
+    max_retries: int = 3,
+) -> dict[str, Any]:
+  """Retrieves a saved search.
+
+  Args:
+    http_session: Authorized session for HTTP requests.
+    resource_name: The resource name of the saved search to retrieve. Format:
+      projects/{project}/locations/{location}/instances/{instance}/users/{user}/searchQueries/{saved_search_id}
+    max_retries (optional): Maximum number of times to retry HTTP request if
+      certain response codes are returned. For example: HTTP response status
+      code 429 (Too Many Requests)
+
+  Returns:
+    Content and metadata about the requested saved search.
+
+  Raises:
+    requests.exceptions.HTTPError: HTTP request resulted in an error
+      (response.status_code >= 400).
+    requests.exceptions.JSONDecodeError: If the server response is not valid
+      JSON.
+  """
+  url = f"{os.environ['GOOGLE_SECOPS_API_BASE_URL']}/{resource_name}"
+  response = None
+
+  for _ in range(max(max_retries, 0) + 1):
+    response = http_session.request(method="GET", url=url)
+
+    if response.status_code >= 400:
+      LOGGER.warning(response.text)
+
+    if response.status_code == 429:
+      LOGGER.warning(
+          "API rate limit exceeded. Sleeping for 60s before retrying"
+      )
+      time.sleep(60)
+    else:
+      break
+
+  response.raise_for_status()
+
+  return response.json()
diff --git a/tools/content_manager/google_secops_api/saved_searches/list_saved_searches.py b/tools/content_manager/google_secops_api/saved_searches/list_saved_searches.py
new file mode 100644
index 0000000..4d5aa7e
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/list_saved_searches.py
@@ -0,0 +1,83 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Retrieve a list of saved searches for the current user.
+
+API reference:
+https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries/list
+"""
+
+import logging
+import os
+import time
+from typing import Any
+
+from google.auth.transport import requests
+
+LOGGER = logging.getLogger()
+
+
+def list_saved_searches(
+    http_session: requests.AuthorizedSession,
+    page_size: int | None = None,
+    page_token: str | None = None,
+    max_retries: int = 3,
+) -> tuple[list[dict[str, Any]], str]:
+  """Retrieves a list of saved searches for the current user.
+
+  Args:
+    http_session: Authorized session for HTTP requests.
+    page_size (optional): Maximum number of saved searches to return. Must be
+      non-negative, and is capped at a server-side limit of 1000. A
+      server-side default of 100 is used if the size is 0 or a None value.
+    page_token (optional): Page token from a previous call used for pagination.
+      The first page is retrieved if the token is the empty string
+      or a None value.
+    max_retries (optional): Maximum number of times to retry HTTP request if
+      certain response codes are returned. For example: HTTP response status
+      code 429 (Too Many Requests)
+
+  Returns:
+    List of saved searches and a page token for the next page of saved searches,
+    if there are any.
+
+  Raises:
+    requests.exceptions.HTTPError: HTTP request resulted in an error
+      (response.status_code >= 400).
+    requests.exceptions.JSONDecodeError: If the server response is not valid
+      JSON.
+  """
+  url = f"{os.environ['GOOGLE_SECOPS_API_BASE_URL']}/{os.environ['GOOGLE_SECOPS_INSTANCE']}/users/me/searchQueries"
+  params = {"page_size": page_size, "page_token": page_token}
+  response = None
+
+  for _ in range(max(max_retries, 0) + 1):
+    response = http_session.request(method="GET", url=url, params=params)
+
+    if response.status_code >= 400:
+      LOGGER.warning(response.text)
+
+    if response.status_code == 429:
+      LOGGER.warning(
+          "API rate limit exceeded. Sleeping for 60s before retrying"
+      )
+      time.sleep(60)
+    else:
+      break
+
+  response.raise_for_status()
+
+  response_json = response.json()
+
+  return response_json.get("searchQueries"), response_json.get("nextPageToken")
diff --git a/tools/content_manager/google_secops_api/saved_searches/test_create_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/test_create_saved_search.py
new file mode 100644
index 0000000..20035f3
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/test_create_saved_search.py
@@ -0,0 +1,98 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Unit tests for the "create_saved_search" module."""
+
+import unittest
+from unittest import mock
+
+from google.auth.transport import requests
+from google_secops_api.saved_searches.create_saved_search import create_saved_search
+
+
+class CreateSavedSearchTest(unittest.TestCase):
+  """Unit tests for the "create_saved_search" module."""
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_error(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that an HTTP error occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=400)
+    mock_response.raise_for_status.side_effect = (
+        requests.requests.exceptions.HTTPError()
+    )
+
+    with self.assertRaises(requests.requests.exceptions.HTTPError):
+      create_saved_search(
+          http_session=mock_session,
+          name="Windows Login Events",
+          description="Windows login events",
+          query=(
+              'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+              ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+          ),
+      )
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_ok(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that HTTP response 200 (OK) occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=200)
+    expected_saved_search = {
+        "name": (
+            "projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c"
+        ),
+        "metadata": {
+            "createTime": "2025-11-10T16:27:10.139428Z",
+            "updateTime": "2025-11-10T16:27:10.139428Z",
+        },
+        "displayName": "Windows Login Events",
+        "query": (
+            'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+            ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+        ),
+        "queryId": "Ab9e5XWvQTu46prKxvQx6Q==",
+        "userId": "player1@example.com",
+        "description": "Windows user login events",
+    }
+
+    mock_response.json.return_value = expected_saved_search
+    actual_saved_search = create_saved_search(
+        http_session=mock_session,
+        name="Windows Login Events",
+        description="Windows login events",
+        query=(
+            'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+            ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+        ),
+    )
+    self.assertEqual(actual_saved_search, expected_saved_search)
diff --git a/tools/content_manager/google_secops_api/saved_searches/test_delete_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/test_delete_saved_search.py
new file mode 100644
index 0000000..7438465
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/test_delete_saved_search.py
@@ -0,0 +1,68 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Unit tests for the "delete_saved_search" module."""
+
+import unittest
+from unittest import mock
+
+from google.auth.transport import requests
+from google_secops_api.saved_searches.delete_saved_search import delete_saved_search
+
+
+class DeleteSavedSearchTest(unittest.TestCase):
+  """Unit tests for the "delete_saved_search" module."""
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_error(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that an HTTP error occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=400)
+    mock_response.raise_for_status.side_effect = (
+        requests.requests.exceptions.HTTPError()
+    )
+
+    with self.assertRaises(requests.requests.exceptions.HTTPError):
+      delete_saved_search(
+          http_session=mock_session,
+          resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c",
+      )
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_ok(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that HTTP response 200 (OK) occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=200)
+    delete_saved_search(
+        http_session=mock_session,
+        resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c",
+    )
diff --git a/tools/content_manager/google_secops_api/saved_searches/test_get_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/test_get_saved_search.py
new file mode 100644
index 0000000..8318802
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/test_get_saved_search.py
@@ -0,0 +1,88 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Unit tests for the "get_saved_search" module."""
+
+import unittest
+from unittest import mock
+
+from google.auth.transport import requests
+from google_secops_api.saved_searches.get_saved_search import get_saved_search
+
+
+class GetSavedSearchTest(unittest.TestCase):
+  """Unit tests for the "get_saved_search" module."""
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_error(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that an HTTP error occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=400)
+    mock_response.raise_for_status.side_effect = (
+        requests.requests.exceptions.HTTPError()
+    )
+
+    with self.assertRaises(requests.requests.exceptions.HTTPError):
+      get_saved_search(
+          http_session=mock_session,
+          resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c",
+      )
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_ok(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that HTTP response 200 (OK) occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=200)
+    expected_saved_search = {
+        "name": (
+            "projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c"
+        ),
+        "metadata": {
+            "createTime": "2025-11-10T16:27:10.139428Z",
+            "updateTime": "2025-11-10T16:27:10.139428Z",
+        },
+        "displayName": "Windows Login Events",
+        "query": (
+            'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+            ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+        ),
+        "queryId": "Ab9e5XWvQTu46prKxvQx6Q==",
+        "userId": "player1@example.com",
+        "description": "Windows user login events",
+    }
+    mock_response.json.return_value = expected_saved_search
+
+    response = get_saved_search(
+        http_session=mock_session,
+        resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c",
+    )
+    self.assertEqual(response, expected_saved_search)
diff --git a/tools/content_manager/google_secops_api/saved_searches/test_list_saved_searches.py b/tools/content_manager/google_secops_api/saved_searches/test_list_saved_searches.py
new file mode 100644
index 0000000..74676c4
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/test_list_saved_searches.py
@@ -0,0 +1,90 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Unit tests for the "list_saved_searches" module."""
+
+import unittest
+from unittest import mock
+
+from google.auth.transport import requests
+from google_secops_api.saved_searches.list_saved_searches import list_saved_searches
+
+
+class ListSavedSearchesTest(unittest.TestCase):
+  """Unit tests for the "list_saved_searches" module."""
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_error(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that an HTTP error occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=400)
+    mock_response.raise_for_status.side_effect = (
+        requests.requests.exceptions.HTTPError()
+    )
+
+    with self.assertRaises(requests.requests.exceptions.HTTPError):
+      list_saved_searches(http_session=mock_session)
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_ok(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that HTTP response 200 (OK) occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=200)
+    expected_saved_search = {
+        "name": (
+            "projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c"
+        ),
+        "metadata": {
+            "createTime": "2025-11-10T16:27:10.139428Z",
+            "updateTime": "2025-11-10T16:27:10.139428Z",
+        },
+        "displayName": "Windows Login Events",
+        "query": (
+            'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+            ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+        ),
+        "queryId": "Ab9e5XWvQTu46prKxvQx6Q==",
+        "userId": "player1@example.com",
+        "description": "Windows user login events",
+    }
+    expected_page_token = "page token here"
+    mock_response.json.return_value = {
+        "searchQueries": [expected_saved_search],
+        "nextPageToken": expected_page_token,
+    }
+
+    saved_searches, next_page_token = list_saved_searches(
+        http_session=mock_session
+    )
+    self.assertEqual(len(saved_searches), 1)
+    self.assertEqual(saved_searches[0], expected_saved_search)
+    self.assertEqual(next_page_token, expected_page_token)
diff --git a/tools/content_manager/google_secops_api/saved_searches/test_update_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/test_update_saved_search.py
new file mode 100644
index 0000000..ecf147a
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/test_update_saved_search.py
@@ -0,0 +1,104 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Unit tests for the "update_saved_search" module."""
+
+import unittest
+from unittest import mock
+
+from google.auth.transport import requests
+from google_secops_api.saved_searches.update_saved_search import update_saved_search
+
+
+class UpdateSavedSearchTest(unittest.TestCase):
+  """Unit tests for the "update_saved_search" module."""
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_error(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that an HTTP error occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=400)
+    mock_response.raise_for_status.side_effect = (
+        requests.requests.exceptions.HTTPError()
+    )
+
+    with self.assertRaises(requests.requests.exceptions.HTTPError):
+      update_saved_search(
+          http_session=mock_session,
+          resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c",
+          updates={
+              "displayName": "Windows Login Events",
+              "description": "Windows login events",
+              "query": (
+                  'metadata.vendor_name = "Microsoft" AND metadata.product_name'
+                  ' = /Windows/ AND metadata.event_type = "USER_LOGIN"'
+              ),
+          },
+      )
+
+  @mock.patch.object(
+      target=requests, attribute="AuthorizedSession", autospec=True
+  )
+  @mock.patch.object(
+      target=requests.requests, attribute="Response", autospec=True
+  )
+  def test_http_ok(
+      self,
+      mock_response: unittest.mock.MagicMock,
+      mock_session: unittest.mock.MagicMock,
+  ):
+    """Test that HTTP response 200 (OK) occurs."""
+    mock_session.request.return_value = mock_response
+    type(mock_response).status_code = mock.PropertyMock(return_value=200)
+    expected_saved_search = {
+        "name": (
+            "projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c"
+        ),
+        "metadata": {
+            "createTime": "2025-11-10T16:27:10.139428Z",
+            "updateTime": "2025-11-10T16:27:10.139428Z",
+        },
+        "displayName": "Windows Login Events",
+        "query": (
+            'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+            ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+        ),
+        "queryId": "Ab9e5XWvQTu46prKxvQx6Q==",
+        "userId": "player1@example.com",
+        "description": "Windows user login events",
+    }
+    mock_response.json.return_value = expected_saved_search
+
+    new_saved_search_version = update_saved_search(
+        http_session=mock_session,
+        resource_name="projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c",
+        updates={
+            "displayName": "Windows Login Events",
+            "description": "Windows login events",
+            "query": (
+                'metadata.vendor_name = "Microsoft" AND metadata.product_name ='
+                ' /Windows/ AND metadata.event_type = "USER_LOGIN"'
+            ),
+        },
+    )
+    self.assertEqual(new_saved_search_version, expected_saved_search)
diff --git a/tools/content_manager/google_secops_api/saved_searches/update_saved_search.py b/tools/content_manager/google_secops_api/saved_searches/update_saved_search.py
new file mode 100644
index 0000000..c5a04d7
--- /dev/null
+++ b/tools/content_manager/google_secops_api/saved_searches/update_saved_search.py
@@ -0,0 +1,92 @@
+# Copyright 2025 Google LLC
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+"""Update an existing saved search.
+
+API reference:
+https://cloud.google.com/chronicle/docs/reference/rest/v1alpha/projects.locations.instances.users.searchQueries/patch
+"""
+
+import logging
+import os
+import time
+from typing import Any
+
+from google.auth.transport import requests
+
+
+LOGGER = logging.getLogger()
+
+
+def update_saved_search(
+    http_session: requests.AuthorizedSession,
+    resource_name: str,
+    updates: dict[str, Any],
+    update_mask: list[str] | None = None,
+    max_retries: int = 3,
+) -> dict[str, Any]:
+  """Updates an existing saved search.
+
+  Args:
+    http_session: Authorized session for HTTP requests.
+    resource_name: The resource name of the saved search to update. Format:
+      projects/{project}/locations/{location}/instances/{instance}/users/me/searchQueries/{saved_search_uuid}
+    updates: A dictionary containing the updates to make to the saved search.
+      Example - A value of {"description": "My new saved search description"}
+      will update the description for the saved search accordingly.
+    update_mask (optional): The list of fields to update for the saved search.
+      If no update_mask is provided, all non-empty fields will be updated.
+      Example - An update_mask of ["description"] will update the description
+      for the saved search.
+    max_retries (optional): Maximum number of times to retry HTTP request if
+      certain response codes are returned. For example: HTTP response status
+      code 429 (Too Many Requests)
+
+  Returns:
+    New version of the saved search.
+
+  Raises:
+    requests.exceptions.HTTPError: HTTP request resulted in an error
+      (response.status_code >= 400).
+    requests.exceptions.JSONDecodeError: If the server response is not valid
+      JSON.
+  """
+  url = f"{os.environ['GOOGLE_SECOPS_API_BASE_URL']}/{resource_name}"
+  response = None
+
+  # If no update_mask is provided, all non-empty fields will be updated
+  if update_mask is None:
+    params = {}
+  else:
+    params = {"updateMask": update_mask}
+
+  for _ in range(max(max_retries, 0) + 1):
+    response = http_session.request(
+        method="PATCH", url=url, params=params, json=updates
+    )
+
+    if response.status_code >= 400:
+      LOGGER.warning(response.text)
+
+    if response.status_code == 429:
+      LOGGER.warning(
+          "API rate limit exceeded. Sleeping for 60s before retrying"
+      )
+      time.sleep(60)
+    else:
+      break
+
+  response.raise_for_status()
+
+  return response.json()
diff --git a/tools/content_manager/saved_search_config.yaml b/tools/content_manager/saved_search_config.yaml
new file mode 100644
index 0000000..3ae5ca9
--- /dev/null
+++ b/tools/content_manager/saved_search_config.yaml
@@ -0,0 +1,42 @@
+Blocked Windows Logins by Host:
+  resource_name:
+    projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c29c
+  query_id: tj120P6AQT6HjH4NABE123==
+  user_id: seceng1@example.com
+  create_time: '2025-11-07T16:17:40.197090Z'
+  update_time: '2025-11-10T21:16:39.194891Z'
+  description: Statistical Search Workshop
+  query: |-
+    metadata.vendor_name = "Microsoft" AND metadata.product_name = /Windows/ AND metadata.event_type = "USER_LOGIN" AND security_result.action = "BLOCK" AND principal.hostname != ""
+    $host = principal.hostname
+    $user = target.user.userid
+    match:
+      $host
+    outcome:
+      $user_distinct_count = count_distinct($user)
+      $user_count = count($user)
+      $users_uniq_list = array_distinct($user)
+    order:
+      $user_count desc, $user_distinct_count desc
+  sharing_mode: MODE_SHARED_WITH_CUSTOMER
+  query_type: QUERY_TYPE_STATS_QUERY
+  placeholder_names:
+  placeholder_descriptions:
+Zeek - Investigative Search:
+  resource_name:
+    projects/1234567891234/locations/us/instances/3f0ac524-5ae1-4bfd-b86d-53afc953e7e6/users/me/searchQueries/baf471b7-067f-4a73-91c4-8c3ff0c0c30a
+  query_id: 1ZVJ/TSsSW2PdzNF+Z4c1h==
+  user_id: seceng1@example.com
+  create_time: '2025-11-07T15:38:05.403802Z'
+  update_time: '2025-11-10T21:17:40.759289Z'
+  description: Advanced Search Workshop - Search Templates Exercise
+  query: metadata.vendor_name = "Zeek" AND network.application_protocol =
+    ${protocol} and principal.ip = ${originating_ip}
+  sharing_mode: MODE_SHARED_WITH_CUSTOMER
+  query_type: QUERY_TYPE_UDM_QUERY
+  placeholder_names:
+  - ${protocol}
+  - ${originating_ip}
+  placeholder_descriptions:
+  - Enter the network traffic protocol
+  - Enter the IP Address of a system where traffic originated from