chore(aci milestone 3): open period -> incident serializer #103131
base: master
Conversation
```python
alert_rule_detectors = AlertRuleDetector.objects.filter(
    detector__in=list(open_periods_to_detectors.values())
).values_list("alert_rule_id", "detector_id")
```
High severity vulnerability may affect your project; review required:
Line 73 lists a dependency (django) with a known High severity vulnerability.
ℹ️ Why this matters
Affected versions of Django are vulnerable to Improper Neutralization of Special Elements used in an SQL Command ('SQL Injection'). SQL injection in Django's ORM column aliases: when using QuerySet.annotate(), QuerySet.alias(), QuerySet.aggregate(), or QuerySet.extra() with dictionary expansion (**kwargs), the dictionary keys are used unescaped as SQL column aliases. On MySQL and MariaDB backends, an attacker who can influence those keys (for example, by passing a crafted dict of annotations) can inject arbitrary SQL into the generated query.
To resolve this comment:
Check if you are using Django with MySQL or MariaDB.
- If you're affected, upgrade this dependency to at least version 5.2.7 in uv.lock.
- If you're not affected, comment `/fp we don't use this [condition]`.
💬 Ignore this finding
To ignore this, reply with:
- `/fp <comment>` for false positive
- `/ar <comment>` for acceptable risk
- `/other <comment>` for all other reasons
You can view more details on this finding in the Semgrep AppSec Platform here.
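To illustrate the class of issue the finding describes, here is a minimal, framework-free sketch of a defensive check on annotation keys. It does not use Django itself; `safe_annotations` and its identifier regex are assumptions for illustration, standing in for wherever caller-influenced dict keys would be expanded into `QuerySet.annotate(**kwargs)` and become raw SQL column aliases.

```python
import re

# Hypothetical guard: dict keys passed via **kwargs to annotate()/alias()/
# aggregate()/extra() end up as SQL column aliases, so reject any key that
# is not a plain identifier before expansion.
VALID_ALIAS = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")


def safe_annotations(annotations: dict) -> dict:
    """Return the annotations unchanged if every key is a safe identifier."""
    for key in annotations:
        if not VALID_ALIAS.match(key):
            raise ValueError(f"unsafe column alias: {key!r}")
    return annotations


# A crafted key like this would otherwise be injected verbatim as an alias.
try:
    safe_annotations({"total": "Count('id')", "x`; DROP TABLE users; --": "1"})
except ValueError as e:
    print(e)  # the injected alias is rejected
```

Upgrading Django as the bot suggests is the real fix; a validation layer like this only limits exposure where keys are built dynamically.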
Codecov Report ❌ Patch coverage is
Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master   #103131       +/-   ##
============================================
+ Coverage   78.93%    80.67%     +1.74%
============================================
  Files        9137      9145         +8
  Lines      392766    393299       +533
  Branches    24966     24966
============================================
+ Hits       310027    317313      +7286
+ Misses      82338     75585      -6753
  Partials      401       401
```
```python
    ),
    "type": IncidentType.ALERT_TRIGGERED.value,
    "title": obj.group.title,
    "dateStarted": obj.date_started,
```
Technically, this should be obj.date_started - one snuba time window. I felt that introducing the snuba stuff to a serializer that doesn't use it elsewhere was unnecessarily complicated. Same for dateClosed.
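For concreteness, the adjustment the comment describes is just a subtraction of the window from the open period's start. This is a sketch with stand-in values; `snuba_window_seconds` is a hypothetical name, not an identifier from the codebase.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical illustration: the "true" incident start precedes the open
# period start by one snuba time window.
snuba_window_seconds = 60
date_started = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)

adjusted_start = date_started - timedelta(seconds=snuba_window_seconds)
print(adjusted_start.isoformat())  # 2024-01-01T11:59:00+00:00
```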
```python
open_periods_to_detectors = {}
for group in group_to_open_periods:
    for op in group_to_open_periods[group]:
        open_periods_to_detectors[op] = groups_to_detectors[group]
```
Bug: Missing Data Breaks Group Processing
The get_open_periods_to_detectors method raises a KeyError when a group lacks a DetectorGroup record. Line 129 accesses groups_to_detectors[group] without checking whether the key exists, but that dict only contains groups with existing DetectorGroup relationships from the database query. As a result, the serializer fails for groups without detectors instead of gracefully handling the missing relationship.
Fair, I suppose we can check for the existence first before adding to the dict
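The guarded version being discussed might look like the following. Variable names mirror the quoted snippet, but the data here is stand-in; groups with no detector mapping are skipped instead of raising KeyError.

```python
# Stand-in data: group_b has open periods but no DetectorGroup relationship.
group_to_open_periods = {"group_a": ["op1", "op2"], "group_b": ["op3"]}
groups_to_detectors = {"group_a": "detector_1"}

open_periods_to_detectors = {}
for group, open_periods in group_to_open_periods.items():
    detector = groups_to_detectors.get(group)
    if detector is None:
        continue  # gracefully handle the missing relationship
    for op in open_periods:
        open_periods_to_detectors[op] = detector

print(open_periods_to_detectors)  # {'op1': 'detector_1', 'op2': 'detector_1'}
```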
@mifu67 did you wanna address this?
ceorourke
left a comment
Overall looks good. Agree w/ one of the bot comments + a little test nit
tests/sentry/incidents/serializers/test_workflow_engine_incident.py (outdated; resolved)
Bug: Data discovery crashes with no available sources.
The _build_discover_query method raises an unhandled IndexError when data_source_detector.detector.data_sources.all() returns an empty queryset. The code accesses [0] without checking that the queryset has elements, and the QuerySubscription.DoesNotExist exception handler doesn't catch IndexError, so the serializer crashes instead of returning an empty string.
src/sentry/incidents/endpoints/serializers/workflow_engine_incident.py, lines 182 to 189 in c9f7f8b:
```python
# The query we should use to get accurate results in Discover.
return DetailedIncidentSerializerResponse(
    **base_context, discoverQuery=self._build_discover_query(obj)
)

def _build_discover_query(self, open_period: GroupOpenPeriod) -> str:
    detector = self.get_open_periods_to_detectors([open_period])[open_period]
    try:
```
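A defensive version of the data-source access the bot flags could replace the `[0]` index with `next(iter(...), None)`. This is a sketch, not the PR's code: `build_discover_query` and the query string format are hypothetical, and a plain list stands in for the `data_sources.all()` queryset.

```python
def build_discover_query(data_sources) -> str:
    """Build a Discover query, or return "" when no data source exists."""
    # next(..., None) avoids IndexError on an empty sequence, unlike [0].
    source = next(iter(data_sources), None)
    if source is None:
        return ""  # fall back instead of crashing the serializer
    return f"source:{source}"


print(build_discover_query([]))                  # (empty string)
print(build_discover_query(["snuba_query_1"]))   # source:snuba_query_1
```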
Previously, this serializer took an open period, fetched the incident via the IGOP lookup table, and serialized the incident. Since the incident is no longer guaranteed to exist, this change creates a serializer that populates an incident response using only the open period model.
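The shape of the change described above can be sketched as follows. All names here are hypothetical stand-ins modeled on the fields quoted earlier in the thread (`type`, `title`, `dateStarted`); the real serializer lives in workflow_engine_incident.py.

```python
from datetime import datetime, timezone

# Hypothetical sketch: build an incident-shaped response from only the
# open period, with no incident row fetched from a lookup table.
def serialize_open_period_as_incident(open_period: dict) -> dict:
    return {
        "type": 2,  # stand-in for IncidentType.ALERT_TRIGGERED.value
        "title": open_period["group_title"],
        "dateStarted": open_period["date_started"].isoformat(),
    }


resp = serialize_open_period_as_incident(
    {
        "group_title": "Example error",
        "date_started": datetime(2024, 1, 1, tzinfo=timezone.utc),
    }
)
print(resp["title"])  # Example error
```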