# feat(search_issues): Add flags field to search_issues dataset #7757
```diff
@@ -21,6 +21,7 @@
     extract_extra_contexts,
     extract_extra_tags,
     extract_http,
+    extract_nested,
     extract_user,
 )
 from snuba.datasets.processors import DatasetMessageProcessor
```
```diff
@@ -66,6 +67,7 @@ class IssueEventData(TypedDict, total=False):
     start_timestamp: float
     tags: Mapping[str, Any]
+    flags: Mapping[str, Any]
     user: Mapping[str, Any]  # user_hash, user_id, user_name, user_email, ip_address
     sdk: Mapping[str, Any]  # sdk_name, sdk_version
     contexts: Mapping[str, Any]
```
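Because `IssueEventData` is declared with `total=False`, every field, including the new `flags` mapping, is optional, so producers may omit it entirely. A minimal sketch of that behavior (using a reduced version of the TypedDict, not the full class from the PR):

```python
from typing import Any, Mapping, TypedDict


# Reduced sketch of the TypedDict: total=False makes every key optional.
class IssueEventData(TypedDict, total=False):
    tags: Mapping[str, Any]
    flags: Mapping[str, Any]


# A payload with no "flags" key still type-checks and parses.
event: IssueEventData = {"tags": {"environment": "prod"}}
print(event.get("flags"))  # None
```

This is why the processor below has to handle the `flags`-absent case explicitly rather than assuming the key exists.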
```diff
@@ -157,6 +159,18 @@ def _process_tags(
             promoted_tags.get("sentry:dist", event_data.get("dist")),
         )
+
+    def _process_flags(
+        self, event_data: IssueEventData, processed: MutableMapping[str, Any]
+    ) -> None:
+        existing_flags = event_data.get("flags", None)
+        flags: Mapping[str, Any] = _as_dict_safe(cast(Dict[str, Any], existing_flags))
+        if not existing_flags:
+            processed["flags.key"], processed["flags.value"] = [], []
+        else:
+            processed["flags.key"], processed["flags.value"] = extract_nested(
+                flags, lambda s: _unicodify(s) or None
+            )
```
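Assuming `extract_nested` flattens a mapping into parallel, key-ordered lists (the same shape the processor already produces for `tags.key`/`tags.value`), the new method's output can be sketched as follows. The `extract_nested` body here is a stand-in written for illustration, not the actual Snuba implementation:

```python
from typing import Any, Callable, Mapping


def extract_nested(
    nested: Mapping[str, Any], transform: Callable[[Any], Any]
) -> tuple[list[str], list[Any]]:
    # Stand-in (assumption): flatten a mapping into parallel,
    # key-sorted lists of keys and transformed values.
    keys: list[str] = []
    values: list[Any] = []
    for key in sorted(nested.keys()):
        keys.append(key)
        values.append(transform(nested[key]))
    return keys, values


processed: dict[str, Any] = {}
flags = {"feature.rollout": "true", "feature.variant": "blue"}
processed["flags.key"], processed["flags.value"] = extract_nested(
    flags, lambda s: str(s) or None
)
print(processed["flags.key"])   # ['feature.rollout', 'feature.variant']
print(processed["flags.value"]) # ['true', 'blue']
```

The empty-input branch in the PR guarantees that `flags.key` and `flags.value` are always present as (possibly empty) arrays, matching the column shape the storage expects.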
|
Comment on lines
+163
to
+173
Contributor
Author
There was a problem hiding this comment. Choose a reason for hiding this commentThe reason will be displayed to describe this comment to others. Learn more. Bug: The PR adds logic to process flags for search issues but is missing the required ClickHouse migration to add the Suggested FixAdd a new database migration for the Prompt for AI AgentDid we get this right? 👍 / 👎 to inform future reviews. |
||
```diff
     def _process_request_data(
         self, event_data: IssueEventData, processed: MutableMapping[str, Any]
     ) -> None:

@@ -291,6 +305,7 @@ def process_insert_v1(
         self._process_tags(
             event_data, fields
         )  # environment, release, dist, user, tags.key, tags.value
+        self._process_flags(event_data, fields)  # flags.key, flags.value
         self._process_user(
             event_data, fields
         )  # user_name, user_id, user_email, ip_address_v4/ip_address_v6
```
**Comment: Flags extracted from wrong location with wrong format** (High Severity)

> The `_process_flags` method reads flags from `event_data["flags"]` as a flat dict, but Sentry's event payload stores feature flags inside `data.contexts.flags.values` as a list of `{"flag": "name", "result": value}` objects. This is confirmed by the Rust errors processor and its tests in `test_errors_processor.py`. Since there is no `flags` key at the top level of the event data, `event_data.get("flags", None)` will always return `None`, and `flags.key`/`flags.value` will always be empty lists, making the flags feature silently non-functional for search issues.
>
> Additional Locations (1): `snuba/datasets/processors/search_issues_processor.py#L69-L70`
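The extraction the reviewer describes can be sketched as below. The helper name `process_flags_from_contexts` and the exact payload shape (`contexts.flags.values` as a list of `{"flag": ..., "result": ...}` objects, per the comment above) are assumptions for illustration, not code from the PR:

```python
from typing import Any, Mapping, MutableMapping


def process_flags_from_contexts(
    event_data: Mapping[str, Any], processed: MutableMapping[str, Any]
) -> None:
    # Hypothetical helper: read flags from contexts.flags.values
    # (assumed shape: [{"flag": name, "result": value}, ...]) and
    # emit the parallel flags.key / flags.value arrays.
    flag_entries = (
        event_data.get("contexts", {}).get("flags", {}).get("values", []) or []
    )
    keys: list[str] = []
    values: list[str] = []
    for entry in flag_entries:
        if "flag" in entry:
            keys.append(str(entry["flag"]))
            values.append(str(entry.get("result")))
    processed["flags.key"], processed["flags.value"] = keys, values


processed: dict[str, Any] = {}
event = {
    "contexts": {
        "flags": {"values": [{"flag": "feature.rollout", "result": True}]}
    }
}
process_flags_from_contexts(event, processed)
print(processed)  # {'flags.key': ['feature.rollout'], 'flags.value': ['True']}
```

Note the list-of-objects input differs from the flat `Mapping[str, Any]` that `extract_nested` expects, which is why, per the review, simply pointing the existing code at `contexts.flags` would not be sufficient on its own.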