Conversation

@roaga (Member) commented Nov 22, 2025

Adds the ability to configure intelligence in the explorer client. We don't expose specific LLM configs; instead, the dev picks between low, medium, and high. The reason is that the LLM landscape is constantly changing, and there's a lot we don't want the average dev in Sentry to think about; it will be up to the Seer platform to decide the best models for each level.

Also tacking on some comment cleanup.

Requires https://github.com/getsentry/seer/pull/4098. Closes AIML-1660
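From the description and the constructor snippet quoted in review, the new option might look roughly like this. This is a sketch, not the actual implementation: the `IntelligenceLevel` enum name and the exact constructor signature are assumptions; only the `intelligence_level` attribute and the low/medium/high tiers come from the PR itself.

```python
from enum import Enum


class IntelligenceLevel(str, Enum):
    # The three supported tiers. The Seer platform maps each tier to
    # concrete models, so callers never pin a specific LLM.
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


class ExplorerClient:
    # Hypothetical constructor mirroring the snippet quoted in review;
    # the real client in src/sentry/seer/explorer/client.py takes more
    # arguments than shown here.
    def __init__(self, user=None, artifact_schema=None, custom_tools=None,
                 intelligence_level=IntelligenceLevel.MEDIUM):
        self.user = user
        self.artifact_schema = artifact_schema
        self.custom_tools = custom_tools or []
        self.intelligence_level = intelligence_level


client = ExplorerClient(intelligence_level=IntelligenceLevel.HIGH)
print(client.intelligence_level.value)  # prints "high"
```

The default tier here (`MEDIUM`) is a guess; the point is that callers pass an abstract level rather than a model name, so the server-side mapping can evolve without client changes.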

@github-actions github-actions bot added the Scope: Backend Automatically applied to PRs that change backend components label Nov 22, 2025
@linear linear bot commented Nov 22, 2025

@codecov codecov bot commented Nov 22, 2025

Codecov Report

❌ Patch coverage is 90.90909% with 1 line in your changes missing coverage. Please review.
✅ All tests successful. No failed tests found.

Files with missing lines          | Patch % | Lines
src/sentry/seer/explorer/client.py | 90.90%  | 1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           master   #103873      +/-   ##
===========================================
+ Coverage   80.58%    80.59%   +0.01%     
===========================================
  Files        9292      9296       +4     
  Lines      396718    396887     +169     
  Branches    25286     25286              
===========================================
+ Hits       319681    319870     +189     
+ Misses      76577     76557      -20     
  Partials      460       460              

@roaga roaga marked this pull request as ready for review November 24, 2025 12:59
@roaga roaga requested a review from a team as a code owner November 24, 2025 12:59
self.user = user
self.artifact_schema = artifact_schema
self.custom_tools = custom_tools or []
self.intelligence_level = intelligence_level
Contributor commented:

So do we pick the model based on the intelligence_level? Like 2.5 Flash-Lite for low, or Sonnet 4.5 / Opus 4 for high?

Member Author (@roaga) replied:

yeah
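The exchange above suggests the tier-to-model mapping lives entirely on the Seer side. A minimal sketch of what such a server-side resolver could look like follows; the model names are taken from the reviewer's examples and are purely illustrative, and `resolve_model` is a hypothetical helper, not a function from the Seer codebase.

```python
# Illustrative tier -> model mapping. The real mapping is owned by the
# Seer platform and is expected to change as the LLM landscape evolves,
# which is exactly why clients only send a tier.
MODEL_BY_LEVEL = {
    "low": "gemini-2.5-flash-lite",
    "medium": "claude-sonnet-4-5",
    "high": "claude-opus-4",
}


def resolve_model(intelligence_level: str) -> str:
    # Fall back to the medium tier for unknown or missing values, so an
    # out-of-date client still gets a reasonable default.
    return MODEL_BY_LEVEL.get(intelligence_level, MODEL_BY_LEVEL["medium"])
```

Because the mapping is centralized, swapping a tier to a newer model is a server-side change with no client release required.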

@roaga roaga merged commit c6175a4 into master Nov 24, 2025
67 checks passed
@roaga roaga deleted the explorer/intel-level-client branch November 24, 2025 19:01