Conversation

@christinaexyou (Collaborator) commented Sep 30, 2025

Summary by Sourcery

Refactor provider definition to use RemoteProviderSpec constructor and update the llama-stack dependency version.

Enhancements:

  • Replace remote_provider_spec function with RemoteProviderSpec class for provider configuration

Build:

  • Bump llama-stack dependency from >=0.2.14 to >=0.2.22

sourcery-ai bot (Contributor) commented Sep 30, 2025

Reviewer's Guide

Replaces the use of the remote_provider_spec helper with a direct RemoteProviderSpec instantiation and bumps the llama-stack dependency to version 0.2.22.

Class diagram for updated provider spec instantiation

classDiagram
    class RemoteProviderSpec {
        +Api api
        +str adapter_type
        +list pip_packages
        +str config_class
        +str module
    }
    class ProviderSpec
    RemoteProviderSpec --|> ProviderSpec
    class Api {
        <<enumeration>>
        eval
    }
    RemoteProviderSpec : api = Api.eval
    RemoteProviderSpec : adapter_type = "lmeval"
    RemoteProviderSpec : pip_packages = ["kubernetes"]
    RemoteProviderSpec : config_class = "llama_stack_provider_lmeval.config.LMEvalEvalProviderConfig"
    RemoteProviderSpec : module = "llama_stack_provider_lmeval"

File-Level Changes

  • Switch from the remote_provider_spec function to the RemoteProviderSpec constructor (src/llama_stack_provider_lmeval/provider.py)
      • Replaced the import of remote_provider_spec with RemoteProviderSpec
      • Changed get_provider_spec() to return a RemoteProviderSpec instance
      • Removed the AdapterSpec wrapper and inlined the adapter parameters
  • Update the llama-stack dependency version (pyproject.toml)
      • Bumped the llama-stack requirement from >=0.2.14 to >=0.2.22
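The changes above can be sketched as follows. This is a minimal standalone illustration, not the PR's actual code: the real RemoteProviderSpec class and Api enum come from llama-stack, and the stand-in dataclass below only mirrors the fields listed in the reviewer's guide diagram (api is shown as a plain string here rather than the Api.eval enum member).

```python
from dataclasses import dataclass, field

# Stand-in for llama-stack's RemoteProviderSpec, mirroring only the
# fields used in this PR; the real class is imported from llama-stack.
@dataclass
class RemoteProviderSpec:
    api: str
    adapter_type: str
    config_class: str
    module: str
    pip_packages: list = field(default_factory=list)


def get_provider_spec() -> RemoteProviderSpec:
    # Direct instantiation replaces the old remote_provider_spec()
    # helper and the AdapterSpec wrapper it required.
    return RemoteProviderSpec(
        api="eval",
        adapter_type="lmeval",
        pip_packages=["kubernetes"],
        config_class="llama_stack_provider_lmeval.config.LMEvalEvalProviderConfig",
        module="llama_stack_provider_lmeval",
    )
```

The net effect is that the adapter parameters that previously lived on a nested AdapterSpec are now passed directly to the spec constructor.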


sourcery-ai bot left a comment:

Hey there - I've reviewed your changes and they look great!



@nathan-weinberg (Contributor) commented:

Seems CI is failing due to missing credentials in the repo secrets https://github.com/trustyai-explainability/llama-stack-provider-lmeval/actions/runs/18135271893/job/51612245285?pr=61

pyproject.toml (outdated diff)

  requires-python = ">=3.12"
  dependencies = [
-     "llama-stack>=0.2.14",
+     "llama-stack>=0.2.22",
A Contributor commented:
Why not 0.2.23?

@christinaexyou (Collaborator, Author) commented:
@nathan-weinberg double checking, but are we supposed to pin the LLS versions?

A Contributor commented:
Yes, pinning is fine

@cdoern left a comment:

awesome, thank you @christinaexyou !

I think you need provider_type as well, right?

@christinaexyou force-pushed the RHOAIENG-34909-remote-provider-spec branch from 60b2fa8 to 834f744 on October 2, 2025
@ruivieira added the "enhancement" (New feature or request) label on Oct 6, 2025
@ruivieira moved this to In Review in TrustyAI planning on Oct 6, 2025
@ruivieira linked an issue on Oct 6, 2025 that may be closed by this pull request
@christinaexyou force-pushed the RHOAIENG-34909-remote-provider-spec branch from 834f744 to 153c1f0 on October 6, 2025
@nathan-weinberg (Contributor) commented:

Your CI is failing on pre-commit — you'll want to run it locally and commit the changes from ruff.

I opened #63 around the other CI check since that seems to fail the majority of the time

@ruivieira (Member) left a comment:

@christinaexyou could you just fix the provider spec indentation?

Otherwise, LGTM.

@ruivieira merged commit d37ad96 into trustyai-explainability:main on Oct 7, 2025 (7 of 8 checks passed)

Labels

enhancement New feature or request

Projects

Status: In Review

Development

Successfully merging this pull request may close these issues.

Prepare LMEval provider for LLS core 0.2.23

4 participants