chore(RHOAIENG-34909): Replace remote_provider_spec with RemoteProviderSpec #61
Conversation
Reviewer's Guide

Replaces the use of the remote_provider_spec helper with a direct RemoteProviderSpec instantiation and bumps the llama-stack dependency to version 0.2.22.

Class diagram for updated provider spec instantiation:

classDiagram
class RemoteProviderSpec {
+Api api
+str adapter_type
+list pip_packages
+str config_class
+str module
}
class ProviderSpec
RemoteProviderSpec --|> ProviderSpec
class Api {
<<enumeration>>
eval
}
RemoteProviderSpec : api = Api.eval
RemoteProviderSpec : adapter_type = "lmeval"
RemoteProviderSpec : pip_packages = ["kubernetes"]
RemoteProviderSpec : config_class = "llama_stack_provider_lmeval.config.LMEvalEvalProviderConfig"
RemoteProviderSpec : module = "llama_stack_provider_lmeval"
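For reference, a minimal sketch of what the direct instantiation could look like, assembled from the fields in the diagram above. The import path and the exact constructor signature of RemoteProviderSpec in llama-stack 0.2.22 are assumptions and may differ from the actual API:

```python
# Sketch only: field values are taken from the class diagram above; the import
# path and constructor signature are assumptions about the llama-stack API.
from llama_stack.providers.datatypes import Api, RemoteProviderSpec

provider_spec = RemoteProviderSpec(
    api=Api.eval,
    adapter_type="lmeval",
    pip_packages=["kubernetes"],
    config_class="llama_stack_provider_lmeval.config.LMEvalEvalProviderConfig",
    module="llama_stack_provider_lmeval",
)
```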
It seems CI is failing due to missing credentials in the repo secrets: https://github.com/trustyai-explainability/llama-stack-provider-lmeval/actions/runs/18135271893/job/51612245285?pr=61
pyproject.toml
Outdated
 requires-python = ">=3.12"
 dependencies = [
-    "llama-stack>=0.2.14",
+    "llama-stack>=0.2.22",
Why not 0.2.23?
@nathan-weinberg double-checking, but are we supposed to pin the LLS versions?
Yes, pinning is fine
cdoern left a comment:
awesome, thank you @christinaexyou!
I think you need provider_type as well, right?
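If provider_type is in fact required, the instantiation above would presumably gain one more field. A hedged sketch, assuming llama-stack's usual remote::<adapter_type> naming convention (the exact field name and value are not confirmed by this PR):

```python
# Hypothetical: adds provider_type following the "remote::<adapter_type>"
# convention; whether RemoteProviderSpec requires this field is the open
# question raised in the review comment above.
provider_spec = RemoteProviderSpec(
    api=Api.eval,
    adapter_type="lmeval",
    provider_type="remote::lmeval",
    pip_packages=["kubernetes"],
    config_class="llama_stack_provider_lmeval.config.LMEvalEvalProviderConfig",
    module="llama_stack_provider_lmeval",
)
```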
60b2fa8 to 834f744
834f744 to 153c1f0
Your CI is failing on pre-commit - you'll want to run it locally and commit those changes. I opened #63 around the other CI check, since that one seems to fail the majority of the time.
ruivieira left a comment:
@christinaexyou could you just fix the provider spec indentation?
Otherwise, LGTM.
Summary by Sourcery
Refactor provider definition to use RemoteProviderSpec constructor and update the llama-stack dependency version.
Enhancements:
- Replace the remote_provider_spec helper with a direct RemoteProviderSpec instantiation
Build:
- Bump the llama-stack dependency to >=0.2.22