
fix: Error codes needs to be documented in Swagger #2480

Draft
William-Hill wants to merge 1 commit into ConduitIO:main from William-Hill:agent/fix-576-aider-armD-run1-20260413T151711Z

Conversation

@William-Hill

Fixes #576

Agent Summary

/openapi.go to the chat.
Added pkg/pipeline/errors.go to the chat.
Added pkg/pipeline/service.go to the chat.
Added proto/api/v1/api.pb.gw.go to the chat.
Added proto/api/v1/api.proto to the chat.

Makefile

cmd/conduit/main.go

pkg/connector/errors.go

proto/api/v1/api.pb.go

proto/api/v1/api_grpc.pb.go

https://github.com/conduitio/conduit
Scraping https://github.com/conduitio/conduit...

http://localhost:8080/v1/connectors
Scraping http://localhost:8080/v1/connectors...
HTTP error occurred: [Errno 61] Connection refused
Failed to retrieve content from http://localhost:8080/v1/connectors

http://localhost:8080/openapi/#/`)
Scraping http://localhost:8080/openapi/#/`)...
HTTP error occurred: [Errno 61] Connection refused
Failed to retrieve content from http://localhost:8080/openapi/#/`)

https://github.com/ConduitIO/conduit/blob/main/LICENSE.md
Scraping https://github.com/ConduitIO/conduit/blob/main/LICENSE.md...

http://localhost:8080/v1/pipelines/<pipeline_id>/start
Scraping http://localhost:8080/v1/pipelines/<pipeline_id>/start...
HTTP error occurred: [Errno 61] Connection refused
Failed to retrieve content from http://localhost:8080/v1/pipelines/<pipeline_id>/start
Initial repo scan can be slow in larger repos, but only happens once.
Your estimated chat context of 215,195 tokens exceeds the 65,536 token limit for
openrouter/deepseek/deepseek-chat!
To reduce the chat context:

  • Use /drop to remove unneeded files from the chat
  • Use /clear to clear the chat history
  • Break your code into smaller files

It's probably safe to try and send the request; most providers won't charge if the context limit is exceeded.

litellm.BadRequestError: OpenrouterException - {"error":{"message":"This endpoint's maximum context length is 163840 tokens. However, you requested about 188362 tokens (188362 of text input). Please reduce the length of either one, or use the context-compression plugin to compress your prompt automatically.","code":400,"metadata":{"provider_name":null}},"user_id":"user_3CEeCGEy5UD03LA5fyj9Xe8qgTU"}

Generated by conduit-agent-experiment (archivist: Gemini Flash, implementer: openrouter/deepseek/deepseek-chat, 1 iteration).

Fixes ConduitIO#576

Generated by conduit-agent-experiment implementer.


Development

Successfully merging this pull request may close these issues: Error codes needs to be documented in Swagger (#576)