
[BUG] Static rate limits reset their own limits to 0 #1834

@jfaust

Describe the issue
I'm unable to get static rate limits to work at all. They seem to reset their limit (not their value) to 0 as soon as they're used.

Environment

  • SDK: Python 1.10.2
  • Engine: Cloud

Expected behavior
I don't expect the limit to change without a call to hatchet.rate_limits.put().

Code to Reproduce, Logs, or Screenshots

from hatchet_sdk import Context, Hatchet
from hatchet_sdk.rate_limit import RateLimit, RateLimitDuration
from pydantic import BaseModel

hatchet = Hatchet(debug=True)

class Input(BaseModel):
    foo: int = 100

workflow = hatchet.workflow(name="test", input_validator=Input)

@workflow.task(rate_limits=[
    RateLimit(
        static_key="rate_limit",
        units="input.foo"  # CEL expression resolving to the units consumed per run
    )
])
async def task(input: Input, ctx: Context):
    print(input)
    return {}

if __name__ == "__main__":
    # Upsert the static limit: 1000 units per minute.
    hatchet.rate_limits.put("rate_limit", 1000, RateLimitDuration.MINUTE)
    worker = hatchet.worker(name="test", workflows=[workflow])
    workflow.run_no_wait(Input())
    worker.start()

When I run this, the task gets queued:
[screenshot: task in the Queued state]

And if I look at the Rate Limits section, I see:
[screenshot: Rate Limits section showing the limit at 0]

If I cancel the task and just run:

hatchet.rate_limits.put("rate_limit", 1000, RateLimitDuration.MINUTE)

The rate limit goes back up:
[screenshot: Rate Limits section showing the limit back at 1000]

But as soon as I run the worker again, it drops back to 0.
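
For comparison, here's the control variant I'd use to isolate the problem: the same task, but consuming a fixed integer number of units instead of evaluating the "input.foo" CEL expression (task_fixed_units is just an illustrative name, and I'm assuming a plain int is accepted for units with a static key). If the limit stays at 1000 with this version, the expression evaluation is presumably what's zeroing it:

# Control variant (untested sketch): fixed integer units, no CEL expression.
@workflow.task(rate_limits=[
    RateLimit(
        static_key="rate_limit",
        units=1,  # consume a fixed 1 unit per run
    )
])
async def task_fixed_units(input: Input, ctx: Context):
    print(input)
    return {}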

What am I doing wrong here?
