(WIP) Draft of manual mode (static) and decorators (dynamic) #3
ivasio wants to merge 3 commits into QratorLabs:master
Conversation
```python
def get_requester_id(requester_id_getter, args, kwargs):
    if type(requester_id_getter) == 'str':
```
Reviewer: use `isinstance(requester_id_getter, str)` here; the current check compares the type object against the string `'str'`, which is never true.
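A minimal sketch of that dispatch with `isinstance`; treating a string getter as the name of a keyword argument and anything else as a callable is an assumption for illustration, not necessarily the PR's actual semantics:

```python
def get_requester_id(requester_id_getter, args, kwargs):
    # Assumption: a str names a keyword argument of the wrapped call,
    # anything else is assumed to be a callable over the same arguments.
    if isinstance(requester_id_getter, str):
        return kwargs[requester_id_getter]
    return requester_id_getter(*args, **kwargs)
```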
```python
def setup(zones):
    if limiter is None:
        limiter = RequestLimiter(zones)
```
ivasio: Singleton limiter object for the entire app. Yes, it can be initialized explicitly, but this function basically helps to avoid double initialization.

Reviewer: Well, this wouldn't work, because `limiter` is a local variable here. You can force it to work with the `global` keyword, but why do you need that singleton?
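A minimal sketch of the `global` variant the reviewer mentions, assuming `limiter` is meant to live at module level next to `RequestLimiter`:

```python
limiter = None  # module-level singleton


def setup(zones):
    """Initialize the shared limiter once; repeated calls are no-ops."""
    global limiter  # without this, the assignment below only binds a local name
    if limiter is None:
        limiter = RequestLimiter(zones)
    return limiter
```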
| "argument") | ||
| except TypeError: | ||
| raise TypeError(f"Value passed in {requester_id_getter} " | ||
| "must be hashable") |
Reviewer: `isinstance(requester_id, typing.Hashable)` instead of catching the `TypeError`.
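A small sketch of that check; `typing.Hashable` is an alias of `collections.abc.Hashable`, so either spelling works with `isinstance` (the helper name below is made up for illustration):

```python
from collections.abc import Hashable


def validate_requester_id(requester_id):
    # Note: this only verifies that __hash__ is defined; an object whose
    # __hash__ raises at call time would still slip through.
    if not isinstance(requester_id, Hashable):
        raise TypeError(f"Value {requester_id!r} must be hashable")
    return requester_id
```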
```python
def limit_calls(requester_id_getter, zone, burst=0, delay=float('inf')):
    def decorator(function):
```
```python
class Zone:

    def __init__(self, size, rate, loop=asyncio.get_event_loop()):
```
Reviewer: Avoid keeping a loop if you can. And don't create it this way.
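The default argument here is evaluated once, when `__init__` is defined (effectively at import time), which is part of what the reviewer is warning about. A hedged sketch that stores no loop and resolves it lazily instead (the `wait` helper is illustrative, not part of the PR):

```python
import asyncio


class Zone:
    def __init__(self, size, rate):
        # No stored loop: asyncio resolves the running loop when needed.
        self.size = size
        self.rate = rate

    async def wait(self, delay):
        # get_running_loop() only works inside a running event loop, so the
        # loop is looked up at await time rather than at import time.
        loop = asyncio.get_running_loop()
        started = loop.time()
        await asyncio.sleep(delay)
        return loop.time() - started
```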
```python
            delay = limiter.get_request_delay(requester_id, resource)
            if delay > 0:
                asyncio.sleep(delay)
            return function(*args, **kwargs)
```
Reviewer: so the original function should be awaitable (and probably we need to check it with `typing.Awaitable`).

Reviewer: and we need to await here, of course.

ivasio: You mean awaiting `asyncio.sleep`, don't you?

Reviewer: we need to await both the sleep and the function call.

ivasio: Unfortunately, the `typing.Awaitable` check works on coroutine objects, which was actually expected. We need to check whether it is a coroutine function before invoking it, so I decided to check it with `asyncio.iscoroutinefunction`. It doesn't work with `Task`s, for instance, so if that is needed, we may consider searching for a workaround.
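Pulling the thread together, a hedged sketch of the decorator: the wrapped function is checked with `asyncio.iscoroutinefunction`, and both the sleep and the call are awaited. `limiter`, `get_request_delay`, and `get_requester_id` follow the names in the diff; using `zone` as the resource key and skipping `burst`/`delay` handling are assumptions of this sketch:

```python
import asyncio
import functools


def limit_calls(requester_id_getter, zone, burst=0, delay=float('inf')):
    def decorator(function):
        # Reject plain callables up front (Tasks, partials etc. would need
        # the separate workaround discussed above).
        if not asyncio.iscoroutinefunction(function):
            raise TypeError(f"{function!r} must be a coroutine function")

        @functools.wraps(function)
        async def wrapper(*args, **kwargs):
            # burst/delay handling is omitted in this sketch.
            requester_id = get_requester_id(requester_id_getter, args, kwargs)
            # Assumption: the zone name doubles as the resource key here.
            wait = limiter.get_request_delay(requester_id, zone)
            if wait > 0:
                await asyncio.sleep(wait)            # await the sleep ...
            return await function(*args, **kwargs)   # ... and the wrapped call

        return wrapper
    return decorator
```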
aioleakybucket/test_static.py (outdated)
Reviewer: please add a pylint call as a commit hook (or just don't forget to run it, or integrate it with your IDE).
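One hedged way to do that is a `.git/hooks/pre-commit` script, written here in Python for consistency with the other examples; the linted path is an assumption:

```python
#!/usr/bin/env python3
"""Illustrative pre-commit hook: block the commit when pylint reports issues."""
import subprocess
import sys

# Assumption: the package to lint is the aioleakybucket/ directory.
sys.exit(subprocess.run(["pylint", "aioleakybucket"]).returncode)
```

The hook file has to be executable (`chmod +x .git/hooks/pre-commit`).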
aioleakybucket/test_static.py (outdated)
```python
    request_times = [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3]
    try_requests(
        limiter, [(time, 0, '/login') for time in request_times]
    )
```
(no newline at end of file)
Reviewer: please keep newlines at the end of files.
aioleakybucket/test_static.py (outdated)
```python
def try_requests(limiter, requests):
    for i, request in enumerate(requests, 1):
```
Reviewer: for the static version we need to operate on a list of tuples (or a pandas.DataFrame) and get a list or dataframe with the results back.
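A hedged sketch of that shape: take `(time, requester_id, resource)` tuples in, return a list of result tuples. The exact `get_request_delay` signature in manual mode is an assumption:

```python
def try_requests(limiter, requests):
    """requests: iterable of (time, requester_id, resource) tuples."""
    results = []
    for time, requester_id, resource in requests:
        # Assumption: manual mode takes the request timestamp explicitly.
        delay = limiter.get_request_delay(requester_id, resource, time)
        results.append((time, requester_id, resource, delay))
    return results
```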
aioleakybucket/test_static.py (outdated)
```python
    limiter = static.RequestLimiter(zones, resources)
    request_times = [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3]
    try_requests(
        limiter, [(time, 0, '/login') for time in request_times]
    )
```
Reviewer: it is crucial to have different requester_id's and different targets.

ivasio: Of course it will be covered with tests, but here I actually want to illustrate the correctness of the algorithm itself; the working tool is just this algorithm replicated across many resources and clients. Or did you mean anything else by "crucial" here?
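A small sketch of test input that exercises several clients and targets at once, building on the snippet above (the extra ids and resource names are made up):

```python
request_times = [1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3]
requests = [
    (time, requester_id, resource)
    for time in request_times
    for requester_id in (0, 1, 2)             # several distinct clients
    for resource in ('/login', '/search')     # several distinct targets
]
try_requests(limiter, requests)
```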
Manual mode can be tested by running test_static.py.