feat(billing): switch asset usage endpoint to limit offset pagination DEV-987 #6234
base: main
Conversation
LGTM
```python
from kpi.utils.usage_calculator import ServiceUsageCalculator


class NlpUsageSerializer(serializers.Serializer):
```
This is the main difference from how I've seen the drf_spectacular code used elsewhere in the app. I thought it would be preferable to use the actual AssetUsageSerializer in views.py rather than creating an inline serializer; to me, it makes the code more readable. That said, there may be some mixing of concerns, since NlpUsageSerializer here is passed into @extend_schema_field rather than used directly as a field on AssetUsageSerializer. Hoping to get your input on this, @noliveleger.
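For context, here is a minimal sketch of the pattern being discussed. The field and method names (total_nlp_asr_seconds, nlp_usage_current_period, etc.) are illustrative assumptions, not necessarily the actual kpi fields:

```python
# Illustrative sketch only; field and method names are assumed, not the
# actual kpi implementation.
from drf_spectacular.utils import extend_schema_field
from rest_framework import serializers


class NlpUsageSerializer(serializers.Serializer):
    # Hypothetical NLP usage counters for the sake of the example
    total_nlp_asr_seconds = serializers.IntegerField()
    total_nlp_mt_characters = serializers.IntegerField()


class AssetUsageSerializer(serializers.Serializer):
    asset = serializers.URLField()
    nlp_usage_current_period = serializers.SerializerMethodField()

    @extend_schema_field(NlpUsageSerializer)
    def get_nlp_usage_current_period(self, asset) -> dict:
        # NlpUsageSerializer here only describes the response shape for the
        # generated API docs; the method itself still returns a plain dict.
        return asset.get('nlp_usage_current_period', {})
```

The alternative mentioned above would be to declare the response schema in views.py with an inline serializer instead of reusing a named serializer like this.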
🗒️ Checklist
- PR title follows `<type>(<scope>)<!>: <title> DEV-1234`
- Labeled `Front end` and/or `Back end` or `workflow`

📣 Summary
Removes the custom page-size pagination from the asset usage endpoint so it falls back to KPI's default limit-offset pagination, and adjusts the endpoint's drf_spectacular code to improve the API docs and generated frontend helpers.
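As a rough illustration of the pagination change (the view and paginator names below are assumptions, not the actual kpi code):

```python
# Sketch of the pagination switch described above; names are illustrative.
from rest_framework import viewsets
from rest_framework.pagination import LimitOffsetPagination


class AssetUsageViewSet(viewsets.ReadOnlyModelViewSet):
    # Before (roughly): a custom page-size paginator was set here, e.g.
    #   pagination_class = AssetUsagePagination
    # so clients paged with ?page=...&page_size=...
    #
    # After: the override is dropped and the project-wide default applies.
    # It is shown explicitly here only to keep the sketch self-contained;
    # in the PR the default would come from DEFAULT_PAGINATION_CLASS.
    pagination_class = LimitOffsetPagination
```

With limit-offset pagination, clients would page with query parameters like `?limit=10&offset=20` instead of a page number and page size.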
💭 Notes
I am marking this as a draft for the time being, but it is ready for review. We will merge it alongside the corresponding frontend changes (see DEV-1013).
👀 Preview steps
No testing is needed on this PR; testing will happen with the corresponding frontend PR.