Searching for job ID fails on Heroku due to 30s request limit #1618
Because there are no indices for full-text search, searching for a job ID takes an inordinate amount of time. This commit examines the search query and if the only item is a UUID, it specifically searches for a job matching that ID. Fixes bensheldon#1618
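The fast path described in that commit can be sketched roughly like this (a minimal illustration, not GoodJob's actual code; the method name, constant, and scope names are assumptions):

```ruby
# Minimal sketch of a UUID fast path for dashboard search.
# If the entire query is a single UUID, look the job up by primary key
# instead of running to_tsvector() over every column of every row.
UUID_PATTERN = /\A[0-9a-f]{8}-(?:[0-9a-f]{4}-){3}[0-9a-f]{12}\z/i

def uuid_query?(query)
  query.to_s.strip.match?(UUID_PATTERN)
end

# In the search action one might then branch (hypothetical names):
#   jobs = if uuid_query?(params[:query])
#            GoodJob::Job.where(id: params[:query].strip)
#          else
#            GoodJob::Job.search(params[:query])
#          end
```

A primary-key lookup uses the existing `id` index, so it returns in milliseconds regardless of table size.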
On our production cluster, we preserve job records for 4 days only. That covers Saturday, Sunday, and Monday when we come back on a Tuesday after a 3-day weekend. Search is a really interesting feature, but unfortunately it isn't quite usable at the moment.
Ick,
I was curious to see how large the index would be. Original database size:
After adding the index, but without labels:
We don't use labels yet.
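For reference, the sizes above can be checked directly in Postgres, and the kind of index being weighed here would be a GIN expression index (the column list below is purely illustrative, not GoodJob's actual index definition):

```sql
-- Measure the table alone and the table plus its indexes,
-- before and after adding the index.
SELECT pg_size_pretty(pg_relation_size('good_jobs'))       AS table_size,
       pg_size_pretty(pg_total_relation_size('good_jobs')) AS with_indexes;

-- Illustrative GIN expression index over a couple of searched columns.
CREATE INDEX CONCURRENTLY index_good_jobs_on_search
  ON good_jobs
  USING gin (to_tsvector('english', coalesce(job_class, '') || ' ' || id::text));
```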
We are on 4.9.3 and have run `good_job:update` and `db:migrate`.

On the dashboard, searching for a job ID fails on Heroku, because the search query over our 120k-row `good_jobs` table takes far longer than Heroku's 30-second request limit. Of course, it is possible to open any job, then hack the URL to replace it with the job ID that we have on hand. This works, and takes less than 30s, but is error-prone.

The actual query that runs is:

In our database, this means converting 125,000 rows, calling `to_tsvector()` on 9 columns, and converting JSONB, UUID and TEXT columns back to TEXT. That's almost a million `to_tsvector()` calls!

There are two ways forward:
I'm worried that adding an index may negatively impact performance. We can't forget that GoodJob's database is hosted within another one, and if GoodJob takes more CPU/RAM/disk than the customer's database, then GoodJob would be considered a bad citizen.