A link shortening service.
Key features:
- Redirects are fast (Redis cache, async processing)
- Click statistics (total clicks, last click time)
- Top 5 most popular links
- Codes are pre-generated, so creating a link takes microseconds
```
make start
```

The service will start at http://localhost:9898. API documentation will be available at /docs (standard FastAPI Swagger).
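A quick smoke test from Python (the endpoint paths below are illustrative assumptions; check /docs for the actual routes):

```python
import requests  # pip install requests

BASE = "http://localhost:9898"

# POST /links is a hypothetical route name; consult /docs for the real one.
created = requests.post(f"{BASE}/links", json={"url": "https://example.com"})
code = created.json()["code"]  # an 8-character code, per the feature list

# Following the short link should answer with a redirect to the original URL.
resp = requests.get(f"{BASE}/{code}", allow_redirects=False)
print(resp.status_code, resp.headers.get("Location"))
```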
- FastAPI — main framework
- PostgreSQL — stores links and statistics
- Redis — cache for fast redirects + pool of pre-generated codes
- RabbitMQ — queue for click processing (so redirects aren’t blocked)
- TaskIQ — background tasks (refilling code pool, batch processing clicks)
All details are in /docs after launch.
- You create a link → get an 8-character code
- Someone clicks → instant redirect (from cache), click is logged to queue
- Background task collects clicks in batches and updates statistics
- Codes are pre-generated in a pool, so you don’t wait during creation
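A minimal sketch of that redirect path, assuming a Redis cache keyed as `link:<code>` and a TaskIQ task on RabbitMQ for click logging (module and route names here are illustrative, not the project's actual code):

```python
from fastapi import FastAPI, HTTPException
from fastapi.responses import RedirectResponse
import redis.asyncio as redis
from taskiq_aio_pika import AioPikaBroker

app = FastAPI()
cache = redis.Redis()  # assumes Redis on localhost:6379
broker = AioPikaBroker("amqp://guest:guest@localhost/")  # assumes local RabbitMQ

@broker.task
async def log_click(code: str) -> None:
    # Stand-in for the real handler; a background worker batches
    # clicks and updates statistics (see the details below).
    ...

@app.get("/{code}")
async def follow(code: str) -> RedirectResponse:
    # Hot path: the target URL is served straight from the Redis cache.
    url = await cache.get(f"link:{code}")
    if url is None:
        raise HTTPException(status_code=404, detail="unknown code")
    # The click event goes to the queue, so the redirect is never blocked.
    await log_click.kiq(code)
    return RedirectResponse(url.decode())
```

On a cache miss the real service presumably falls back to PostgreSQL; that branch is omitted here.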
Make commands:

- `make up` — start
- `make down` — stop
- `make logs` — view logs
- `make migrate` — run migrations
- `make shell` — open IPython shell
- Code pool refills automatically (when less than 10% remains)
- Clicks are processed in batches every minute
- Link cache lives for 24 hours
- If the code pool is empty — codes are generated synchronously (but this is rare)
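A sketch of the pool-refill rule under these numbers (the key name, pool size, and scheduler wiring are assumptions; TaskIQ's label-based scheduler is one way to run a task every minute):

```python
import secrets

import redis.asyncio as redis
from taskiq import TaskiqScheduler
from taskiq.schedule_sources import LabelScheduleSource
from taskiq_aio_pika import AioPikaBroker

broker = AioPikaBroker("amqp://guest:guest@localhost/")
scheduler = TaskiqScheduler(broker, sources=[LabelScheduleSource(broker)])
cache = redis.Redis()

POOL_KEY = "codes:pool"             # illustrative key name
POOL_SIZE = 10_000                  # illustrative pool size
REFILL_THRESHOLD = POOL_SIZE // 10  # refill once less than 10% remains

@broker.task(schedule=[{"cron": "* * * * *"}])  # checked every minute
async def refill_pool() -> None:
    remaining = await cache.scard(POOL_KEY)
    if remaining >= REFILL_THRESHOLD:
        return
    # 6 random bytes encode to exactly 8 url-safe characters.
    fresh = {secrets.token_urlsafe(6) for _ in range(POOL_SIZE - remaining)}
    await cache.sadd(POOL_KEY, *fresh)
```

For the 24-hour link cache, storing each entry with `ex=86400` (seconds) on `cache.set` would match the lifetime described above.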
This project uses DockerSlim to minimize image size.
Image sizes:
- Original image: ~667 MB
- Minified image: ~352 MB (1.9x smaller)
Automatic minification:
- Auto on start (recommended):

  ```
  make start-slim  # checks if minification is needed and runs it automatically
  ```

- Using an environment variable:

  ```
  SLIM_BUILD=true make build  # automatically minifies during build
  ```

- Manually:

  ```
  make slim-build  # minify the image
  make slim-start  # minify and run with the slim image
  make auto-slim   # smart minification (only if needed)
  ```
Note: DockerSlim works only locally. In CI/CD, dev images are not minified.
How it works:
- `auto-slim` checks the hash of `Dockerfile` and `pyproject.toml` — it only minifies if they changed
- `start-slim` automatically minifies and starts with the slim image
- Volumes (`./src:/opt/app`) still work for hot-reload
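The change check behind `auto-slim` could look roughly like this (a Python sketch of the idea only; the actual Makefile logic may differ, and the stamp-file name is made up):

```python
import hashlib
from pathlib import Path

STAMP = Path(".slim-stamp")  # hypothetical file recording the last build's hash

def inputs_hash() -> str:
    # Hash the two files whose changes invalidate the slim image.
    digest = hashlib.sha256()
    for name in ("Dockerfile", "pyproject.toml"):
        digest.update(Path(name).read_bytes())
    return digest.hexdigest()

def needs_slim() -> bool:
    # Minify only when the combined hash differs from the recorded one.
    return not STAMP.exists() or STAMP.read_text() != inputs_hash()
```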
- Delete expired URLs (cleanup task for expired links)
- Store clicks in ClickHouse with metadata (IP, user-agent, timestamp, etc.)
- Investigate RPS limits and optimize performance
- Optimize Docker image size (DockerSlim)
- QR code generation for links
- CI/CD pipeline setup (GitHub Actions)
- Add tests (unit + integration)
- User authentication, rate limits per user, personal links