We're looking to speed up our web application's processing time by setting up Redis Queue (RQ) to handle a couple of time-consuming tasks asynchronously.
Most importantly, we need the RQ worker to pre-load certain large .bin files and keep them in memory at all times, for example in a global variable. Loading these large files is what consumes the most time, so we would like to eliminate that cost from each request.
Here is a summary of our infrastructure:
- dedicated server on 1&1 IONOS, running Debian 9 (4 CPU / 32 GB RAM / 1 TB storage)
- server is set up as a single-node Docker Swarm, managed with Portainer
- app is in Flask
- backend/API code is in Python 2.7
- frontend code is in HTML/CSS/JS (vanilla/jQuery)
- GitLab and GitLab Runner for CI/CD
- external MongoDB database
- Redis server installed
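Given the single-node Swarm setup, the RQ worker could run as its own service next to the Flask app, so it can hold the .bin files in memory independently of the web containers. A sketch only: the service names, image, registry, and host paths below are all assumptions.

```yaml
# Compose/stack sketch (assumed names and paths) for running the RQ
# worker as a separate Swarm service.
version: "3.3"
services:
  worker:
    image: registry.example.com/app-worker:latest   # assumed image
    command: rq worker --url redis://redis:6379 default
    volumes:
      - /srv/models:/data/models:ro                 # assumed .bin location
    environment:
      - PRELOAD_DIR=/data/models                    # assumed env var read by the worker
    deploy:
      replicas: 1   # each replica keeps its own in-memory copy of the files
  redis:
    image: redis:3.2
```

With 32 GB of RAM on the host, keeping one replica per set of pre-loaded files keeps memory use predictable; scaling replicas multiplies the in-memory copies.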
Here is a potentially helpful link: [login to view URL]
Ready to start working on this! I'm an expert in this kind of issue. My LinkedIn reference: [login to view URL] Could you please describe the flow of the time-consuming task?