{"id":11486,"date":"2024-07-23T16:37:41","date_gmt":"2024-07-23T15:37:41","guid":{"rendered":"https:\/\/www.blopig.com\/blog\/?p=11486"},"modified":"2024-07-23T16:37:43","modified_gmt":"2024-07-23T15:37:43","slug":"easy-python-job-queues-with-rq","status":"publish","type":"post","link":"https:\/\/www.blopig.com\/blog\/2024\/07\/easy-python-job-queues-with-rq\/","title":{"rendered":"Easy Python job queues with RQ"},"content":{"rendered":"\n<p>Job queueing is an important consideration for a web application, especially one that needs to play nice and share resources with other web applications. There are lots of options out there with varying levels of complexity and power, but for a simple pure Python job queue that <em>just works<\/em>, RQ is quick and easy to get up and running.<\/p>\n\n\n\n<p><a href=\"https:\/\/python-rq.org\/\" data-type=\"link\" data-id=\"https:\/\/python-rq.org\/\">RQ<\/a> is a Python job queueing package designed to work out of the box, using a <a href=\"https:\/\/redis.io\/\">Redis<\/a> database as a message broker (the bit that allows the app and workers to exchange information about jobs). To use it, you just need a redis-server installation and the rq module in your python environment. <\/p>\n\n\n\n<!--more-->\n\n\n\n<h2 class=\"wp-block-heading\">Installation<\/h2>\n\n\n\n<p>You can either build redis-server from source, or install it using your package manager. 
<\/p>\n\n\n\n<p>From source:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">wget https:\/\/download.redis.io\/redis-stable.tar.gz\ntar -xzvf redis-stable.tar.gz\ncd redis-stable\nmake<\/pre>\n\n\n\n<p>Or, to install binaries in \/usr\/local\/bin:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">sudo make install<\/pre>\n\n\n\n<p>Using apt on Debian or Ubuntu:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">curl -fsSL https:\/\/packages.redis.io\/gpg | sudo gpg --dearmor -o \/usr\/share\/keyrings\/redis-archive-keyring.gpg\n\necho \"deb [signed-by=\/usr\/share\/keyrings\/redis-archive-keyring.gpg] https:\/\/packages.redis.io\/deb $(lsb_release -cs) main\" | sudo tee \/etc\/apt\/sources.list.d\/redis.list\n\nsudo apt-get update\nsudo apt-get install redis<\/pre>\n\n\n\n<p>To start redis-server, simply run:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">redis-server<\/pre>\n\n\n\n<p>To install rq, just use pip:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">pip install rq<\/pre>\n\n\n\n<h2 
class=\"wp-block-heading\">Set up a job queue<\/h2>\n\n\n\n<p>To run a job, we just need a function to define the work to be done:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">def do_work():\n    return 'Productive work'<\/pre>\n\n\n\n<p>With redis-server running, we connect to the database and set up a queue. By default, redis-server runs on port 6379, but you can specify a different port either as a command-line option or in the redis config file. You can also have multiple redis instances running on different ports or hosts if you want to split work across multiple connections.<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from rq import Queue\nfrom redis import Redis\n\nfrom my_module import do_work\n\nconn = Redis('localhost', 6379)\nqueue = Queue(connection=conn)<\/pre>\n\n\n\n<p>We also need some workers listening on the queue. You can start these using RQ&#8217;s command-line interface, or by writing your own worker script. 
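<\/p>\n\n\n\n<p>For the command-line route, a minimal invocation (assuming redis-server on its default port, and the default queue name) looks something like:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"bash\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">rq worker --url redis:\/\/localhost:6379 default<\/pre>\n\n\n\n<p>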
The latter has the advantage of allowing you to pre-import modules required for the jobs the workers will do, which can save a lot of work if imports are a significant proportion of the execution time of your jobs:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from redis import Redis\nfrom rq import Worker, Queue\n\nfrom my_module import do_work\n\nconn = Redis('localhost', 6379)\nqueue = Queue(connection=conn)\n\nworker = Worker([queue], connection=conn)\nworker.work()<\/pre>\n\n\n\n<p>You can use your worker script to start as many workers as you need.<\/p>\n\n\n\n<p>With redis-server running and workers listening on our queue, we&#8217;re ready to add some tasks. Once enqueued, we can check the status of our job and handle results and errors as needed. Here&#8217;s a silly toy example:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from rq import Queue\nfrom redis import Redis\nfrom time import sleep\n\nfrom my_module import do_work\n\nconn = Redis('localhost', 6379)\nqueue = Queue(connection=conn)\n\njob = queue.enqueue(do_work)\n\nwhile True:\n    if job.is_finished:\n        print(job.return_value())\n        break\n    elif job.is_failed:\n        print('A very helpful error message')\n        break\n    else:\n        sleep(5)<\/pre>\n\n\n\n<p>A more practical way of tracking jobs is to use the RQ job registries to check the status of your jobs. 
For example:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from redis import Redis\nfrom rq import Queue\nfrom rq.job import Job\nfrom time import sleep\n\nconn = Redis('localhost', 6379)\nqueue = Queue(connection=conn)\nfinished = queue.finished_job_registry\n\nwhile True:\n    job_ids = finished.get_job_ids()\n\n    if not job_ids:\n        print('Nothing to do, checking again in 60 seconds')\n        sleep(60)\n        continue\n\n    jobs = Job.fetch_many(job_ids, connection=conn)\n    for job in jobs:\n        print(f'Job finished: {job.id} {job.func_name}')\n        print(job.return_value())\n        # Remove the job from the registry so we don't process it again\n        finished.remove(job)<\/pre>\n\n\n\n<p>If you need to keep track of specific jobs, you can assign a custom job id on creation and fetch your job at a later date:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">job = queue.enqueue(do_work, job_id='my-very-important-job')\nsleep(60)\njob = Job.fetch('my-very-important-job', connection=conn)<\/pre>\n\n\n\n<p>For what is intended to be a simple, low barrier to entry library, there&#8217;s a lot you can do with rq. <em>That said<\/em>, the <a href=\"https:\/\/python-rq.org\/docs\/\">documentation<\/a> is&#8230; <a href=\"https:\/\/github.com\/rq\/rq\/issues\/903\">lacking<\/a>. If there isn&#8217;t an example or pattern in the docs that does what you need, you&#8217;ll probably end up scouring the <a href=\"https:\/\/github.com\/rq\/rq\">source<\/a> to figure out what options are available to you or what an object looks like. 
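<\/p>\n\n\n\n<p>One such easy-to-miss feature: failed jobs land in a queue&#8217;s failed_job_registry, and a sketch of sending everything that failed back for another attempt looks something like this:<\/p>\n\n\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"python\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">from redis import Redis\nfrom rq import Queue\n\nconn = Redis('localhost', 6379)\nqueue = Queue(connection=conn)\n\n# Requeue every failed job for another attempt\nfailed = queue.failed_job_registry\nfor job_id in failed.get_job_ids():\n    failed.requeue(job_id)<\/pre>\n\n\n\n<p>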
If you just need a basic job queue for a python application that works out of the box with minimal dependencies, RQ might just be the tool for you. If, however, you find yourself wanting to do more, or you want to use a different message broker, you may find <a href=\"https:\/\/docs.celeryq.dev\/en\/stable\/index.html\">Celery <\/a>more useful. I&#8217;ll probably write about it in the future, once I find something I can&#8217;t do with RQ.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Job queueing is an important consideration for a web application, especially one that needs to play nice and share resources with other web applications. There are lots of options out there with varying levels of complexity and power, but for a simple pure Python job queue that just works, RQ is quick and easy to [&hellip;]<\/p>\n","protected":false},"author":47,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","wikipediapreview_detectlinks":true,"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"ngg_post_thumbnail":0,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[14,227],"tags":[152,781,782],"ppma_author":[498],"class_list":["post-11486","post","type-post","status-publish","format-standard","hentry","category-howto","category-python-code","tag-python","tag-redis","tag-rq"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"authors":[{"term_id":498,"user_id":47,"is_guest":0,"slug":"fergus","display_name":"Fergus 
Boyles","avatar_url":"https:\/\/secure.gravatar.com\/avatar\/ba8c419ba77128aad589b66ba7ee13da74f4ce2d3108fd724ddcefa200b51c7b?s=96&d=mm&r=g","0":null,"1":"","2":"","3":"","4":"","5":"","6":"","7":"","8":""}],"_links":{"self":[{"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/posts\/11486","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/users\/47"}],"replies":[{"embeddable":true,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/comments?post=11486"}],"version-history":[{"count":4,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/posts\/11486\/revisions"}],"predecessor-version":[{"id":11490,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/posts\/11486\/revisions\/11490"}],"wp:attachment":[{"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/media?parent=11486"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/categories?post=11486"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/tags?post=11486"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.blopig.com\/blog\/wp-json\/wp\/v2\/ppma_author?post=11486"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}