<?xml version="1.0" encoding="UTF-8"?><rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>taskq &amp;mdash; Little Step</title>
    <link>https://rex.writeas.com/tag:taskq</link>
    <description></description>
    <pubDate>Sat, 09 May 2026 23:20:10 +0000</pubDate>
    <item>
      <title>A very simple task queue rq</title>
      <link>https://rex.writeas.com/a-very-simple-task-queue-rq?pk_campaign=rss-feed</link>
      <description>&lt;![CDATA[RQ is a very simple Redis-backed task queue: the producer enqueues pickled jobs to Redis, and a generic rqworker process retrieves and runs them, so you don&#39;t need to write any worker-specific code.]]&gt;</description>
      <content:encoded><![CDATA[<p>Many computing tasks, such as media transcoding, are not well suited to synchronous execution.</p>

<p>For these tasks, the work is submitted to a central controller service and then dispatched to many workers.</p>

<p>A message queue is commonly used in this setup for communication between the controller and the workers, and is sometimes even used to implement the task queue itself.</p>

<p><a href="https://python-rq.org" rel="nofollow">RQ</a> is a very simple task queue built on Redis, and it is easy to use: you don&#39;t even need to write any worker-specific code.</p>

<p>On the producer side, RQ enqueues the function reference and its arguments to a Redis queue, serialized with Python&#39;s pickle library. The worker retrieves the job from the Redis queue, deserializes it, and forks a process to do the actual work.</p>
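
<p>As a rough illustration of the serialization step (a simplified sketch, not RQ&#39;s actual internal job format), the payload written to Redis is essentially a pickled description of the call:</p>

<pre><code class="language-python">import pickle

# Simplified sketch: RQ stores the function&#39;s dotted path plus pickled
# arguments in Redis; the exact job layout is an internal detail of RQ.
job = {&#39;func&#39;: &#39;fib.slow_fib&#39;, &#39;args&#39;: (30,), &#39;kwargs&#39;: {}}
payload = pickle.dumps(job)        # bytes that could be written to Redis
restored = pickle.loads(payload)   # what the worker would read back
print(restored[&#39;func&#39;])            # fib.slow_fib
</code></pre>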

<p>Here is a simple example that demonstrates how simple the logic is.</p>

<p>You have one function that does the real work, such as:</p>

<pre><code class="language-python"># fib.py
def slow_fib(n):
    if n &lt;= 1:
        return 1
    else:
        return slow_fib(n-1) + slow_fib(n-2)
</code></pre>
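
<p>A quick local sanity check (no Redis needed) confirms the values this function produces; the function is inlined here so the snippet stands alone:</p>

<pre><code class="language-python"># Same function as in fib.py, inlined for a standalone check
def slow_fib(n):
    if n &lt;= 1:
        return 1
    return slow_fib(n - 1) + slow_fib(n - 2)

print(slow_fib(20))   # 10946, the same value the worker computes for fib(20)
</code></pre>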

<p>Then you create the jobs and enqueue them:</p>

<pre><code class="language-python"># run_example.py
import os
import time

from rq import Connection, Queue

from fib import slow_fib


def main():
    # Range of Fibonacci numbers to compute
    fib_range = range(20, 34)

    # Kick off the tasks asynchronously
    async_results = {}
    q = Queue()
    for x in fib_range:
        async_results[x] = q.enqueue(slow_fib, x)

    start_time = time.time()
    done = False
    while not done:
        os.system(&#39;clear&#39;)
        print(&#39;Asynchronously: (now = %.2f)&#39; % (time.time() - start_time,))
        done = True
        for x in fib_range:
            # rq 1.11 exposes the job result as the `result` property
            # (newer rq versions also offer a `return_value()` method)
            result = async_results[x].result
            if result is None:
                done = False
                result = &#39;(calculating)&#39;
            print(&#39;fib(%d) = %s&#39; % (x, result))
        print(&#39;&#39;)
        print(&#39;To start the actual work in the background, run a worker:&#39;)
        print(&#39;    python examples/run_worker.py&#39;)
        time.sleep(0.2)

    print(&#39;Done&#39;)


if __name__ == &#39;__main__&#39;:
    # Tell RQ what Redis connection to use
    with Connection():
        main()
</code></pre>

<p>On the producer side, you run it like this:</p>

<pre><code class="language-shell">python3 run_example.py
Asynchronously: (now = 8.04)
fib(20) = 10946
fib(21) = 17711
fib(22) = 28657
fib(23) = 46368
fib(24) = 75025
fib(25) = 121393
fib(26) = 196418
fib(27) = 317811
fib(28) = 514229
fib(29) = 832040
fib(30) = 1346269
fib(31) = 2178309
fib(32) = 3524578
fib(33) = 5702887

To start the actual work in the background, run a worker:
    python examples/run_worker.py
Done

</code></pre>

<p>On the worker side, you only need this (but make sure the command is executed in the same directory as fib.py):</p>

<pre><code class="language-shell">rqworker
15:36:32 Worker rq:worker:bd9fbdd72217489288bcf6c47e499f9c: started, version 1.11.0
15:36:32 Subscribing to channel rq:pubsub:bd9fbdd72217489288bcf6c47e499f9c
15:36:32 *** Listening on default...
15:36:32 Cleaning registries for queue: default
15:36:32 default: fib.slow_fib(20) (0c5dc1dd-b8b8-4a23-9220-1f4f03781c53)
15:36:32 default: Job OK (0c5dc1dd-b8b8-4a23-9220-1f4f03781c53)
15:36:32 Result is kept for 500 seconds
</code></pre>

<p>There are two things to pay attention to: a) the actual function for the task needs to live in a separate file (fib.py in this case), and b) rqworker needs to be executed in the same source directory as the producer (run_example.py in this case), so the worker can import the function.</p>
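
<p>Both constraints come from the same mechanism: the worker imports the task function by its dotted path, using an ordinary Python import that searches from the worker&#39;s current directory. A minimal sketch of that lookup, using a stdlib function as a stand-in for <code>fib.slow_fib</code>:</p>

<pre><code class="language-python">from importlib import import_module

def resolve(dotted_path):
    # &#39;fib.slow_fib&#39; -&gt; import fib, then fetch its slow_fib attribute
    module_name, func_name = dotted_path.rsplit(&#39;.&#39;, 1)
    return getattr(import_module(module_name), func_name)

# Stand-in for &#39;fib.slow_fib&#39;; import_module only succeeds when the
# module is reachable from sys.path (hence the working-directory rule)
fn = resolve(&#39;os.path.basename&#39;)
print(fn(&#39;/tmp/fib.py&#39;))   # fib.py
</code></pre>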

<p><a href="https://rex.writeas.com/tag:rq" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">rq</span></a> <a href="https://rex.writeas.com/tag:redis" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">redis</span></a> <a href="https://rex.writeas.com/tag:taskq" class="hashtag" rel="nofollow"><span>#</span><span class="p-category">taskq</span></a></p>
]]></content:encoded>
      <guid>https://rex.writeas.com/a-very-simple-task-queue-rq</guid>
      <pubDate>Wed, 24 Aug 2022 07:43:14 +0000</pubDate>
    </item>
  </channel>
</rss>