I'm trying to write a web service that takes requests, puts them into a queue, and then processes them in batches of 2. The response can be sent straight away, and I'm trying to use Lamina as follows (though not sure it's the right choice)...

(def ch (channel))

(def loop-forever
  (comp doall repeatedly))

(defn consumer []
  (loop-forever
    (fn []
      (process-batch
        @(read-channel ch)
        @(read-channel ch)))))

(defn handler [req]
  (enqueue ch req)
  {:status 200
   :body "ok"})

But this doesn't work... :( I've been through all the Lamina docs but can't get my head around how to use these channels. Can anyone confirm if Lamina supports this kind of behaviour and advise on a possible solution?

Solution

The point of lamina is that you don't want to loop forever: you want lamina's scheduler to use a thread from its pool to do work for you whenever you have enough data to do work on. So instead of using the very, very low-level read-channel function, use receive to register a callback once, or (more often) receive-all to register a callback for every time a channel receives data. For example:

(require '[lamina.core :as lamina])

(def ch (lamina/channel))

;; partition* groups every two messages from ch into a pair, and
;; receive-all invokes the callback once per pair.
(lamina/receive-all (lamina/partition* 2 ch)
                    (partial apply process-batch))

(defn handler [req]
  (lamina/enqueue ch req)
  {:status 200
   :body "ok"})
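For what it's worth, here's a rough, self-contained REPL sketch of how the pieces fit together; process-batch here is just a hypothetical stub that prints the pair it receives, standing in for your real batch work:

(require '[lamina.core :as lamina])

;; Hypothetical stand-in for the real batch processor.
(defn process-batch [a b]
  (println "processing batch:" a b))

(def ch (lamina/channel))

;; Each pair of messages enqueued onto ch is handed to process-batch.
(lamina/receive-all (lamina/partition* 2 ch)
                    (partial apply process-batch))

(lamina/enqueue ch {:id 1}) ; only one message queued, callback doesn't fire yet
(lamina/enqueue ch {:id 2}) ; second message completes the pair; the batch prints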