#clara
2023-09-08
Joel 17:09:01

Using accumulators, can I use (take … to limit the number of matches? I want to process a given number of matches, multithread on those, and then not have to process those matches again later. So say there are 15 matches: I want to (take 10), and only get the remaining 5 in another firing of the rule. If I need to let the rules quiesce and do this “outside of the session”, that could work as well.

ethanc 02:09:46

A take-x accumulator is quite doable with Clara today, looking something like:

(require '[clara.rules.accumulators :as acc])

(defn take-accum
  "Accumulator that collects matching facts until x have been seen,
   then ignores any further matches."
  [x]
  (acc/accum
    {:initial-value nil
     :reduce-fn (fn [items value]
                  (if (= (count items) x)
                    items            ; already have x; drop further matches
                    (conj items value)))}))
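For illustration, a hypothetical rule using it might look like this (assuming clara.rules is required as r, Long facts as in the later examples, and a made-up take-ten rule name):

(require '[clara.rules :as r])

;; Hypothetical sketch: bind at most 10 Long facts in a single activation.
(r/defrule take-ten
  [?batch <- (take-accum 10) :from [Long]]
  =>
  ;; ?batch holds up to 10 of the matched facts; batch processing goes here.
  (println "processing" (count ?batch) "facts"))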
However, this doesn’t address:
> I want to process 10 at a time, I want to (take 10) but only get the remaining 5 in another firing of the rule.
The accumulator above would be done after accumulating the requisite number of facts, and additional fire-rules calls wouldn’t propagate new facts once that number was reached. Off the top of my head I don’t know that the latter ability would be possible within the session itself, as it feels like the remaining 5 would be “unfinished” work for the session: the fire-rules contract is that after the call the session has reached a “steady state”, and remaining work would imply a non-stable state.

Joel 04:09:19

That’s what I suspected w/o trying it out. I suppose I could do this outside of the session. Fetch all the matches, and then retract the ones I decide to process.
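A minimal sketch of that outside-the-session flow, assuming a hypothetical all-matches query that binds each fact to ?fact and a process! stand-in for the real work:

;; Hypothetical sketch: query the matches, process 10, retract the processed
;; facts so a later fire-rules only sees the remainder.
(let [fired   (r/fire-rules session)
      matches (map :?fact (r/query fired all-matches))
      batch   (take 10 matches)]
  (run! process! batch)
  (-> (apply r/retract fired batch)
      (r/fire-rules)))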

Joel 19:09:27

Maybe another option is to pull all of them, process 10, retract 15, then push the 5 back?
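In code, that variant might look something like this (same hypothetical all-matches query and process! stand-in as above):

;; Hypothetical sketch: retract all 15, then push the 5 unprocessed back in.
(let [matches (map :?fact (r/query session all-matches))
      [batch leftover] (split-at 10 matches)]
  (run! process! batch)
  (-> (apply r/retract session matches)  ; retract everything we pulled
      (r/insert-all leftover)            ; reinsert the unprocessed facts
      (r/fire-rules)))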

ethanc 21:09:05

I’m always wary of using retract, as it feels like re-writing history; probably just a me thing though. As a possible alternative, a partitioning accum:

(defn partitioning-accum
  "Accumulator that collects all matching facts and returns them
   partitioned into groups of x."
  [x]
  (acc/accum
    {:initial-value nil
     :reduce-fn conj
     :convert-return-fn #(partition-all x %)}))
With perhaps a paging-like system:
(r/defrule paging-rule
  [?pages <- (partitioning-accum 3) :from [Long]]
  =>
  ;; Tag each partition with its page number via :type metadata, which
  ;; Clara's default fact-type-fn (clojure.core/type) picks up.
  (r/insert-all!
    (map-indexed #(with-meta {:values %2 :page %1} {:type :paged-long}) ?pages)))
Where the pages could be directly queried:
(r/defquery paged-response
  [:?page]
  [:paged-long [{:keys [page values]}] (= page ?page) (= values ?results)])
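For instance, with a hypothetical session whose Long facts have been inserted and rules fired, page 0 could be fetched like:

;; Hypothetical usage: each result map carries the :?page and :?results bindings.
(r/query session paged-response :?page 0)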
or gated by processing-requests:
(r/defrule page-processing-rule
  [:process-request [{:keys [page]}] (= ?page page)]
  [:paged-long [{:keys [page values]}] (= page ?page) (= values ?results)]
  =>
  (r/insert! ^{:type :more-processing} {:res ?results}))
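To drive that gate, a request fact could be inserted using the same :type-metadata convention (hypothetical sketch):

;; Hypothetical: ask for page 1 to be processed, then fire the gated rule.
(-> session
    (r/insert (with-meta {:page 1} {:type :process-request}))
    (r/fire-rules))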
That’s probably how I’d look at it.

ethanc 21:09:21

apologies for the delay here
