Hello, I am trying to figure out if gleam is the right tool for the job.
My inquiry is similar to #27: I have a huge amount of data that I want to keep in memory so I can run a certain operation on it several times in a distributed fashion.
For the sake of clarity, I am reporting a simplified pseudo-code example of my use case.
```
data = seq(1, 1024*3)            # 3072 numbers
mapping = (n) -> n * 56
filter_op = (n) -> n < 123456
```
My goal is to have 3 nodes, each with 1024 numbers. Each node would run mapping, discarding results that do not satisfy the filter_op predicate.
I would like to be able to run this multiple times, changing mapping or filter_op but keeping the 1024 numbers in the nodes' memory.
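To make the desired behavior concrete, here is a minimal sketch in plain Go (the language gleam itself is written in), simulating the three nodes in one process. Note that `node`, `run`, and `makeNodes` are hypothetical names for illustration, not gleam APIs: each node keeps its 1024 numbers in memory, and map/filter passes can be re-run with different functions without reloading the data.

```go
package main

import "fmt"

// node holds its partition of the data in memory, so map/filter
// passes can be re-run without reloading the data. (Hypothetical
// type for illustration; not part of gleam.)
type node struct {
	data []int
}

// run applies mapping to each number and keeps only the results
// that satisfy the filterOp predicate.
func (nd *node) run(mapping func(int) int, filterOp func(int) bool) []int {
	var out []int
	for _, n := range nd.data {
		if v := mapping(n); filterOp(v) {
			out = append(out, v)
		}
	}
	return out
}

// makeNodes partitions seq(1, 1024*3) across three nodes,
// 1024 numbers each.
func makeNodes() []*node {
	nodes := make([]*node, 3)
	for i := range nodes {
		data := make([]int, 1024)
		for j := range data {
			data[j] = i*1024 + j + 1
		}
		nodes[i] = &node{data: data}
	}
	return nodes
}

func main() {
	nodes := makeNodes()

	// First pass: mapping = n*56, filter_op = n < 123456.
	total := 0
	for _, nd := range nodes {
		total += len(nd.run(
			func(n int) int { return n * 56 },
			func(n int) bool { return n < 123456 },
		))
	}
	fmt.Println(total) // 2204 inputs satisfy n*56 < 123456

	// Second pass: a different mapping over the same in-memory data.
	total = 0
	for _, nd := range nodes {
		total += len(nd.run(
			func(n int) int { return n * 100 },
			func(n int) bool { return n < 123456 },
		))
	}
	fmt.Println(total) // 1234 inputs satisfy n*100 < 123456
}
```

The question for gleam is essentially whether this pattern is supported across real worker nodes: load once, then drive multiple map/filter passes over the cached partitions.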
Thank you in advance for your time and help!