The former gives important context for the problem and describes the various technical challenges and our solutions to them; the latter gives a practical guide to using the system, including some discussion of its performance.
Right now, the compression features only run successfully on machines with more than 64GB of RAM (128GB works, 64GB doesn't; the reported "maximum resident set size" is 58GB), and this is with k = 1. The memory usage and SRS size scale piecewise-constant and asymptotically linearly in k: they double whenever the total Nova circuit size, measured as the number of non-zero entries in the R1CS matrices, crosses a power of 2. These numbers should improve with further development, by an expected factor of ~4.
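To make the piecewise-constant scaling concrete, here is a minimal sketch (not the actual implementation; the function name and the per-step entry count are illustrative) of how the SRS size would jump at power-of-2 boundaries of the total non-zero R1CS entry count:

```python
def srs_size_estimate(nonzero_entries_per_step: int, k: int) -> int:
    """Model the SRS size as the next power of two at or above the
    total number of non-zero R1CS entries across k steps.

    This is a hypothetical model of the scaling described above,
    not the real sizing logic.
    """
    total = nonzero_entries_per_step * k
    size = 1
    while size < total:
        size *= 2  # doubles each time a power of 2 is crossed
    return size

# With a hypothetical 1000 non-zero entries per step:
# k = 1 -> 1024; k = 3 and k = 4 both -> 4096 (constant between
# power-of-2 crossings, doubling when one is crossed).
```

The plateau between crossings is why the scaling is piecewise-constant in k while still asymptotically linear.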
I'm not sure what else we want to explain or where it should live.
We need a full description of the Nexus Proof Compression sequence: