Add use cases: semiprime factors ratio detection #121

Open
sashakolpakov opened this issue Oct 31, 2023 · 2 comments · May be fixed by #126
Labels: audience/technical (Issue primarily for technical review and service), kind/documentation (Improvements or additions to documentation), kind/enhancement (New feature or request), kind/usability, triage/high-priority, triage/required

Comments

@sashakolpakov (Collaborator) commented on Oct 31, 2023:

This test is based on Sam Blake's preprint "Integer Factorisation, Fermat & Machine Learning on a Classical Computer", arXiv:2308.12290.

Detecting the ratio of a semiprime's factors may, in theory, help improve the classical Lawrence algorithm for semiprime factorization.

I ran Cerebros on Sam Blake's data and got ~3% more false negatives, but ~10% better accuracy. Given that we used only 20% of the dataset for training and 80% for testing, this result looks good (the dataset contains 1e6 128-bit primes).
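
For context, a minimal sketch of how such a dataset and a 20/80 split could be assembled. The bit-vector encoding, the ratio threshold, and the sample count here are illustrative assumptions, not the preprint's exact construction, and the classifier itself is left out:

```python
# Toy data-generation sketch (assumption, not the setup from arXiv:2308.12290):
# label each semiprime n = p * q by whether the factor ratio q / p exceeds an
# illustrative threshold, encode n as its binary digits, and split 20% / 80%.
import numpy as np
from sympy import randprime
from sklearn.model_selection import train_test_split

BITS = 128              # bit length of each prime factor
RATIO_THRESHOLD = 1.5   # illustrative cut-off on q / p (assumption)
N_SAMPLES = 1_000       # toy size; the dataset discussed above has ~1e6 entries

def make_sample():
    """Return (bit-vector features of n, binary label) for a random semiprime."""
    p = randprime(2 ** (BITS - 1), 2 ** BITS)
    q = randprime(2 ** (BITS - 1), 2 ** BITS)
    p, q = min(p, q), max(p, q)
    n = p * q
    # Fixed-length 0/1 feature vector from the binary expansion of n.
    bits = np.array([(n >> i) & 1 for i in range(2 * BITS)], dtype=np.float32)
    label = int(q / p > RATIO_THRESHOLD)
    return bits, label

samples = [make_sample() for _ in range(N_SAMPLES)]
X = np.stack([s[0] for s in samples])
y = np.array([s[1] for s in samples])

# 20% of the data for training, 80% held out for testing, as described above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.2, random_state=0, stratify=y
)
```

Any tabular binary classifier (including a Cerebros search) could then be trained on X_train / y_train; the Cerebros call itself is omitted here to avoid guessing at its API.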

@sashakolpakov added the kind/documentation, kind/enhancement, triage/high-priority, triage/required, kind/usability, and audience/technical labels on Oct 31, 2023
@sashakolpakov changed the title from "Semiprime factors ratio detection" to "Add use cases: semiprime factors ratio detection" on Oct 31, 2023
@sashakolpakov linked a pull request (#126) on Nov 3, 2023 that will close this issue
@david-thrower (Owner) commented:
Paired with: Tabular binary classification in the Cerebros UI (we should be able to hyperparameter-tune this on the UI-based system).

@david-thrower (Owner) commented:
@sashakolpakov, one thought: I wonder if it is possible to quantize (or z-, t-, or min-max scale) the series to coerce it to 32-bit precision.

I see a few issues that may affect the performance:

  • By default, I think the weights are 32-bit signed floats. I may need to update the Cerebros API to include a kwarg for the data type of the weights, so that we can coerce the weights to match the data type of the input data.
  • A time-series pattern may still be identifiable from a quantized / scaled re-framing of the problem. I am curious whether z-, t-, or min-max scaling of the training data would improve the trade-off between accuracy and computational expense (see the sketch below).
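
As a concrete illustration of the second point, here is a minimal min-max scaling sketch. This is an assumption about the preprocessing, not part of the Cerebros API: it maps a column of 128-bit integers onto float32 in [0, 1], which necessarily keeps only roughly the 24 most significant bits of structure.

```python
# Min-max scaling sketch (assumption): coerce arbitrary-precision integers to
# float32 in [0, 1]. float32 has a 24-bit significand, so most of the 128 bits
# of each value are lost; the open question is whether the surviving coarse
# structure is still predictive enough to justify the cheaper arithmetic.
import numpy as np

def min_max_scale_to_float32(values):
    """Scale a list of Python ints (any bit length) to float32 in [0, 1]."""
    lo, hi = min(values), max(values)
    span = hi - lo
    if span == 0:
        return np.zeros(len(values), dtype=np.float32)
    # Subtract in exact integer arithmetic first; convert to float only at the end.
    return np.asarray([float(v - lo) / float(span) for v in values], dtype=np.float32)

# Example with a few arbitrary 128-bit placeholder values.
semiprimes = [2**127 + 12_345, 2**127 + 987_654_321, 2**128 - 99_999]
print(min_max_scale_to_float32(semiprimes))
```

A z-score variant would replace the min/max with the column mean and standard deviation, again computed in exact integer arithmetic before the final cast to float32.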
