[DRAFT][onert-micro] Introduce Huffman Transcoder #12744

Draft · wants to merge 1 commit into master

Conversation

SlavikMIPT (Contributor):

This commit introduces Huffman Transcoder for lossless weight compression.

ONE-DCO-1.0-Signed-off-by: Vyacheslav Bazhenov slavikmipt@gmail.com

@SlavikMIPT SlavikMIPT marked this pull request as draft March 13, 2024 08:31
@SlavikMIPT (Contributor, Author):

Output for this example:

Input string bits: 10240
Encoded string bits: 3319
Bits to store tree: 2559
Compression: 42.5977%

As we can see, with this coding of the quantized weights we can compress the weights tensor significantly, under the assumption that it contains many identical values. Note that the reported figure already includes the tree overhead: 1 - (3319 + 2559) / 10240 ≈ 42.5977%.
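
For illustration, here is a minimal sketch of the idea (names and structure are illustrative only, not the PR's actual transcoder API): build a frequency table over the quantized uint8 weights and give shorter codes to the more frequent values.

```cpp
// Minimal Huffman-coding sketch over quantized (uint8_t) weights.
// Illustrative only; does not mirror the PR's transcoder implementation.
#include <cstdint>
#include <map>
#include <memory>
#include <queue>
#include <string>
#include <vector>

struct Node
{
  uint8_t symbol;
  uint32_t freq;
  std::unique_ptr<Node> left, right;
};

// Min-heap by frequency: the two rarest nodes are merged first.
struct NodeCmp
{
  bool operator()(const Node *a, const Node *b) const { return a->freq > b->freq; }
};

// Walk the tree and collect the bit string assigned to each leaf symbol.
void buildCodes(const Node *node, const std::string &prefix,
                std::map<uint8_t, std::string> &codes)
{
  if (node == nullptr)
    return;
  if (!node->left && !node->right) // leaf
    codes[node->symbol] = prefix.empty() ? "0" : prefix;
  buildCodes(node->left.get(), prefix + "0", codes);
  buildCodes(node->right.get(), prefix + "1", codes);
}

std::map<uint8_t, std::string> huffmanCodes(const std::vector<uint8_t> &weights)
{
  std::map<uint8_t, std::string> codes;

  // Count how often each quantized value occurs.
  std::map<uint8_t, uint32_t> freq;
  for (uint8_t w : weights)
    ++freq[w];
  if (freq.empty())
    return codes;

  std::priority_queue<Node *, std::vector<Node *>, NodeCmp> pq;
  for (const auto &kv : freq)
    pq.push(new Node{kv.first, kv.second, nullptr, nullptr});

  // Repeatedly merge the two least frequent nodes into a parent node.
  while (pq.size() > 1)
  {
    Node *l = pq.top();
    pq.pop();
    Node *r = pq.top();
    pq.pop();
    pq.push(new Node{0, l->freq + r->freq, std::unique_ptr<Node>(l), std::unique_ptr<Node>(r)});
  }

  std::unique_ptr<Node> root(pq.top()); // owns and frees the whole tree
  buildCodes(root.get(), "", codes);
  return codes;
}
```

Frequent values (e.g. many repeated quantized weights) end up with one- or two-bit codes, which is where the gain above comes from; the tree itself still has to be stored next to the encoded stream, which is the 2559-bit overhead in the numbers above.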

Comment on lines +141 to +144
// std::cout << "Huffman Codes are :\n";
// for (auto pair : huffmanCode) {
// std::cout << static_cast<int>(pair.first) << " " << pair.second << '\n';
// }
Contributor:
I think this needs to be removed, right? :)

Comment on lines +150 to +152
// TODO: replace string with bitset or bool vector
// print encoded string
std::string str = "";
Contributor:
Maybe it would be better to rewrite this with a bool vector or bitset already in this draft, since a string can take up a lot of extra memory, and then check the memory usage after that?
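
For reference, a rough sketch of what this suggestion could look like (illustrative names only, not the PR's code): pack the code bits directly into a byte buffer instead of a std::string of '0'/'1' characters, which makes the intermediate buffer roughly 8x smaller.

```cpp
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// Pack the Huffman code of every weight as real bits, 8 per byte (MSB first).
// 'codes' keeps the existing map<uint8_t, std::string> code table; only the
// encoded-stream representation changes. 'bit_count' returns the exact number
// of encoded bits ("Encoded string bits" in the output above).
std::vector<uint8_t> packEncoded(const std::vector<uint8_t> &weights,
                                 const std::map<uint8_t, std::string> &codes,
                                 std::size_t &bit_count)
{
  std::vector<uint8_t> out;
  bit_count = 0;
  for (uint8_t w : weights)
  {
    for (char c : codes.at(w))
    {
      if (bit_count % 8 == 0)
        out.push_back(0); // start a new byte
      if (c == '1')
        out.back() |= static_cast<uint8_t>(1u << (7 - bit_count % 8));
      ++bit_count;
    }
  }
  return out;
}
```

A std::vector<bool> or a std::bitset would give the same saving with a bit-level interface; a plain byte buffer may just be easier to hand to the serialization side unchanged.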
