Deploying on virtual machines? #1106

Open
fidecastro opened this issue Jan 2, 2024 · 1 comment

@fidecastro

Does anybody know if there are issues running this on virtual machines?

I'm testing this on a homelab with Proxmox running on relatively old hardware (dual Xeon E5-2690 v4), and I can't get it to work. I'm running an Ubuntu 22.04 Server virtual machine and have successfully run llama.cpp in it, but I keep getting "illegal instruction" errors when trying to use this repository.
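(For context, a quick way to see which SIMD extensions the Proxmox guest actually exposes, under the assumption that the "illegal instruction" comes from an instruction the virtual CPU does not advertise, is something like:)

```sh
# List the AVX-related CPU flags visible inside the guest.
# The E5-2690 v4 is Broadwell, so AVX2 is the newest vector extension the host
# can offer; any code path compiled for AVX-512 would raise SIGILL here.
grep -o 'avx[0-9a-z_]*' /proc/cpuinfo | sort -u
```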

@zhewang1-intc
Collaborator

zhewang1-intc commented Jan 3, 2024

Hi, you can use gdb to find the crash position in this repo; please refer to issue #944.

Once you have the crash position, e.g. intel-extension-for-transformers/intel_extension_for_transformers/llm/library/jblas/jblas/jit_blas_wrapper.h:152, you can set a breakpoint there via
b intel-extension-for-transformers/intel_extension_for_transformers/llm/library/jblas/jblas/jit_blas_wrapper.h:152
After the program stops at this breakpoint, use layout asm to switch to the assembly view and execute stepi until the program throws an "illegal instruction" error.
Once you know which instruction makes the program crash on the VM, please contact us and we will take a look.
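A minimal sketch of that gdb session, assuming the crash is reproduced by some launch command (the python invocation below is a placeholder, and the breakpoint location is the example path from the comment above, not necessarily the real crash site):

```sh
# Start the reproducing command under gdb (placeholder script name).
gdb --args python your_repro_script.py

# Inside gdb:
(gdb) run            # reproduce the crash once; gdb reports the SIGILL location
(gdb) b intel-extension-for-transformers/intel_extension_for_transformers/llm/library/jblas/jblas/jit_blas_wrapper.h:152
(gdb) run            # rerun and stop at the breakpoint just before the crash
(gdb) layout asm     # switch the TUI to the assembly view
(gdb) stepi          # single-step machine instructions until SIGILL is raised;
                     # the instruction shown at that point is the one to report
```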
