
System.TypeInitializationException: 'The type initializer for 'LLama.Native.NativeApi' threw an exception.' #686

Open
altxxr0 opened this issue Apr 23, 2024 · 12 comments


@altxxr0

altxxr0 commented Apr 23, 2024

[screenshot]

Program.cs

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using LLama.Common;
using LLama;

namespace Relate
{
    class Program
    {
        static async Task Main(string[] args)
        {

            string modelPath = @"C:\Users\jpfau\source\repos\Relate\Relate\phi-2.Q4_K_M.gguf"; // change it to your own model path.

            var parameters = new ModelParams(modelPath)
            {
                ContextSize = 1024, // Maximum context length kept as chat memory.
                GpuLayerCount = 5 // Number of layers to offload to the GPU; adjust to fit your GPU memory.
            };
            using var model = LLamaWeights.LoadFromFile(parameters);
            using var context = model.CreateContext(parameters);
            var executor = new InteractiveExecutor(context);

            // Add chat histories as prompt to tell AI how to act.
            var chatHistory = new ChatHistory();
            chatHistory.AddMessage(AuthorRole.System, "Transcript of a dialog, where the User interacts with an Assistant named Bob. Bob is helpful, kind, honest, good at writing, and never fails to answer the User's requests immediately and with precision.");
            chatHistory.AddMessage(AuthorRole.User, "Hello, Bob.");
            chatHistory.AddMessage(AuthorRole.Assistant, "Hello. How may I help you today?");

            ChatSession session = new(executor, chatHistory);

            InferenceParams inferenceParams = new InferenceParams()
            {
                MaxTokens = 256, // Limit the answer to 256 tokens; remove this if the antiprompt alone is enough to stop generation.
                AntiPrompts = new List<string> { "User:" } // Stop generation once antiprompts appear.
            };

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.Write("The chat session has started.\nUser: ");
            Console.ForegroundColor = ConsoleColor.Green;
            string userInput = Console.ReadLine() ?? "";

            while (userInput != "exit")
            {
                await foreach ( // Stream the response as it is generated.
                    var text
                    in session.ChatAsync(
                        new ChatHistory.Message(AuthorRole.User, userInput),
                        inferenceParams))
                {
                    Console.ForegroundColor = ConsoleColor.White;
                    Console.Write(text);
                }
                Console.ForegroundColor = ConsoleColor.Green;
                userInput = Console.ReadLine() ?? "";
            }
        }
    }
}

Project File

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="15.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Import Project="..\packages\LLamaSharp.Backend.OpenCL.0.11.2\build\netstandard2.0\LLamaSharp.Backend.OpenCL.props" Condition="Exists('..\packages\LLamaSharp.Backend.OpenCL.0.11.2\build\netstandard2.0\LLamaSharp.Backend.OpenCL.props')" />
  <Import Project="..\packages\LLamaSharp.Backend.Cuda11.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda11.props" Condition="Exists('..\packages\LLamaSharp.Backend.Cuda11.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda11.props')" />
  <Import Project="..\packages\LLamaSharp.Backend.Cuda12.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda12.props" Condition="Exists('..\packages\LLamaSharp.Backend.Cuda12.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda12.props')" />
  <Import Project="..\packages\LLamaSharp.Backend.Cpu.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cpu.props" Condition="Exists('..\packages\LLamaSharp.Backend.Cpu.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cpu.props')" />
  <Import Project="$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props" Condition="Exists('$(MSBuildExtensionsPath)\$(MSBuildToolsVersion)\Microsoft.Common.props')" />
	<PropertyGroup>
    <Configuration Condition=" '$(Configuration)' == '' ">Debug</Configuration>
    <Platform Condition=" '$(Platform)' == '' ">AnyCPU</Platform>
    <ProjectGuid>{9B8CE030-AE88-4A36-9005-09893E00A571}</ProjectGuid>
    <OutputType>Exe</OutputType>
    <RootNamespace>Relate</RootNamespace>
    <AssemblyName>Relate</AssemblyName>
	<LangVersion>9.0</LangVersion>
	<TargetFrameworkVersion>v4.7.2</TargetFrameworkVersion>
    <FileAlignment>512</FileAlignment>
    <AutoGenerateBindingRedirects>true</AutoGenerateBindingRedirects>
    <Deterministic>true</Deterministic>
    <NuGetPackageImportStamp>
    </NuGetPackageImportStamp>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
    <PlatformTarget>AnyCPU</PlatformTarget>
    <DebugSymbols>true</DebugSymbols>
    <DebugType>full</DebugType>
    <Optimize>false</Optimize>
    <OutputPath>bin\Debug\</OutputPath>
    <DefineConstants>DEBUG;TRACE</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>
  </PropertyGroup>
  <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
    <PlatformTarget>AnyCPU</PlatformTarget>
    <DebugType>pdbonly</DebugType>
    <Optimize>true</Optimize>
    <OutputPath>bin\Release\</OutputPath>
    <DefineConstants>TRACE</DefineConstants>
    <ErrorReport>prompt</ErrorReport>
    <WarningLevel>4</WarningLevel>
  </PropertyGroup>
  <ItemGroup>
    <Reference Include="LLamaSharp, Version=0.0.0.0, Culture=neutral, processorArchitecture=MSIL">
      <HintPath>..\packages\LLamaSharp.0.11.2\lib\netstandard2.0\LLamaSharp.dll</HintPath>
    </Reference>
    <Reference Include="Microsoft.Bcl.AsyncInterfaces, Version=8.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\Microsoft.Bcl.AsyncInterfaces.8.0.0\lib\net462\Microsoft.Bcl.AsyncInterfaces.dll</HintPath>
    </Reference>
    <Reference Include="Microsoft.Extensions.DependencyInjection.Abstractions, Version=8.0.0.1, Culture=neutral, PublicKeyToken=adb9793829ddae60, processorArchitecture=MSIL">
      <HintPath>..\packages\Microsoft.Extensions.DependencyInjection.Abstractions.8.0.1\lib\net462\Microsoft.Extensions.DependencyInjection.Abstractions.dll</HintPath>
    </Reference>
    <Reference Include="Microsoft.Extensions.Logging.Abstractions, Version=8.0.0.1, Culture=neutral, PublicKeyToken=adb9793829ddae60, processorArchitecture=MSIL">
      <HintPath>..\packages\Microsoft.Extensions.Logging.Abstractions.8.0.1\lib\net462\Microsoft.Extensions.Logging.Abstractions.dll</HintPath>
    </Reference>
    <Reference Include="System" />
    <Reference Include="System.Buffers, Version=4.0.3.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Buffers.4.5.1\lib\net461\System.Buffers.dll</HintPath>
    </Reference>
    <Reference Include="System.Core" />
    <Reference Include="System.Linq.Async, Version=6.0.0.0, Culture=neutral, PublicKeyToken=94bc3704cddfc263, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Linq.Async.6.0.1\lib\netstandard2.0\System.Linq.Async.dll</HintPath>
    </Reference>
    <Reference Include="System.Memory, Version=4.0.1.2, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Memory.4.5.5\lib\net461\System.Memory.dll</HintPath>
    </Reference>
    <Reference Include="System.Numerics" />
    <Reference Include="System.Numerics.Vectors, Version=4.1.4.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Numerics.Vectors.4.5.0\lib\net46\System.Numerics.Vectors.dll</HintPath>
    </Reference>
    <Reference Include="System.Runtime.CompilerServices.Unsafe, Version=6.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Runtime.CompilerServices.Unsafe.6.0.0\lib\net461\System.Runtime.CompilerServices.Unsafe.dll</HintPath>
    </Reference>
    <Reference Include="System.Text.Encodings.Web, Version=8.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Text.Encodings.Web.8.0.0\lib\net462\System.Text.Encodings.Web.dll</HintPath>
    </Reference>
    <Reference Include="System.Text.Json, Version=8.0.0.3, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Text.Json.8.0.3\lib\net462\System.Text.Json.dll</HintPath>
    </Reference>
    <Reference Include="System.Threading.Tasks.Extensions, Version=4.2.0.1, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\System.Threading.Tasks.Extensions.4.5.4\lib\net461\System.Threading.Tasks.Extensions.dll</HintPath>
    </Reference>
    <Reference Include="System.ValueTuple, Version=4.0.3.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51, processorArchitecture=MSIL">
      <HintPath>..\packages\System.ValueTuple.4.5.0\lib\net47\System.ValueTuple.dll</HintPath>
    </Reference>
    <Reference Include="System.Xml.Linq" />
    <Reference Include="System.Data.DataSetExtensions" />
    <Reference Include="Microsoft.CSharp" />
    <Reference Include="System.Data" />
    <Reference Include="System.Net.Http" />
    <Reference Include="System.Xml" />
  </ItemGroup>
  <ItemGroup>
    <Compile Include="Program.cs" />
    <Compile Include="Properties\AssemblyInfo.cs" />
  </ItemGroup>
  <ItemGroup>
    <None Include="App.config" />
    <None Include="packages.config" />
  </ItemGroup>
  <Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
  <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
    <PropertyGroup>
      <ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them.  For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
    </PropertyGroup>
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.Cpu.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cpu.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.Cpu.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cpu.props'))" />
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.Cuda12.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda12.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.Cuda12.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda12.props'))" />
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.Cuda11.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda11.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.Cuda11.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda11.props'))" />
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.OpenCL.0.11.2\build\netstandard2.0\LLamaSharp.Backend.OpenCL.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.OpenCL.0.11.2\build\netstandard2.0\LLamaSharp.Backend.OpenCL.props'))" />
  </Target>
</Project>

Exception Details

System.TypeInitializationException
  HResult=0x80131534
  Message=The type initializer for 'LLama.Native.NativeApi' threw an exception.
  Source=LLamaSharp
  StackTrace:
   at LLama.Native.NativeApi.llama_max_devices()
   at LLama.Abstractions.TensorSplitsCollection..ctor()
   at LLama.Common.ModelParams..ctor(String modelPath)
   at Relate.Program.<Main>d__0.MoveNext() in C:\Users\jpfau\source\repos\Relate\Relate\Program.cs:line 19

  This exception was originally thrown at this call stack:
    [External Code]

Inner Exception 1:
RuntimeError: The native library cannot be correctly loaded. It could be one of the following reasons: 
1. No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them. 
2. You are using a device with only CPU but installed cuda backend. Please install cpu backend instead. 
3. One of the dependency of the native library is missed. Please use `ldd` on linux, `dumpbin` on windows and `otool`to check if all the dependency of the native library is satisfied. Generally you could find the libraries under your output folder.
4. Try to compile llama.cpp yourself to generate a libllama library, then use `LLama.Native.NativeLibraryConfig.WithLibrary` to specify it at the very beginning of your code. For more informations about compilation, please refer to LLamaSharp repo on github.

@AsakusaRinne
Collaborator

There's an error message defined in your project file, as shown below.

  <Target Name="EnsureNuGetPackageBuildImports" BeforeTargets="PrepareForBuild">
    <PropertyGroup>
      <ErrorText>This project references NuGet package(s) that are missing on this computer. Use NuGet Package Restore to download them.  For more information, see http://go.microsoft.com/fwlink/?LinkID=322105. The missing file is {0}.</ErrorText>
    </PropertyGroup>
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.Cpu.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cpu.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.Cpu.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cpu.props'))" />
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.Cuda12.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda12.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.Cuda12.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda12.props'))" />
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.Cuda11.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda11.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.Cuda11.0.11.2\build\netstandard2.0\LLamaSharp.Backend.Cuda11.props'))" />
    <Error Condition="!Exists('..\packages\LLamaSharp.Backend.OpenCL.0.11.2\build\netstandard2.0\LLamaSharp.Backend.OpenCL.props')" Text="$([System.String]::Format('$(ErrorText)', '..\packages\LLamaSharp.Backend.OpenCL.0.11.2\build\netstandard2.0\LLamaSharp.Backend.OpenCL.props'))" />
  </Target>

Have you installed any of the LLamaSharp backend packages, as shown here?

@altxxr0
Author

altxxr0 commented Apr 23, 2024

Yes, every backend is installed; without most of the backends it errors saying that the correct backend wasn't installed.
[screenshot]

Are the currently released NuGet packages not functioning?

@AsakusaRinne
Collaborator

Could you please take a look at your output folder? For example, bin/debug/net6/. If there's a folder named runtimes, please check if there are dll files named llama.dll in its subfolders.
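
If it helps, a quick way to do that check is a snippet like the one below (the output path is a placeholder for your own bin\Debug folder):

using System;
using System.IO;

class RuntimeProbe
{
    static void Main()
    {
        // Placeholder: point this at your build output folder.
        var outputDir = @"C:\path\to\Relate\bin\Debug";

        // Recursively list every llama.dll that the backend packages copied into
        // the output, including anything under a runtimes subfolder.
        foreach (var dll in Directory.EnumerateFiles(outputDir, "llama.dll", SearchOption.AllDirectories))
            Console.WriteLine(dll);
    }
}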

@altxxr0
Author

altxxr0 commented Apr 24, 2024

C:\Users\(Username)\source\repos\Relate\Relate\bin\Debug
[screenshot]
Directory of C:\Users\(Username)\source\repos\Relate\Relate\bin\Debug

23/04/2024 04:15 pm    .
23/04/2024 04:15 pm    ..
23/04/2024 04:15 pm    linux-x64
06/04/2024 02:28 pm 175,104 LLamaSharp.dll
06/04/2024 02:28 pm 260,486 LLamaSharp.xml
31/10/2023 11:00 pm 26,904 Microsoft.Bcl.AsyncInterfaces.dll
31/10/2023 11:00 pm 31,280 Microsoft.Bcl.AsyncInterfaces.xml
15/02/2024 07:56 am 64,160 Microsoft.Extensions.DependencyInjection.Abstractions.dll
15/02/2024 07:56 am 203,027 Microsoft.Extensions.DependencyInjection.Abstractions.xml
15/02/2024 07:56 am 67,848 Microsoft.Extensions.Logging.Abstractions.dll
15/02/2024 07:56 am 91,749 Microsoft.Extensions.Logging.Abstractions.xml
23/04/2024 04:15 pm osx-arm64
23/04/2024 04:15 pm osx-x64
24/04/2024 07:31 am 9,216 Relate.exe
23/04/2024 03:42 pm 1,561 Relate.exe.config
24/04/2024 07:31 am 22,016 Relate.pdb
19/02/2020 06:05 pm 20,856 System.Buffers.dll
19/02/2020 06:05 pm 3,481 System.Buffers.xml
01/02/2022 11:33 pm 1,115,792 System.Linq.Async.dll
01/02/2022 11:27 pm 461,377 System.Linq.Async.xml
08/05/2022 11:31 am 142,240 System.Memory.dll
08/05/2022 11:31 am 13,950 System.Memory.xml
15/05/2018 09:29 pm 115,856 System.Numerics.Vectors.dll
15/05/2018 09:29 pm 183,484 System.Numerics.Vectors.xml
23/10/2021 07:40 am 18,024 System.Runtime.CompilerServices.Unsafe.dll
19/10/2021 03:14 pm 20,529 System.Runtime.CompilerServices.Unsafe.xml
31/10/2023 11:00 pm 79,024 System.Text.Encodings.Web.dll
19/09/2023 07:26 am 63,180 System.Text.Encodings.Web.xml
15/02/2024 08:02 am 643,744 System.Text.Json.dll
19/09/2023 07:26 am 537,044 System.Text.Json.xml
19/02/2020 06:05 pm 25,984 System.Threading.Tasks.Extensions.dll
19/02/2020 06:05 pm 10,147 System.Threading.Tasks.Extensions.xml
15/05/2018 09:29 pm 25,232 System.ValueTuple.dll
15/05/2018 09:29 pm 142 System.ValueTuple.xml
23/04/2024 04:15 pm win-x64

[screenshot]

@altxxr0
Author

altxxr0 commented Apr 24, 2024

No, there is no runtimes folder, only the native llama.dll library.

@AsakusaRinne
Collaborator

Please add the following code to the very beginning of your program first.

NativeLibraryConfig.Instance.WithLibrary("<path>");

<path> is the path to a llama.dll; you can choose one of the dlls shown above. If you don't know which to choose, please download this llama.dll, and this llava_shared.dll if you want to use LLaVA. This might be a quick fix for your problem.
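
Applied to the Program.cs above, that looks roughly like the sketch below. The dll path is a placeholder, and depending on the installed LLamaSharp version WithLibrary may also require a second argument for the llava library (see further down).

using System.Threading.Tasks;
using LLama.Common;
using LLama.Native; // NativeLibraryConfig lives here

namespace Relate
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // Must run before any other LLamaSharp call, including new ModelParams(...):
            // per the stack trace above, NativeApi loads the native library lazily in its
            // static type initializer, so the path has to be configured first.
            NativeLibraryConfig.Instance.WithLibrary(@"C:\path\to\llama.dll"); // placeholder path

            var parameters = new ModelParams(@"C:\path\to\phi-2.Q4_K_M.gguf"); // now safe to construct
            // ... the rest of Main stays the same as above
            await Task.CompletedTask;
        }
    }
}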

However, it's a bit odd to have llama.dll without the runtimes folder; there might be a bug here. Could you please share some more information so I can dig further into this issue?

  1. dotnet runtime version
  2. project file (.csproj)
  3. your operating system info, including Windows version, CPU, etc.

@altxxr0
Author

altxxr0 commented Apr 24, 2024

Yeah, I'll test it.

@altxxr0
Author

altxxr0 commented Apr 24, 2024

[screenshot]
.NET Runtime is 8.0.204
[screenshot]

@altxxr0
Author

altxxr0 commented Apr 24, 2024

Also

Severity	Code	Description	Project	File	Line	Suppression State
Error	CS1061	'NativeLibraryConfig' does not contain a definition for 'WithLibrary' and no accessible extension method 'WithLibrary' accepting a first argument of type 'NativeLibraryConfig' could be found (are you missing a using directive or an assembly reference?)	Relate	C:\Users\jpfau\source\repos\Relate\Relate\Program.cs	16	Active

[screenshot]

@AsakusaRinne
Collaborator

Please use WithLibrary(filename, null) if you don't need to use the LLaVA model.
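
For example, with a placeholder path:

using LLama.Native;

// The second argument is the path to llava_shared.dll; pass null to skip LLaVA support.
NativeLibraryConfig.Instance.WithLibrary(@"C:\path\to\llama.dll", null);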

@hswlab
Contributor

hswlab commented May 7, 2024

It seems that the llava_shared.dll library only exists in the avx folders of deps.zip.
Is it currently only possible to use LLaVA with the AVX backends?

[screenshot]

@SignalRT
Collaborator

SignalRT commented May 9, 2024

It should also be in the root folder:

[screenshot]
