Is it possible? I'd like something similar to what we can already do with a `ChatSession`:

```csharp
await foreach (var text in session.ChatAsync(message, infParams))
{
    Console.Write(text);
}
```
This is my code for a document ingestion chat bot, but I have to wait for the whole answer to complete before the text is visible.
```csharp
using LLama.Common;
using LLamaSharp.KernelMemory;
using Microsoft.KernelMemory;
using Microsoft.KernelMemory.Configuration;

namespace LlamaChat;

public static class DocumentBot
{
    public static async Task Start()
    {
        // Setup the kernel memory with the LLM model.
        Console.ForegroundColor = ConsoleColor.DarkGray;
        var modelPath = @"C:\dev\ai-models\llama-2-7b-chat.Q3_K_M.gguf";
        var infParams = new InferenceParams { AntiPrompts = ["\n\n"] };
        var lsConfig = new LLamaSharpConfig(modelPath) { DefaultInferenceParams = infParams };
        var searchClientConfig = new SearchClientConfig { MaxMatchesCount = 1, AnswerTokens = 100 };
        var parseOptions = new TextPartitioningOptions
        {
            MaxTokensPerParagraph = 300,
            MaxTokensPerLine = 100,
            OverlappingTokens = 30
        };
        IKernelMemory memory = new KernelMemoryBuilder()
            .WithLLamaSharpDefaults(lsConfig)
            .WithSearchClientConfig(searchClientConfig)
            .With(parseOptions)
            .Build();

        // Ingest documents (format is automatically detected from the filename).
        var documentFolder = @"C:\dev\LlamaChat\docs";
        string[] documentPaths = Directory.GetFiles(documentFolder, "*.*");
        for (int i = 0; i < documentPaths.Length; i++)
        {
            await memory.ImportDocumentAsync(documentPaths[i], steps: Constants.PipelineWithoutSummary);
        }

        // Allow the user to ask questions forever.
        while (true)
        {
            Console.ForegroundColor = ConsoleColor.Green;
            Console.Write("\nQuestion: ");
            var userInput = Console.ReadLine() ?? string.Empty;
            Console.ForegroundColor = ConsoleColor.DarkGray;

            // This blocks until the full answer has been generated.
            MemoryAnswer answer = await memory.AskAsync(userInput);

            Console.ForegroundColor = ConsoleColor.Yellow;
            Console.WriteLine($"Answer: {answer.Result}");
            foreach (var source in answer.RelevantSources)
            {
                Console.WriteLine($"Source: {source.SourceName}.");
            }
        }
    }
}
```
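One possible workaround, sketched below under stated assumptions rather than a confirmed Kernel Memory API: since `memory.AskAsync` only returns once the whole `MemoryAnswer` is ready, you could retrieve the relevant partitions yourself with `memory.SearchAsync`, build a prompt from them, and stream the generation through a LLamaSharp executor, whose `InferAsync` yields tokens as an `IAsyncEnumerable<string>`. The prompt template, the `StreamingAsk` class, and the parameter values here are all my own assumptions; verify the exact signatures against the LLamaSharp and Kernel Memory versions you have installed.

```csharp
// Hypothetical streaming workaround: search memory manually, then stream
// the generation with a LLamaSharp executor instead of memory.AskAsync.
// NOTE: names and signatures are assumptions; check them against your versions.
using System.Text;
using LLama;
using LLama.Common;
using Microsoft.KernelMemory;

public static class StreamingAsk
{
    public static async Task AskStreamingAsync(
        IKernelMemory memory, LLamaWeights model, ModelParams modelParams, string question)
    {
        // 1. Retrieve the most relevant partitions ourselves.
        SearchResult search = await memory.SearchAsync(question, limit: 1);

        var context = new StringBuilder();
        foreach (var citation in search.Results)
            foreach (var partition in citation.Partitions)
                context.AppendLine(partition.Text);

        // 2. Build a simple RAG prompt (this template is an assumption).
        string prompt =
            $"Answer the question using only the facts below.\n" +
            $"Facts:\n{context}\nQuestion: {question}\nAnswer:";

        // 3. Stream tokens to the console as they are generated.
        var executor = new StatelessExecutor(model, modelParams);
        var infParams = new InferenceParams { AntiPrompts = ["\n\n"], MaxTokens = 100 };
        await foreach (var token in executor.InferAsync(prompt, infParams))
        {
            Console.Write(token);
        }
    }
}
```

This trades Kernel Memory's built-in answer synthesis for direct control over generation, which is what makes token-by-token output possible.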
@xbotter Would you like to look into this issue?