This repository was archived by the owner on Sep 12, 2024. It is now read-only.

Segmentation fault #113

@ZGltYQ

Description

I am using the code from the langchain example, but I am getting an error:

import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { LLamaEmbeddings } from "llama-node/dist/extensions/langchain.js";
import { LLama } from "llama-node";
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";
import path from "path";

const model = path.resolve(process.cwd(), "llama-2-7b-chat.ggmlv3.q8_0.bin");

const llama = new LLama(LLamaCpp);

const config = {
    path: model,
    enableLogging: true,
    nCtx: 1024,
    nParts: -1,
    seed: 0,
    f16Kv: false,
    logitsAll: false,
    vocabOnly: false,
    useMlock: false,
    embedding: true,
    useMmap: true,
};

llama.load(config);

const run = async () => {
    // Load the docs into the vector store
    const vectorStore = await MemoryVectorStore.fromTexts(
        ["Hello world", "Bye bye", "hello nice world"],
        [{ id: 2 }, { id: 1 }, { id: 3 }],
        new LLamaEmbeddings({ maxConcurrency: 1 }, llama)
    );

    // Search for the document most similar to the query
    const resultOne = await vectorStore.similaritySearch("hello world", 1);

    console.log(resultOne);
};

run().catch(error => {
    console.log(error);
});

Output:

[1] 37742 segmentation fault node index.js
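
To narrow down whether the crash happens while loading the model or while computing embeddings, a minimal sketch like the following might help. It assumes LLamaEmbeddings exposes the standard LangChain embedQuery method (it extends the LangChain Embeddings base class) and reuses the same config object as in the repro above:

import { LLamaEmbeddings } from "llama-node/dist/extensions/langchain.js";
import { LLama } from "llama-node";
import { LLamaCpp } from "llama-node/dist/llm/llama-cpp.js";
import path from "path";

const model = path.resolve(process.cwd(), "llama-2-7b-chat.ggmlv3.q8_0.bin");

const llama = new LLama(LLamaCpp);

// Same config object as in the repro above
const config = {
    path: model,
    enableLogging: true,
    nCtx: 1024,
    nParts: -1,
    seed: 0,
    f16Kv: false,
    logitsAll: false,
    vocabOnly: false,
    useMlock: false,
    embedding: true,
    useMmap: true,
};

llama.load(config);

const embeddings = new LLamaEmbeddings({ maxConcurrency: 1 }, llama);

// Embed a single string directly, without the vector store in between
// (embedQuery comes from LangChain's Embeddings interface)
embeddings
    .embedQuery("Hello world")
    .then(vector => console.log("embedding length:", vector.length))
    .catch(error => console.log(error));

If this also segfaults, the problem is in the embedding/model-loading path rather than in MemoryVectorStore.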

Additional info:

node: v18.17.0

"dependencies": {
    "langchain": "^0.0.122",
    "llama-node": "^0.1.6"
}

OS: Windows 11 (WSL Ubuntu)
