ruvllm-wasm: npm package is placeholder — WASM binary not compiled or included #238

@stuinfla

Description

Summary

The @ruvector/ruvllm-wasm@0.1.0 npm package is published and installable, but it contains only TypeScript interface stubs — the actual WASM binary from the crates/ruvllm-wasm Rust crate is not compiled or included.

Evidence

The loadModel() method in dist/index.js (lines 170-188):

async loadModel(source, options) {
    this.status = 'loading';
    // Placeholder - actual implementation requires WASM binary
    console.log('Loading model from:', typeof source === 'string' ? source : 'ArrayBuffer');
    console.log('Note: Full model loading requires the ruvllm-wasm binary.');
    console.log('Build from: crates/ruvllm-wasm');
    this.status = 'ready';
    return { name: 'placeholder', /* ... */ };
}

The generate() method (lines 192-205):

async generate(prompt, config, onToken) {
    console.log('Note: Full generation requires the ruvllm-wasm binary.');
    return {
        text: '[Placeholder - build ruvllm-wasm crate for actual inference]',
        stats: { tokensGenerated: 0, /* ... */ },
    };
}

Comment on lines 131-133:

/**
 * RuvLLM WASM class placeholder
 * Full implementation requires WASM binary from ruvllm-wasm crate
 */

Impact

  • The README documents a full working API (WebGPU, streaming, chat, IndexedDB caching)
  • npm install @ruvector/ruvllm-wasm succeeds but the package does nothing
  • Developers integrating this into browser apps (like our AfterCare healthcare PWA) discover at runtime that inference returns placeholder text
  • The package has zero dependencies (32.5 KB unpacked) because the WASM binary isn't included

Expected Behavior

The npm package should include the compiled WASM binary from crates/ruvllm-wasm, or the package should be marked as alpha/experimental with a clear warning that it requires manual WASM compilation.
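Until one of those happens, consumers can at least detect the placeholder at runtime. A minimal sketch, assuming the marker string stays as quoted in the Evidence section above (`isPlaceholderResult` is a hypothetical helper, not part of the package):

```javascript
// Hypothetical guard for @ruvector/ruvllm-wasm@0.1.0 consumers.
// The '[Placeholder' marker matches the stub output quoted above.
function isPlaceholderResult(result) {
    return typeof result?.text === 'string' && result.text.startsWith('[Placeholder');
}

// Example: the 0.1.0 stub's return value trips the guard.
const stubResult = { text: '[Placeholder - build ruvllm-wasm crate for actual inference]' };
if (isPlaceholderResult(stubResult)) {
    console.warn('ruvllm-wasm returned placeholder output; the WASM binary is missing.');
}
```

This is brittle by design (it depends on the stub's exact string), which is why a published warning or a thrown error would be preferable.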

Request

  1. Compile the crates/ruvllm-wasm Rust crate with wasm-pack build --target web
  2. Include the resulting .wasm binary in the npm package
  3. Wire up loadModel(), generate(), and chat() to call the actual WASM functions
  4. Publish as @ruvector/ruvllm-wasm@0.2.0
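For step 3, the stubs would ultimately need to instantiate the shipped binary. A hedged sketch using only the standard WebAssembly API, not the wasm-bindgen glue that wasm-pack actually generates (the empty module bytes stand in for the real compiled binary, whose exports and import object are unknown here):

```javascript
// Sketch only: compile and instantiate a .wasm byte buffer synchronously.
// A real fix would load the wasm-bindgen JS glue emitted by wasm-pack instead,
// and pass the import object that glue expects.
function instantiateBinary(bytes) {
    const module = new WebAssembly.Module(bytes);   // compile the binary
    return new WebAssembly.Instance(module, {});    // empty imports for this stub
}

// Smallest valid WASM module ("\0asm" magic + version 1) as a stand-in payload.
const emptyModule = new Uint8Array([0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00]);
```

In a browser, the async `WebAssembly.instantiateStreaming(fetch(url))` form would be used instead, so compilation overlaps the network fetch.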

Context

We're building Bridgepoint AfterCare, a 100% offline healthcare PWA that runs an LLM entirely in-browser. We're currently using wllama (llama.cpp WASM) but would prefer to use ruvllm-wasm for native RuVector/RVF integration. We already use @ruvector/rvf-wasm for our unified HNSW knowledge base (196 vectors — drugs + EMR + medical knowledge).

Environment

  • @ruvector/ruvllm-wasm@0.1.0
  • Published 1 month ago
  • Node 24.x / Browser (Chrome 133, Safari 18)
