Summary
The @ruvector/ruvllm-wasm@0.1.0 npm package is published and installable, but it contains only TypeScript interface stubs — the actual WASM binary from the crates/ruvllm-wasm Rust crate is not compiled or included.
Evidence
The `loadModel()` method in `dist/index.js` (lines 170-188):

```js
async loadModel(source, options) {
  this.status = 'loading';
  // Placeholder - actual implementation requires WASM binary
  console.log('Loading model from:', typeof source === 'string' ? source : 'ArrayBuffer');
  console.log('Note: Full model loading requires the ruvllm-wasm binary.');
  console.log('Build from: crates/ruvllm-wasm');
  this.status = 'ready';
  return { name: 'placeholder', /* ... */ };
}
```

The `generate()` method (lines 192-205):

```js
async generate(prompt, config, onToken) {
  console.log('Note: Full generation requires the ruvllm-wasm binary.');
  return {
    text: '[Placeholder - build ruvllm-wasm crate for actual inference]',
    stats: { tokensGenerated: 0, /* ... */ },
  };
}
```

Comment on lines 131-133:

```js
/**
 * RuvLLM WASM class placeholder
 * Full implementation requires WASM binary from ruvllm-wasm crate
 */
```

Impact
- The README documents a full working API (WebGPU, streaming, chat, IndexedDB caching)
- `npm install @ruvector/ruvllm-wasm` succeeds, but the package does nothing
- Developers integrating this into browser apps (like our AfterCare healthcare PWA) discover at runtime that inference returns placeholder text
- The package has zero dependencies (32.5 KB unpacked) because the WASM binary isn't included
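
Until the binary ships, downstream apps can at least fail fast instead of silently surfacing placeholder text to users. A minimal sketch of a runtime guard; the `GenerateResult` shape is inferred from the stub in `dist/index.js`, and `isPlaceholderResult` is our own helper, not part of the package:

```typescript
// Shape of the object returned by the stubbed generate(),
// inferred from dist/index.js (not an official type).
interface GenerateResult {
  text: string;
  stats: { tokensGenerated: number };
}

// Heuristic guard: the 0.1.0 stub always returns bracketed placeholder
// text and zero generated tokens, so either signal indicates that the
// WASM binary is missing from the installed package.
function isPlaceholderResult(result: GenerateResult): boolean {
  return result.stats.tokensGenerated === 0 || result.text.startsWith('[Placeholder');
}

// Output matching what @ruvector/ruvllm-wasm@0.1.0 actually returns:
const stubOutput: GenerateResult = {
  text: '[Placeholder - build ruvllm-wasm crate for actual inference]',
  stats: { tokensGenerated: 0 },
};

console.log(isPlaceholderResult(stubOutput)); // true — the app can abort early
```

An app can call this after the first `generate()` and surface a clear error rather than rendering the placeholder string in the UI.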
Expected Behavior
The npm package should include the compiled WASM binary from crates/ruvllm-wasm, or the package should be marked as alpha/experimental with a clear warning that it requires manual WASM compilation.
Request
- Compile the `crates/ruvllm-wasm` Rust crate with `wasm-pack build --target web`
- Include the resulting `.wasm` binary in the npm package
- Wire up `loadModel()`, `generate()`, and `chat()` to call the actual WASM functions
- Publish as `@ruvector/ruvllm-wasm@0.2.0`
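
For the wiring step, one possible shape is to have the JS wrapper delegate to the wasm-bindgen output instead of logging placeholders. This is only a sketch under assumptions: the `RuvllmWasmModule` interface and its method names are hypothetical (the real exports from `crates/ruvllm-wasm` may differ), and the module is injected so the wrapper stays testable even without the `.wasm` binary present:

```typescript
// Hypothetical shape of the wasm-bindgen glue produced by
// `wasm-pack build --target web`; actual export names may differ.
interface RuvllmWasmModule {
  loadModel(bytes: Uint8Array): Promise<void>;
  generate(prompt: string): Promise<string>;
}

// Sketch: the wrapper delegates to the compiled module rather than
// returning placeholder objects. Injecting the module keeps the
// wrapper unit-testable with a fake implementation.
class RuvLLM {
  constructor(private wasm: RuvllmWasmModule) {}

  async loadModel(source: ArrayBuffer): Promise<void> {
    await this.wasm.loadModel(new Uint8Array(source));
  }

  async generate(prompt: string): Promise<string> {
    return this.wasm.generate(prompt);
  }
}
```

The same pattern would extend to `chat()` and streaming callbacks once the real export surface is known.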
Context
We're building Bridgepoint AfterCare, a 100% offline healthcare PWA that runs an LLM entirely in-browser. We're currently using wllama (llama.cpp WASM) but would prefer to use ruvllm-wasm for native RuVector/RVF integration. We already use @ruvector/rvf-wasm for our unified HNSW knowledge base (196 vectors — drugs + EMR + medical knowledge).
Environment
- `@ruvector/ruvllm-wasm@0.1.0`, published 1 month ago
- Node 24.x / Browser (Chrome 133, Safari 18)