
Solving “cannot find function” errors after updating WebLLM

Author: Christian Liebel • Published: 27.02.2024 • Category: AI

In my last blog post, I explained how to solve common build-time errors when adding WebLLM to your application. But some errors only show up at runtime. Among these, the following issues stand out as particularly frequent:

ERROR Error: Cannot find global function vm.builtin.apply_presence_and_frequency_penalty
    at Module.getGlobalFuncInternal (index.js:2372:23)

Or:

ERROR Error: Cannot find function _initialize_effect
    at Module.getFunction (index.js:1922:23)
    at VirtualMachine.getFunction (index.js:2052:29)
    at new LLMChatPipeline (index.js:4160:36)

If you encounter any of these errors, it’s likely due to a recent update of WebLLM. While the JavaScript portion of WebLLM is distributed with your application, the model-specific WebAssembly binary is stored in the client-side cache on the user’s device. This could lead to scenarios where the versions become incompatible, for example, when the WebAssembly binary was downloaded using an older version of WebLLM.
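Because the cache lives in the browser's Cache Storage, you can inspect it with the standard Cache Storage API. The following snippet is purely illustrative; it lists the URLs of the Wasm binaries WebLLM has stored under the webllm/wasm cache on the current device (run it in the DevTools console or in a module with top-level await):

if (await caches.has('webllm/wasm')) {
  const wasmCache = await caches.open('webllm/wasm');
  const cachedRequests = await wasmCache.keys();
  // URLs of the cached model-specific WebAssembly binaries
  console.log(cachedRequests.map((request) => request.url));
}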

Manually clearing the cache

This error can be resolved by removing the WebAssembly binary from the cache and downloading it again. This can be done via the developer tools:

  1. Press F12 to open the Developer Tools
  2. Go to the “Application” tab
  3. Select “Cache storage”
  4. Find the “webllm/wasm” entry
  5. Delete the WebAssembly binary for your model

Programmatically clearing the cache

You can also clear the cache via JavaScript. To remove the entire cache, call caches.delete('webllm/wasm').
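For example (caches.delete() resolves to true if the cache existed and was removed):

// Deletes the entire webllm/wasm cache; the Wasm binary will be
// re-downloaded on the next pipeline initialization.
const deleted = await caches.delete('webllm/wasm');
console.log(deleted ? 'Wasm cache cleared' : 'No Wasm cache found');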

From version 0.2.24 onwards, WebLLM provides a helper function, deleteModelWasmInCache(), that deletes the Wasm binary for a specific model. It takes the model's local ID from the model list as a parameter, e.g. deleteModelWasmInCache('Llama-2-7b-chat-hf-q4f32_1').
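A minimal sketch, assuming the helper is exported from the @mlc-ai/web-llm npm package in the version you have installed:

import { deleteModelWasmInCache } from '@mlc-ai/web-llm';

// Removes only the cached Wasm binary for the given model
// (identified by its local ID from the model list).
await deleteModelWasmInCache('Llama-2-7b-chat-hf-q4f32_1');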

On the next attempt to initialize the LLM pipeline, the new WebAssembly binary will be downloaded, and the error should be resolved.

A potential workaround

Unfortunately, it is hard to detect whether the binary in the cache is compatible with the current WebLLM version. I even ran into this error on the official WebLLM website, and it would be up to the authors of the WebLLM library to introduce a cache invalidation mechanism. As a workaround, you could wrap the pipeline initialization logic in a try/catch block, check whether the error message contains Cannot find global function or Cannot find function, then clear the cache and try to initialize the pipeline again (see the sketch below). I filed an issue with the WebLLM authors to resolve the problem.
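Here is a minimal sketch of that workaround. The init callback stands in for your own initialization logic, and the import of deleteModelWasmInCache() assumes the @mlc-ai/web-llm package mentioned above; adapt both to your setup:

import { deleteModelWasmInCache } from '@mlc-ai/web-llm';

async function initPipelineWithRetry(
  modelId: string,
  // Placeholder for your own logic that loads the model and sets up the pipeline
  init: (modelId: string) => Promise<void>,
): Promise<void> {
  try {
    await init(modelId);
  } catch (err) {
    const message = err instanceof Error ? err.message : String(err);
    if (message.includes('Cannot find global function') ||
        message.includes('Cannot find function')) {
      // The cached Wasm binary is likely incompatible with the current
      // WebLLM version: clear it and retry once.
      await deleteModelWasmInCache(modelId);
      await init(modelId);
    } else {
      throw err;
    }
  }
}

// Usage: pass in whatever you already do to set up the WebLLM pipeline, e.g.
// await initPipelineWithRetry('Llama-2-7b-chat-hf-q4f32_1', myInitLogic);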

