Mozilla has officially released Firefox 142.0 to the stable channel, with the rollout expected to reach all users on Tuesday. While the update brings no headline features, changes in underlying support and developer tooling are likely to draw attention to browser extensions and local AI integration.

Firefox 142 is the regular August major release, focused on stability and incremental feature improvements. Notable changes include better drag-and-drop support for Blob images, smoother scrolling in the bookmark properties dialog, and a number of performance tweaks aimed at improving the overall user experience.

For developers, this version introduces two key Web APIs: the Prioritized Task Scheduling API, which gives more precise control over task execution priority to keep pages responsive, and the URLPattern API, which provides a standardized syntax for matching and parsing URLs, simplifying routing logic and improving front-end development efficiency.
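To make the two APIs concrete, here is a brief TypeScript sketch (not taken from Mozilla's documentation): `scheduler.postTask` and the `URLPattern` constructor are the standard API surfaces, while the ambient declarations, the `flushTelemetry` helper, and the route path are illustrative assumptions.

```ts
// Minimal ambient declarations, in case the TypeScript DOM lib in use does
// not yet ship types for these APIs (an assumption about the toolchain).
declare const scheduler: {
  postTask<T>(
    cb: () => T,
    opts?: { priority?: 'user-blocking' | 'user-visible' | 'background' },
  ): Promise<T>;
};
declare class URLPattern {
  constructor(init: { pathname?: string; hostname?: string });
  test(input: string): boolean;
  exec(input: string): { pathname: { groups: Record<string, string | undefined> } } | null;
}

// Prioritized Task Scheduling: push non-urgent work behind input handling.
async function flushTelemetry(events: string[]): Promise<void> {
  await scheduler.postTask(
    () => console.log('sent', events.length, 'events'), // placeholder for a real upload
    { priority: 'background' },
  );
}

// URLPattern: declarative matching of URL components for routing.
const bookRoute = new URLPattern({ pathname: '/books/:id' });
console.log(bookRoute.test('https://example.com/books/42'));                      // true
console.log(bookRoute.exec('https://example.com/books/42')?.pathname.groups.id);  // "42"
```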

However, what has drawn the most attention is Firefox's support for integrating local large language models (LLMs). Firefox 142 officially enables **wllama API support for extensions**, allowing developers to embed WebAssembly-based local LLM functionality directly into browser extensions. The wllama project provides browser-side bindings for llama.cpp, enabling private AI inference without relying on the cloud and paving the way for "local AI browser extensions."
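As a rough illustration, the sketch below shows how an extension-bundled script might call wllama directly. It follows the wllama project's published JavaScript API (`Wllama`, `loadModelFromUrl`, `createCompletion`) rather than any Firefox-specific extension surface; the asset paths, model URL, and sampling options are placeholders.

```ts
import { Wllama } from '@wllama/wllama';

// Paths to the WASM binaries shipped with the wllama package; the exact
// layout depends on how the extension bundles its assets (assumed values).
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': 'assets/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': 'assets/multi-thread/wllama.wasm',
};

async function runLocalCompletion(prompt: string): Promise<string> {
  const wllama = new Wllama(CONFIG_PATHS);

  // Load a small GGUF model; the URL is a placeholder, not a recommendation
  // of any particular model.
  await wllama.loadModelFromUrl('https://example.com/models/tiny-model.gguf');

  // Inference runs entirely in the browser via WebAssembly, so the prompt
  // never leaves the machine.
  return wllama.createCompletion(prompt, {
    nPredict: 64,
    sampling: { temp: 0.7, top_k: 40, top_p: 0.9 },
  });
}
```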

This move strengthens privacy and offline processing, but it may also prove controversial: some users worry that the "AI" label could be misused, or have concerns about where the boundaries of extension permissions lie. Mozilla has opened an "enable wllama for extensions" entry in its bug tracking system so the community can monitor progress and give feedback.

As Firefox continues to explore local, on-device intelligence, version 142 may prove to be another milestone in the browser's evolution toward an "AI-enhanced tool."