Meta has recently released Spatial Lingo, an open-source Unity application, on GitHub. The app is designed to help users practice languages by recognizing objects in the real world. Built on several of Meta's software development kits (SDKs), it offers a novel and engaging language learning experience.
The core concept of Spatial Lingo is to strengthen users' language skills by guiding them to identify and describe objects in their surroundings in the target language. Under the guidance of a virtual character named Golly Gosh, users practice vocabulary with familiar items around them. The application supports both hand tracking and controller input, deepening the sense of immersion.

The project is designed to give Unity developers a working understanding of several Meta features and development templates. Using the Passthrough Camera API (PCA), the Voice SDK, and the Mixed Reality Utility Kit (MRUK), developers can create richer mixed reality experiences. The main scene and several example scenes in Spatial Lingo demonstrate how to implement these features, including a gym scene, a word cloud scene, and a camera image scene.
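To illustrate how MRUK can supply real-world objects for a lesson like this, here is a minimal sketch that collects labeled scene anchors once the room scan loads. It is not code from the Spatial Lingo repository: the class name VocabularyObjectFinder is hypothetical, and the MRUK calls (MRUK.Instance.RegisterSceneLoadedCallback, GetCurrentRoom, HasAnyLabel) assume a recent version of the Mixed Reality Utility Kit, where exact names may differ across releases.

```csharp
using System.Collections.Generic;
using UnityEngine;
using Meta.XR.MRUtilityKit; // MRUK package namespace (assumed for recent versions)

// Sketch: once MRUK has loaded the scene model, collect labeled anchors
// (table, couch, plant, ...) as candidate objects for vocabulary practice.
public class VocabularyObjectFinder : MonoBehaviour
{
    void Start()
    {
        // The callback fires after the scanned room data is available.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        var candidates = new List<MRUKAnchor>();
        foreach (MRUKAnchor anchor in room.Anchors)
        {
            // HasAnyLabel filters anchors by their semantic classification.
            if (anchor.HasAnyLabel(MRUKAnchor.SceneLabels.TABLE |
                                   MRUKAnchor.SceneLabels.COUCH |
                                   MRUKAnchor.SceneLabels.PLANT))
            {
                candidates.Add(anchor);
            }
        }
        Debug.Log($"Found {candidates.Count} labeled objects for vocabulary practice.");
    }
}
```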
To use Spatial Lingo, developers need Unity 6000.0.51f1 or higher and a configured Llama API key. The project also stresses security, reminding developers not to embed API keys in the application build, where they could be extracted.
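One common way to honor that warning is to keep the key on a server you control and have the headset call your own endpoint instead of the Llama API directly. The sketch below shows that pattern with Unity's standard UnityWebRequest; the proxy URL and the LessonRequest payload are hypothetical placeholders, not part of the Spatial Lingo project.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch: the headset build never holds the Llama API key. It posts to a
// developer-controlled proxy server, and the proxy attaches the key server-side.
public class LessonRequestClient : MonoBehaviour
{
    // Hypothetical endpoint; replace with your own backend.
    const string ProxyUrl = "https://example.com/api/generate-lesson";

    public IEnumerator RequestLesson(string objectLabel)
    {
        string body = JsonUtility.ToJson(new LessonRequest { label = objectLabel });
        using (var request = new UnityWebRequest(ProxyUrl, UnityWebRequest.kHttpVerbPOST))
        {
            request.uploadHandler = new UploadHandlerRaw(System.Text.Encoding.UTF8.GetBytes(body));
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");
            yield return request.SendWebRequest();

            if (request.result == UnityWebRequest.Result.Success)
                Debug.Log($"Lesson payload: {request.downloadHandler.text}");
            else
                Debug.LogError($"Proxy call failed: {request.error}");
        }
    }

    [System.Serializable]
    class LessonRequest { public string label; }
}
```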
Spatial Lingo's feature set makes it a capable language-learning tool. The application recognizes objects in the user's environment and generates dynamic language lessons around them. As users grow their language tree, the system automatically generates verbs and adjectives related to those objects, enriching the lesson content. The application also includes text-to-speech and transcription, with support for multiple languages, further enhancing interactivity.
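On the text-to-speech side, Meta's Voice SDK provides a TTSSpeaker component that can voice generated words aloud. The following is a minimal sketch rather than code from the repository; it assumes a recent Voice SDK where TTSSpeaker lives in the Meta.WitAi.TTS.Utilities namespace and a configured TTS service already exists in the scene.

```csharp
using UnityEngine;
using Meta.WitAi.TTS.Utilities; // Voice SDK TTS namespace (assumed for recent versions)

// Sketch: speak a generated vocabulary word aloud via the Voice SDK.
// Assumes a TTSSpeaker (wired to a TTS service) is assigned in the Inspector.
public class WordSpeaker : MonoBehaviour
{
    [SerializeField] private TTSSpeaker speaker;

    // Called by the lesson logic whenever a new word should be pronounced.
    public void Pronounce(string word)
    {
        // Speak() queues the phrase for synthesis and audio playback.
        speaker.Speak(word);
    }
}
```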
The project is not only a foray by Meta into education; it also gives developers a solid open-source example of how to combine mixed reality technology with language learning. Developers can find Spatial Lingo on GitHub, where the source code and documentation are available for further exploration.
Repository: https://github.com/oculus-samples/Unity-SpatialLingo

