LM Studio is a desktop application for running and experimenting with Large Language Models (LLMs) locally. It can run compatible models in the GGUF format, including Llama, Mistral, Phi, Gemma, and StarCoder, without requiring an internet connection.
The application bundles the core tools for local model management: a built-in chat interface, an API server compatible with the OpenAI standard, a search and download system connected directly to Hugging Face repositories, and tools for configuring inference parameters.
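Because the API follows the OpenAI standard, existing OpenAI-style client code can usually be pointed at the local server instead of a remote one. A minimal sketch using only the Python standard library, assuming the server is running at LM Studio's default address of http://localhost:1234/v1 (the port is configurable in the app):

```python
import json
import urllib.request

# Assumed default address of LM Studio's local server; adjust if configured differently.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build a request body in the OpenAI chat-completions format."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,
    }

def chat(model: str, user_message: str) -> str:
    """POST the request to the local server and return the assistant's reply text."""
    body = json.dumps(build_chat_request(model, user_message)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

The same request shape works with official OpenAI client libraries by overriding their base URL, which is the point of the compatibility layer.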
LM Studio collects no usage data and keeps all information on the local device. The application automatically detects available hardware and splits inference workloads between GPU and CPU, with options for manual configuration. Minimum requirements are an Apple Silicon Mac (M1/M2/M3) or a Windows/Linux computer with a CPU supporting AVX2 instructions.
The software can process documents through RAG (Retrieval Augmented Generation) for PDF and plain-text files, with a 30 MB per-file limit. The interface lets users organize conversations into folders, display token counts, customize prompt templates, and manage multiple generations within a chat.
The user interface offers four visual themes (Dark, Light, Sepia, and Dark Retro) and follows the system dark mode setting. The application is available in English, Spanish, German, Russian, Turkish, and Norwegian, with additional languages supported through community contributions.
Advanced technical features include embedding models that convert text into numerical vector representations, the ability to process and analyze images within conversations, a progress indicator for prompt processing, and the capability to serve multiple models over the local network.
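Embedding vectors are useful because semantically similar texts map to nearby vectors, typically compared by cosine similarity. A minimal sketch of that comparison (the vectors here are placeholders; in practice they would come from an embedding model served locally):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 means the same direction,
    0.0 means orthogonal (unrelated) directions."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

This is the standard building block behind semantic search over embeddings, including the retrieval step of RAG pipelines.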