LM Studio - Run Open Source AI Models Locally on Your Desktop
Overview
LM Studio is a desktop application developed by Element Labs that enables you to run large language models (LLMs) completely locally on your personal computer. Released in 2023, it has become a popular choice for developers and AI enthusiasts who want to experiment with models like gpt-oss, Llama, Qwen, Gemma, and DeepSeek without relying on cloud services. The application is free, cross-platform (macOS, Windows, Linux), and emphasizes privacy-first, on-device AI processing.
Top Recommended Resources
1. LM Studio - Local AI on your computer
- Direct download links for all supported platforms (macOS, Windows, Linux)
- Overview of supported models and developer SDKs (JavaScript and Python)
- Introduction to llmster, the recently released command-line tool for server deployments
- Clear emphasis on privacy features and offline capabilities
2. Get started with LM Studio | LM Studio Docs
- Detailed system requirements for each operating system
- Three-step quickstart: download a model, load it into memory, and start chatting
- Explanation of offline capabilities and when internet is required (only for downloading models)
- Links to advanced features like MCP servers, presets, and speculative decoding
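The three quickstart steps above can also be driven from code rather than the GUI. A minimal sketch using the lmstudio Python SDK (`pip install lmstudio`); the model name and prompt are illustrative, and it assumes the LM Studio app (or its local server) is already running:

```python
# Sketch of the quickstart flow: load a model, then chat with it.
# Assumes LM Studio is running locally; the model name is illustrative.
def quickstart(model_name: str = "qwen2.5-7b-instruct") -> str:
    import lmstudio as lms        # deferred import: only needed at runtime
    model = lms.llm(model_name)   # loads the model into memory if needed
    return str(model.respond("In one sentence, what can you do offline?"))

if __name__ == "__main__":
    print(quickstart())
```

This mirrors the GUI flow: model download and loading happen on first use, and subsequent calls reuse the already-loaded model.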
3. How to run gpt-oss locally with LM Studio
- Hardware requirements for different model sizes (20B requires 16GB+ VRAM, 120B requires 60GB+)
- Code examples in both Python and TypeScript for API integration
- Practical tool-calling examples showing how to implement local function execution
- Instructions for pointing the OpenAI SDK at LM Studio's local endpoint (http://localhost:1234/v1)
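Because the local server speaks the OpenAI wire format, any OpenAI-style client works by swapping the base URL. A stdlib-only sketch of a chat completion request against the default local endpoint (the model name is illustrative, and a model must already be loaded; with the OpenAI SDK itself you would instead pass `base_url="http://localhost:1234/v1"`):

```python
import json
import urllib.request

# LM Studio's OpenAI-compatible chat endpoint (default port 1234).
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> bytes:
    """Build an OpenAI-style chat completion payload."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    return json.dumps(payload).encode("utf-8")

def chat(model: str, prompt: str) -> str:
    """Send one chat turn to the local server and return the reply text."""
    req = urllib.request.Request(
        LMSTUDIO_URL,
        data=build_chat_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires LM Studio running with a model loaded.
    print(chat("openai/gpt-oss-20b", "Say hello in one sentence."))
```

No API key is required for the local server; clients that insist on one can pass any non-empty placeholder string.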
4. First look: Run LLMs locally with LM Studio | InfoWorld
- Real-world evaluation testing multiple models
- Honest assessment of integration limitations (manual configuration required for tools)
- Coverage of REST API capabilities including Anthropic-compatible endpoints
- Discussion of licensing considerations (proprietary but currently free)
5. Welcome to LM Studio Docs! | LM Studio Docs
- Complete sidebar navigation covering all aspects of the platform
- Developer sections for TypeScript/Python SDKs, CLI tools, and REST APIs
- Coverage of RAG (Retrieval-Augmented Generation) for document interaction
- Information on Model Context Protocol (MCP) server integration for extended functionality
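As a small taste of the REST API the docs cover, the OpenAI-compatible `/v1/models` endpoint lists whatever models are available locally. A hedged stdlib sketch, assuming the server is running on the default port 1234:

```python
import json
import urllib.request

def parse_model_ids(listing: dict) -> list:
    """Extract model identifiers from an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in listing.get("data", [])]

def list_local_models(base: str = "http://localhost:1234") -> list:
    """Query LM Studio's local server for available model IDs."""
    with urllib.request.urlopen(f"{base}/v1/models") as resp:
        return parse_model_ids(json.load(resp))

if __name__ == "__main__":
    print(list_local_models())
```

Returned IDs can then be passed as the `model` field in chat completion requests.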
My Recommendation
Start with the official homepage to download LM Studio, then work through the getting-started guide to run your first model. If you're a developer interested in integration, the OpenAI gpt-oss tutorial provides excellent practical examples. The InfoWorld review is worth reading for a realistic assessment of what LM Studio can and cannot do. For ongoing reference, bookmark the documentation portal—it's well-organized and covers everything from basic usage to advanced SDK integration.
LM Studio excels at making local LLM deployment accessible without sacrificing capability. Whether you're experimenting with AI privately, building applications with offline requirements, or simply want full control over your AI infrastructure, these resources will guide you from installation to advanced integration.