
LFM2-350M

LFM2-350M is Liquid AI's smallest text model, designed for edge devices with strict memory and compute constraints. It delivers strong performance for its size, making it well suited to low-latency applications.

Specifications

| Property | Value |
| --- | --- |
| Parameters | 350M |
| Context Length | 32K tokens |
| Architecture | LFM2 (Dense) |
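To make the memory-constraint claim concrete, here is a back-of-the-envelope estimate of the memory needed just to hold the weights of a 350M-parameter model at common precisions. This is illustrative arithmetic only, not an official footprint figure (activations, KV cache, and runtime overhead are excluded):

```python
def weight_memory_gib(num_params: int, bytes_per_param: float) -> float:
    """Approximate memory (GiB) required to hold model weights alone."""
    return num_params * bytes_per_param / 1024**3

PARAMS = 350_000_000  # LFM2-350M parameter count

# Common deployment precisions and their per-parameter storage cost.
for label, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{label}: ~{weight_memory_gib(PARAMS, bpp):.2f} GiB")
```

At fp16 the weights fit in roughly 0.65 GiB, and a 4-bit quantization brings that under 0.2 GiB, which is what makes the model practical on embedded hardware.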

- **Ultra-Light**: minimal memory and compute footprint
- **Low Latency**: fastest inference in the LFM family
- **Edge-Ready**: runs on IoT and embedded devices

Quick Start
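A minimal sketch of loading the model with Hugging Face `transformers`, assuming the Hub identifier `LiquidAI/LFM2-350M` and a recent `transformers` release with LFM2 support; adapt the prompt and generation settings to your use case:

```python
# Hedged quick-start sketch: assumes the LiquidAI/LFM2-350M checkpoint
# is available on the Hugging Face Hub and supported by your
# installed transformers version.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LiquidAI/LFM2-350M"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "What is the capital of France?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```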