Qwen3-Next: 80B instruct LLM with ultra-long context up to 1M tokens
A category-based approach to exploring film data
Grok-2.5: large-scale xAI model for local inference with SGLang
Powerful 14B LLM with strong instruction and long-text handling
Open Finance and Data Residency Platform
WPBot Lite: chatbot for WordPress
Robust BERT-based model for English with improved MLM training
Python computer vision & video analytics framework, batteries included
Qwen2.5-VL-3B-Instruct: Multimodal model for chat, vision & video
Portuguese ASR model fine-tuned on XLSR-53 for 16kHz audio input
High-performance MoE model with MLA, MTP, and multilingual reasoning
High-efficiency reasoning and agentic intelligence model
Meta's instruction-tuned 1.2B LLM for multilingual text generation
ClinicalBERT model trained on MIMIC notes for clinical NLP tasks
Small 3B-base multimodal model ideal for custom AI on edge hardware