Announcing vLLM-Omni: Easy, Fast, and Cheap Omni-Modality Model Serving

Posted on December 12, 2025 by Ivan Ortega