About

This repository provides the Whisper small model compiled and optimized for Mobilint NPU hardware.
The model is packaged for deployment on Mobilint's acceleration stack and is intended to run within that environment.
