Instructions for using facebook/flava-full with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use facebook/flava-full with Transformers:
```python
# Load model directly
from transformers import AutoProcessor, AutoModelForPreTraining

processor = AutoProcessor.from_pretrained("facebook/flava-full")
model = AutoModelForPreTraining.from_pretrained("facebook/flava-full")
```
- Notebooks
- Google Colab
- Kaggle
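Beyond loading the checkpoint, a common next step is extracting embeddings. The sketch below uses `AutoModel` (which resolves to `FlavaModel`) instead of the pre-training head, since its output exposes separate image, text, and multimodal embeddings; the blank placeholder image and caption text are illustrative assumptions, not part of the model card.

```python
import torch
from PIL import Image
from transformers import AutoModel, AutoProcessor

processor = AutoProcessor.from_pretrained("facebook/flava-full")
model = AutoModel.from_pretrained("facebook/flava-full")

# Placeholder inputs for illustration; replace with a real image and caption.
image = Image.new("RGB", (224, 224), color="white")
inputs = processor(
    text=["a photo of a white square"],
    images=[image],
    return_tensors="pt",
    padding=True,
)

with torch.no_grad():
    outputs = model(**inputs)

# FlavaModel returns per-modality and fused (multimodal) token embeddings.
print(outputs.image_embeddings.shape)       # (batch, image_seq_len, hidden)
print(outputs.text_embeddings.shape)        # (batch, text_seq_len, hidden)
print(outputs.multimodal_embeddings.shape)  # (batch, fused_seq_len, hidden)
```

For pre-training objectives (masked image/text modeling, image-text matching), keep `AutoModelForPreTraining` as shown above; for plain feature extraction, `AutoModel` is the lighter interface.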
Community discussions:
- #7 "Adding `safetensors` variant of this model" — opened 8 months ago by SFconvertbot
- #6 "Adding `safetensors` variant of this model" — opened 9 months ago by SFconvertbot
- #5 "Adding `safetensors` variant of this model" — opened about 2 years ago by SFconvertbot
- #4 "Incorrect comments in example" — opened about 2 years ago by mjspeck
- #2 "[AUTOMATED] Model Memory Requirements" — opened over 2 years ago by model-sizer-bot