Filtered Corpus Training
A collection of 47 models from the paper "Filtered Corpus Training (FiCT) Shows...". Naming convention: `{filter}-{model}-{seed}`.
How to use CLMBR/binding-domain-transformer-4 with Transformers:
# Use a pipeline as a high-level helper
from transformers import pipeline
pipe = pipeline("text-generation", model="CLMBR/binding-domain-transformer-4")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("CLMBR/binding-domain-transformer-4")
model = AutoModelForCausalLM.from_pretrained("CLMBR/binding-domain-transformer-4")
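With the tokenizer and model loaded, text can be generated as with any causal LM in Transformers. A minimal sketch; the prompt and sampling settings below are illustrative, not taken from the model card:

```python
# Generate a short continuation with the model and tokenizer loaded above.
inputs = tokenizer("Once upon a time,", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,   # illustrative length, not from the card
    do_sample=True,
    temperature=0.5,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```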
How to use CLMBR/binding-domain-transformer-4 with vLLM:

# Install vLLM from pip:
pip install vllm
# Start the vLLM server:
vllm serve "CLMBR/binding-domain-transformer-4"
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "CLMBR/binding-domain-transformer-4",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
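Because the vLLM server speaks an OpenAI-compatible API, the same request can also be made from Python with the `openai` client instead of curl. A minimal sketch, assuming the server started above is listening on the default port 8000:

```python
# Query the vLLM server through its OpenAI-compatible /v1/completions endpoint.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")  # vLLM ignores the key
completion = client.completions.create(
    model="CLMBR/binding-domain-transformer-4",
    prompt="Once upon a time,",
    max_tokens=512,
    temperature=0.5,
)
print(completion.choices[0].text)
```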
How to use CLMBR/binding-domain-transformer-4 with SGLang:
# Install SGLang from pip:
pip install sglang
# Start the SGLang server:
python3 -m sglang.launch_server \
--model-path "CLMBR/binding-domain-transformer-4" \
--host 0.0.0.0 \
--port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "CLMBR/binding-domain-transformer-4",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'

# Or run the SGLang server with Docker:
docker run --gpus all \
--shm-size 32g \
-p 30000:30000 \
-v ~/.cache/huggingface:/root/.cache/huggingface \
--env "HF_TOKEN=<secret>" \
--ipc=host \
lmsysorg/sglang:latest \
python3 -m sglang.launch_server \
--model-path "CLMBR/binding-domain-transformer-4" \
--host 0.0.0.0 \
--port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
-H "Content-Type: application/json" \
--data '{
"model": "CLMBR/binding-domain-transformer-4",
"prompt": "Once upon a time,",
"max_tokens": 512,
"temperature": 0.5
}'
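SGLang also provides a Python frontend that can drive the same server. A minimal sketch, assuming a server launched with either command above is reachable on localhost:30000; the prompt and sampling values mirror the curl examples and are only illustrative:

```python
import sglang as sgl

@sgl.function
def continue_story(s, prompt):
    # Append the prompt, then let the model generate a continuation.
    s += prompt
    s += sgl.gen("continuation", max_tokens=64, temperature=0.5)

# Point the frontend at the running SGLang server (port 30000 above).
sgl.set_default_backend(sgl.RuntimeEndpoint("http://localhost:30000"))

state = continue_story.run(prompt="Once upon a time,")
print(state["continuation"])
```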
How to use CLMBR/binding-domain-transformer-4 with Docker Model Runner:
docker model run hf.co/CLMBR/binding-domain-transformer-4
This model is the binding-domain-filtered transformer (seed 4) from the FiCT collection; the auto-generated card does not record a base model or dataset name. On the evaluation set it reaches a final validation loss of 3.8582 (see the training results below).
More information needed on the model description, intended uses and limitations, and the training and evaluation data.
The following training and validation losses were recorded during training (the hyperparameter list is not included in this card):
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 4.226 | 0.03 | 76320 | 4.1915 |
| 4.0213 | 1.03 | 152640 | 4.0234 |
| 3.9161 | 0.03 | 228960 | 3.9497 |
| 3.8489 | 1.03 | 305280 | 3.9084 |
| 3.7957 | 0.03 | 381600 | 3.8838 |
| 3.7572 | 1.03 | 457920 | 3.8675 |
| 3.7211 | 0.03 | 534240 | 3.8573 |
| 3.6912 | 1.03 | 610560 | 3.8499 |
| 3.6594 | 0.03 | 686880 | 3.8459 |
| 3.6379 | 0.03 | 763200 | 3.8440 |
| 3.6134 | 1.03 | 839520 | 3.8428 |
| 3.5959 | 0.03 | 915840 | 3.8417 |
| 3.5766 | 1.03 | 992160 | 3.8407 |
| 3.557 | 0.03 | 1068480 | 3.8415 |
| 3.5417 | 1.03 | 1144800 | 3.8420 |
| 3.5296 | 0.03 | 1221120 | 3.8431 |
| 3.5121 | 1.03 | 1297440 | 3.8434 |
| 3.4985 | 0.03 | 1373760 | 3.8460 |
| 3.4868 | 1.03 | 1450080 | 3.8457 |
| 3.4784 | 0.03 | 1526400 | 3.8484 |
| 3.4709 | 1.03 | 1602720 | 3.8486 |
| 3.463 | 0.03 | 1679040 | 3.8508 |
| 3.4519 | 1.03 | 1755360 | 3.8514 |
| 3.4409 | 0.03 | 1831680 | 3.8525 |
| 3.4273 | 1.03 | 1908000 | 3.8540 |
| 3.4158 | 0.03 | 1984320 | 3.8543 |
| 3.4037 | 1.03 | 2060640 | 3.8560 |
| 3.3967 | 0.03 | 2136960 | 3.8576 |
| 3.385 | 1.03 | 2213280 | 3.8575 |
| 3.3721 | 0.03 | 2289600 | 3.8587 |
| 3.3609 | 1.03 | 2365920 | 3.8594 |
| 3.3544 | 0.03 | 2442240 | 3.8593 |
| 3.3399 | 0.03 | 2518560 | 3.8608 |
| 3.3327 | 1.03 | 2594880 | 3.8604 |
| 3.324 | 0.03 | 2671200 | 3.8607 |
| 3.3185 | 1.03 | 2747520 | 3.8611 |
| 3.3142 | 0.03 | 2823840 | 3.8610 |
| 3.3078 | 0.03 | 2900160 | 3.8606 |
| 3.3017 | 1.03 | 2976480 | 3.8596 |
| 3.2938 | 0.02 | 3052726 | 3.8582 |
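For reading the table, the validation loss can be converted to perplexity by exponentiation, assuming it is the mean per-token cross-entropy in nats (the usual convention for causal LM training). A small sketch using the final row:

```python
import math

final_val_loss = 3.8582  # final validation loss from the table above (step 3052726)

# Perplexity = exp(mean per-token cross-entropy), assuming the loss is in nats.
perplexity = math.exp(final_val_loss)
print(f"validation perplexity ≈ {perplexity:.1f}")  # ≈ 47.4
```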