404 error on the free Inference API endpoint with model all-MiniLM-L6-v2

I'm a newbie using this endpoint for free to generate embeddings: https://huggingface.co/proxy/api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-MiniLM-L6-v2

I'm getting a 404 consistently. Going through previous posts, I found nothing concrete except that the issue seems to come and go. Can someone kindly update us on when this will be resolved? I was so excited to learn, but the bubble has burst because of this.


The usage (URL, etc.) has changed considerably. Here is the new usage.


Thanks a mil! I tried the following Python call:

embedding_url = "https://huggingface.co/proxy/router.huggingface.co/hf-inference/models/sentence-transformers/all-MiniLM-L6-v2/pipeline/feature-extraction"
response = requests.post(embedding_url, headers=headers, json={"inputs": text})
if response.status_code != 200:
    raise ValueError(f"Request failed with status code {response.status_code}: {response.text}")

This is the response even after generating multiple new tokens. I can't get past this 401. Is it not possible to succeed with a free account?

401

Unauthorized access. Please check your credentials or authorization


You may need to pass a header containing your token in the requests call.
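A minimal sketch of what that could look like, assuming you have a valid Hugging Face access token in an `HF_TOKEN` environment variable (the variable name and the `get_embedding` helper are just illustrative, not an official API):

import os
import requests

# Router URL from the earlier post in this thread
EMBEDDING_URL = (
    "https://huggingface.co/proxy/router.huggingface.co/hf-inference/models/"
    "sentence-transformers/all-MiniLM-L6-v2/pipeline/feature-extraction"
)

def build_headers(token: str) -> dict:
    # The Authorization header is what clears the 401 Unauthorized
    return {"Authorization": f"Bearer {token}"}

def get_embedding(text: str, token: str):
    response = requests.post(
        EMBEDDING_URL,
        headers=build_headers(token),  # must be passed as the headers= keyword
        json={"inputs": text},
    )
    if response.status_code != 200:
        raise ValueError(
            f"Request failed with status code {response.status_code}: {response.text}"
        )
    return response.json()

if __name__ == "__main__":
    # Requires a real token with Inference permissions
    embedding = get_embedding("Hello world", os.environ["HF_TOKEN"])
    print(len(embedding))

Note that in the earlier snippet `headers` was passed positionally, so requests treated it as the `data` argument and no Authorization header was sent at all; it has to go in as `headers=...`.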