Conversation

@dustincoates (Contributor)

Creating an inference endpoint for ELSER requires a model_id

Summary

The command currently shown on the semantic search with the inference API page (https://www.elastic.co/docs/solutions/search/semantic-search/semantic-search-inference) for creating an inference endpoint with ELSER returns the error "model_id must be provided".

This fixes that error.
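For context, the fix adds a `model_id` to the request's `service_settings`. Assembled from the diff hunks reviewed below, the corrected request looks roughly like this (a sketch; any allocation settings from the docs page are omitted):

```console
PUT _inference/sparse_embedding/elser_embeddings
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".elser_model_2"
  }
}
```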

Generative AI disclosure

  1. Did you use a generative AI (GenAI) tool to assist in creating this contribution?
  • Yes
  • No

@github-actions (Contributor)

✅ Vale Linting Results

No issues found on modified lines!


The Vale linter checks documentation changes against the Elastic Docs style guide.

To use Vale locally or report issues, refer to Elastic style guide for Vale.

@github-actions (Contributor)

🔍 Preview links for changed docs

@seanhandley (Contributor) left a comment

Just one minor change - the conventions here for naming are not intuitive!

```console
PUT _inference/sparse_embedding/elser_embeddings <1>
{
  "service": "elasticsearch",
```

Suggested change

```diff
-  "service": "elasticsearch",
+  "service": "elastic",
```

```console
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".elser_model_2",
```

Suggested change

```diff
-    "model_id": ".elser_model_2",
+    "model_id": "elser_model_2",
```
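Taken together, the two suggestions would change both the service name and the model ID. A sketch of the request with both suggestions applied (unverified; whether the `elastic` service pairs with the undotted `elser_model_2` ID depends on the suggestions being accepted and checked against the inference API reference):

```console
PUT _inference/sparse_embedding/elser_embeddings
{
  "service": "elastic",
  "service_settings": {
    "model_id": "elser_model_2"
  }
}
```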

3 participants