Enh/ap 24075 mistral ai llm selector node #62

Open
ya-hn wants to merge 3 commits into enh/AP-24074-mistral-ai-authenticator-node from enh/AP-24075-mistral-ai-llm-selector-node

Conversation

@ya-hn (Contributor) commented Mar 17, 2026

Adds a Mistral AI LLM Selector node. Since all chat models are fetched from the API, the list in the model parameter gets quite large.

There is also a commit that renames old mentions of LLM Prompter (Conversation) and LLM Prompter (Table) which we likely missed when renaming the nodes to LLM Prompter and LLM Chat Prompter.

@ya-hn ya-hn requested a review from a team as a code owner March 17, 2026 14:41
@ya-hn ya-hn requested review from knime-ghub-bot and removed request for a team March 17, 2026 14:41
)


class MistralChatModelSettings(GeneralRemoteSettings):
@ya-hn (Author) commented:

GeneralRemoteSettings contains the parameter n_requests with since_version="5.3.0". I am not sure whether that could lead to problems, but we also used GeneralRemoteSettings for other LLM Selectors that we added after 5.3.

min_value=0.01,
max_value=1.0,
is_advanced=True,
)
@ya-hn (Author) commented:

For newer providers (Anthropic, DeepSeek, Gemini), we often don't expose a top_p parameter. If that is a deliberate choice to keep the options simple, I can also create a new parameter group that does not extend GeneralRemoteSettings and therefore has no top_p.
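For illustration, a minimal plain-Python sketch of such a standalone parameter group without top_p (the class name and defaults are hypothetical; the real node would use a knext parameter group, not a dataclass):

```python
from dataclasses import dataclass, asdict


# Hypothetical sketch: a settings group that does not extend
# GeneralRemoteSettings and therefore carries no top_p parameter.
@dataclass
class MistralChatSettingsNoTopP:
    temperature: float = 0.7
    max_tokens: int = 4096
    n_requests: int = 1  # kept, since the prompter nodes use it


settings = asdict(MistralChatSettingsNoTopP())
```

The point of the sketch is only that dropping the base class removes top_p while the remaining parameters keep their defaults.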

temperature=data["temperature"],
top_p=data["top_p"],
max_tokens=data["max_tokens"],
n_requests=data.get("n_requests", 1),
@ya-hn (Author) commented:

I don't think we need the .get() fallback here, but I was not sure because the parameter has the since_version="5.3.0" flag.
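To make the concern concrete, a small plain-Python sketch (the helper name is hypothetical) of what the .get() fallback protects against:

```python
def read_n_requests(data: dict) -> int:
    # Settings dicts saved by versions before 5.3.0 may not contain
    # "n_requests" at all, so .get() with a default of 1 keeps those older
    # workflows loadable. If the framework already backfills defaults for
    # since_version parameters, data["n_requests"] would be safe too.
    return data.get("n_requests", 1)


pre_53 = {"temperature": 0.2, "top_p": 1.0, "max_tokens": 100}  # no n_requests
post_53 = {**pre_53, "n_requests": 4}
```

Whether the fallback is needed thus depends on whether the framework backfills defaults for parameters introduced after the workflow was saved; the .get() is harmless either way.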

node_type=knext.NodeType.SOURCE,
icon_path=mistral_icon,
category=mistral_category,
keywords=["Mistral", "GenAI", "Gen AI", "Generative AI"],
@ya-hn (Author) commented:

I am not sure whether there are Mistral-specific keywords that would be useful here.

"mistral-large-latest",
"mistral-small-latest",
"open-mistral-nemo",
]
@ya-hn (Author) commented:

This list of models was generated by Copilot. I think the large and small models make sense, but I'm not sure about open-mistral-nemo; we could remove it or extend the list with other models.
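If the defaults should instead track the live model list, one option is to filter the fetched models down to chat-capable ones. A sketch, with the caveat that the exact /v1/models response schema is an assumption here and would need to be checked against the Mistral API docs:

```python
def chat_model_ids(models_payload: dict) -> list[str]:
    # Assumed payload shape for Mistral's /v1/models endpoint:
    # {"data": [{"id": "...", "capabilities": {"completion_chat": bool}}, ...]}
    return sorted(
        m["id"]
        for m in models_payload.get("data", [])
        if m.get("capabilities", {}).get("completion_chat", False)
    )


sample = {
    "data": [
        {"id": "mistral-large-latest", "capabilities": {"completion_chat": True}},
        {"id": "mistral-embed", "capabilities": {"completion_chat": False}},
    ]
}
```

This would avoid hard-coding model names at all, at the cost of needing a network call (or cached response) wherever the default list is built.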
