Enh/AP-24075 Mistral AI LLM Selector node #62
ya-hn wants to merge 3 commits into enh/AP-24074-mistral-ai-authenticator-node from
Conversation
| ) | ||
|
|
||
|
|
||
| class MistralChatModelSettings(GeneralRemoteSettings): |
There was a problem hiding this comment.
GeneralRemoteSettings contains the parameter n_requests with since_version="5.3.0". I am not sure whether that could lead to problems, but we also used GeneralRemoteSettings for other LLM Selectors that we added after 5.3.
        min_value=0.01,
        max_value=1.0,
        is_advanced=True,
    )
For newer providers we often don't expose a top_p parameter (Anthropic, DeepSeek, Gemini). If omitting it is a deliberate choice to keep the options simple, I can also create a new parameter group that does not extend GeneralRemoteSettings and therefore has no top_p.
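To illustrate the suggestion, here is a minimal plain-Python stand-in for such a parameter group (the class name, field set, and value ranges are all hypothetical; the real implementation would use the KNIME parameter-group API rather than a dataclass):

```python
from dataclasses import dataclass

# Hypothetical stand-in for a settings group that does NOT extend
# GeneralRemoteSettings: it carries only temperature and max_tokens,
# deliberately omitting top_p, mirroring the newer provider selectors.
@dataclass
class MistralChatModelSettingsNoTopP:
    temperature: float = 1.0
    max_tokens: int = 200

    def validate(self) -> None:
        # Range [0.0, 1.0] is assumed here for illustration.
        if not 0.0 <= self.temperature <= 1.0:
            raise ValueError("temperature must be in [0.0, 1.0]")
        if self.max_tokens < 1:
            raise ValueError("max_tokens must be positive")
```

The point of the sketch is only that dropping top_p from the group removes it from both the dialog and the settings dict, so no fallback handling for it is needed anywhere downstream.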
        temperature=data["temperature"],
        top_p=data["top_p"],
        max_tokens=data["max_tokens"],
        n_requests=data.get("n_requests", 1),
I don't think we need the fallback .get() here, but I wasn't sure because the parameter has the since_version="5.3.0" flag.
        node_type=knext.NodeType.SOURCE,
        icon_path=mistral_icon,
        category=mistral_category,
        keywords=["Mistral", "GenAI", "Gen AI", "Generative AI"],
I am not sure whether there are Mistral-specific keywords that would be useful here.
| "mistral-large-latest", | ||
| "mistral-small-latest", | ||
| "open-mistral-nemo", | ||
| ] |
There was a problem hiding this comment.
This list of models was generated by Copilot. I think the large and small models make sense, but I'm not sure about open-mistral-nemo; we could also remove it or extend the list with other models.
Adds a Mistral AI LLM Selector node. Since all chat models are fetched from the API, the list in the model parameter gets quite large.
There is also a commit that renames old mentions of LLM Prompter (Conversation) and LLM Prompter (Table), which we likely missed when renaming the nodes to LLM Prompter and LLM Chat Prompter.
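One way to keep the fetched list manageable would be to filter it client-side before populating the parameter. The sketch below assumes Mistral's OpenAI-style GET /v1/models endpoint and uses "ends with -latest" as one possible heuristic; both the endpoint shape and the filter rule are assumptions, not what the node currently does:

```python
import json
import urllib.request

# Assumed endpoint; Mistral's API follows the OpenAI-style model listing.
MISTRAL_MODELS_URL = "https://api.mistral.ai/v1/models"

def fetch_model_ids(api_key: str) -> list[str]:
    """Fetch all model ids from the Mistral API (untested sketch)."""
    req = urllib.request.Request(
        MISTRAL_MODELS_URL, headers={"Authorization": f"Bearer {api_key}"}
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    return [m["id"] for m in payload["data"]]

def curated(model_ids: list[str]) -> list[str]:
    """Keep only the rolling '-latest' aliases so the drop-down stays small."""
    return sorted(m for m in model_ids if m.endswith("-latest"))
```

For example, `curated(["mistral-large-latest", "open-mistral-nemo", "mistral-small-latest"])` keeps only the two `-latest` aliases, dropping pinned or snapshot ids from the drop-down.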