
model upgrade for hard code #1306

Merged
yileicn merged 4 commits into SciSharp:master from adenchen123:master
Mar 11, 2026

Conversation

@adenchen123
Contributor

No description provided.

@qodo-code-review
Contributor

Review Summary by Qodo

Implement dynamic LLM model upgrade mechanism via ISettingService

✨ Enhancement


Walkthrough

Description
• Add dynamic LLM model upgrade mechanism via ISettingService
• Replace hardcoded model strings with Gpt4xModelConstants references
• Add new GPT-4o model variants to constants (search, transcribe, TTS)
• Update test configurations to use GPT-5 models
Diagram
flowchart LR
  A["Hardcoded Model Strings"] -->|Replace with| B["Gpt4xModelConstants"]
  B -->|Upgrade via| C["ISettingService.GetUpgradeModel"]
  C -->|Returns| D["Upgraded Model Name"]
  E["New Model Variants"] -->|Added to| B
  F["Multiple Services"] -->|Integrate| C


File Changes

1. src/Infrastructure/BotSharp.Abstraction/Models/Gpt4xModelConstants.cs ✨ Enhancement +4/-0
   Add new GPT-4o model variant constants

2. src/Infrastructure/BotSharp.Abstraction/Realtime/Models/ModelTurnDetection.cs ✨ Enhancement +1/-1
   Replace hardcoded transcribe model with constant

3. src/Infrastructure/BotSharp.Abstraction/Realtime/Settings/RealtimeModelSettings.cs ✨ Enhancement +1/-1
   Replace hardcoded realtime model with constant

4. src/Infrastructure/BotSharp.Core.Realtime/Services/RealtimeHub.cs ✨ Enhancement +3/-1
   Integrate ISettingService for model upgrade

5. src/Infrastructure/BotSharp.Core/Conversations/Services/ConversationService.Summary.cs ✨ Enhancement +4/-1
   Use ISettingService for dynamic model selection

6. src/Infrastructure/BotSharp.Core/Files/Services/Instruct/FileInstructService.Image.cs ✨ Enhancement +4/-1
   Apply model upgrade mechanism to image reading

7. src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs ✨ Enhancement +6/-2
   Integrate ISettingService in audio and realtime providers

8. src/Infrastructure/BotSharp.Core/Instructs/Services/InstructService.Instruct.cs ✨ Enhancement +4/-1
   Use ISettingService for AI response model selection

9. src/Infrastructure/BotSharp.Core/Shared/JsonRepairService.cs ✨ Enhancement +4/-1
   Apply model upgrade to JSON repair LLM calls

10. src/Infrastructure/BotSharp.Core/WebSearch/Functions/WebIntelligentSearchFn.cs ✨ Enhancement +5/-2
    Use ISettingService for web search model upgrade

11. src/Plugins/BotSharp.Plugin.AudioHandler/Functions/ReadAudioFn.cs ✨ Enhancement +5/-1
    Integrate ISettingService for audio transcription

12. src/Plugins/BotSharp.Plugin.EmailHandler/Functions/HandleEmailReaderFn.cs ✨ Enhancement +4/-1
    Apply model upgrade to email handler LLM calls

13. src/Plugins/BotSharp.Plugin.GoogleAI/Providers/Realtime/RealTimeCompletionProvider.cs ✨ Enhancement +3/-1
    Use ISettingService for Google AI realtime models

14. src/Plugins/BotSharp.Plugin.ImageHandler/Helpers/AiResponseHelper.cs ✨ Enhancement +4/-1
    Apply model upgrade to image generation responses

15. src/Plugins/BotSharp.Plugin.OpenAI/Models/Realtime/RealtimeSessionBody.cs ✨ Enhancement +3/-1
    Replace hardcoded transcribe model with constant

16. src/Plugins/BotSharp.Plugin.OpenAI/Providers/Realtime/RealTimeCompletionProvider.cs ✨ Enhancement +10/-3
    Integrate ISettingService in OpenAI realtime provider

17. src/Plugins/BotSharp.Plugin.Planner/Sequential/SequentialPlanner.cs ✨ Enhancement +5/-1
    Use ISettingService for planner model selection

18. src/Plugins/BotSharp.Plugin.SqlDriver/Functions/SqlValidateFn.cs ✨ Enhancement +3/-2
    Apply model upgrade to SQL validation

19. src/Plugins/BotSharp.Plugin.SqlDriver/Services/DbKnowledgeService.cs ✨ Enhancement +3/-1
    Use ISettingService for database knowledge import

20. tests/BotSharp.LLM.Tests/Core/LLMProvider.cs ⚙️ Configuration changes +1/-1
    Update test model to GPT-5-mini

21. src/WebStarter/appsettings.json ⚙️ Configuration changes +1/-1
    Update default agent model to GPT-5-nano

22. tests/BotSharp.LLM.Tests/appsettings.json ⚙️ Configuration changes +1/-1
    Update test agent model to GPT-5-mini


@qodo-code-review
Contributor

qodo-code-review bot commented Mar 11, 2026

Code Review by Qodo

🐞 Bugs (2) 📘 Rule violations (3) 📎 Requirement gaps (0)



Action required

1. InputAudioTranscription null dereference 📘 Rule violation ⛯ Reliability
Description
realtimeModelSettings.InputAudioTranscription.Model is dereferenced and passed into
GetUpgradeModel without checking whether InputAudioTranscription (or its Model) is null. This
can throw at runtime when that settings section is missing/partial.
Code

src/Plugins/BotSharp.Plugin.OpenAI/Providers/Realtime/RealTimeCompletionProvider.cs[R377-381]

            sessionUpdate.session.InputAudioTranscription = new InputAudioTranscription
            {
-                Model = realtimeModelSettings.InputAudioTranscription.Model,
+                Model = settingService.GetUpgradeModel(realtimeModelSettings.InputAudioTranscription.Model),
                Language = realtimeModelSettings.InputAudioTranscription.Language,
Evidence
Compliance ID 3 requires guarding optional configuration objects before dereferencing. The modified
assignment directly dereferences realtimeModelSettings.InputAudioTranscription.Model and upgrades
it without any null/empty checks or fallback.

src/Plugins/BotSharp.Plugin.OpenAI/Providers/Realtime/RealTimeCompletionProvider.cs[377-381]
Best Practice: Learned patterns

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
The realtime session update dereferences `realtimeModelSettings.InputAudioTranscription` without null checks, then passes the model into `GetUpgradeModel`.

## Issue Context
Configuration sections for realtime input audio transcription can be absent or partially configured; the code should behave predictably (e.g., fall back to a known default model).

## Fix Focus Areas
- src/Plugins/BotSharp.Plugin.OpenAI/Providers/Realtime/RealTimeCompletionProvider.cs[377-381]

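The guard the finding asks for can be sketched as follows. This is not the PR's actual fix: the fallback constant name `Gpt4oTranscribe` is an assumption; the other identifiers follow the code shown in the finding.

```csharp
// Sketch of a guarded assignment (hypothetical; fallback constant name assumed).
var transcription = realtimeModelSettings.InputAudioTranscription;

// Only upgrade when the optional config section and its Model are present.
var upgraded = !string.IsNullOrEmpty(transcription?.Model)
    ? settingService.GetUpgradeModel(transcription!.Model)
    : null;

sessionUpdate.session.InputAudioTranscription = new InputAudioTranscription
{
    // Fall back to a known default if the section is missing or the upgrade returns nothing.
    Model = !string.IsNullOrEmpty(upgraded) ? upgraded : Gpt4xModelConstants.Gpt4oTranscribe,
    Language = transcription?.Language,
};
```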


2. Unknown default model 🐞 Bug ⛯ Reliability
Description
src/WebStarter/appsettings.json sets Agent:LlmConfig:Model to "gpt-5-nano", but the configured
OpenAI model list in the same file only includes gpt-4o-* and gpt-5 / gpt-5.1 / gpt-5.2 entries.
When a flow uses CompletionProvider.GetCompletion (e.g., InstructService.Execute),
ILlmProviderService.GetSetting can return null for this model and CompletionProvider dereferences
settings.Type, causing a NullReferenceException.
Code

src/WebStarter/appsettings.json[650]

+      "Model": "gpt-5-nano"
Evidence
The PR changes the default agent model to gpt-5-nano, but the OpenAI provider model list shown in
the same config does not define that model name; GetSetting returns null for unknown models and
GetCompletion immediately dereferences settings.Type. InstructService.Execute uses GetCompletion
with agent.LlmConfig, so the misconfiguration can crash summarization/instruction flows at runtime.

src/WebStarter/appsettings.json[643-651]
src/WebStarter/appsettings.json[151-200]
src/WebStarter/appsettings.json[480-567]
src/WebStarter/appsettings.json[568-568]
src/Infrastructure/BotSharp.Core/Infrastructures/LlmProviderService.cs[77-95]
src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[16-26]
src/Infrastructure/BotSharp.Core/Instructs/Services/InstructService.Execute.cs[244-251]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
`Agent:LlmConfig:Model` is set to `gpt-5-nano`, but the OpenAI provider’s configured model list does not define `gpt-5-nano`. When code paths call `CompletionProvider.GetCompletion`, `ILlmProviderService.GetSetting(provider, model)` can return `null` and `CompletionProvider` dereferences `settings.Type`, causing a runtime `NullReferenceException`.

### Issue Context
The default agent model in `WebStarter` config is used broadly across instruction/execution flows (e.g., `InstructService.Execute`). The provider-model registry (`LlmProviders`) must include any model used by `GetCompletion`.

### Fix Focus Areas
- src/WebStarter/appsettings.json[643-651]
- src/WebStarter/appsettings.json[151-568]
- src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[16-26]
- src/Infrastructure/BotSharp.Core/Infrastructures/LlmProviderService.cs[77-95]

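One remediation consistent with this finding is to register the new default model in the OpenAI provider's model list so `GetSetting` can resolve it. The exact entry schema below is an assumption sketched from the config paths the finding cites, not the project's verified format:

```json
"LlmProviders": [
  {
    "Provider": "openai",
    "Models": [
      { "Name": "gpt-5-nano", "Type": "chat" }
    ]
  }
]
```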


3. Broken test model config 🐞 Bug ✓ Correctness
Description
tests/BotSharp.LLM.Tests/appsettings.json sets Agent:LlmConfig Provider=azure-openai and
Model=gpt-5-mini, but the azure-openai provider models listed in the same config only include
gpt-35-* entries. Any test path that uses CompletionProvider.GetCompletion (via AgentSettings) can
fail model lookup and then crash due to dereferencing a null model setting.
Code

tests/BotSharp.LLM.Tests/appsettings.json[171]

+      "Model": "gpt-5-mini"
Evidence
The test configuration now points the default agent at an Azure OpenAI model name that is not
present in the configured azure-openai model list. Since LlmProviderService returns null for missing
models and CompletionProvider.GetCompletion dereferences settings.Type without a null-check, tests
that rely on AgentSettings defaults can fail/crash.

tests/BotSharp.LLM.Tests/appsettings.json[45-64]
tests/BotSharp.LLM.Tests/appsettings.json[164-172]
src/Infrastructure/BotSharp.Core/Infrastructures/LlmProviderService.cs[89-94]
src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[20-26]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

### Issue description
The test `Agent:LlmConfig` points to an azure-openai model (`gpt-5-mini`) that is not present in the configured azure-openai `LlmProviders` model list. This can break any path that relies on `ILlmProviderService.GetSetting()` and `CompletionProvider.GetCompletion()`.

### Issue Context
`LlmProviderService.GetSetting` returns `null` when the model name is missing, while `CompletionProvider.GetCompletion` dereferences `settings.Type` without checking for `null`.

### Fix Focus Areas
- tests/BotSharp.LLM.Tests/appsettings.json[45-64]
- tests/BotSharp.LLM.Tests/appsettings.json[164-172]
- src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[16-26]




Remediation recommended

4. GetUpgradeModel(_settings.Model) unguarded 📘 Rule violation ⛯ Reliability
Description
_settings.Model may be null/empty from configuration, but it is passed directly into
GetUpgradeModel without a guard, risking an exception or propagating an invalid model value. Add a
null/empty check and a safe fallback before/after upgrading.
Code

src/Infrastructure/BotSharp.Core.Realtime/Services/RealtimeHub.cs[R196-204]

+        var settingService = _services.GetRequiredService<ISettingService>();

        if (!string.IsNullOrEmpty(provider) && !string.IsNullOrEmpty(model))
        {
            return (provider, model);
        }

        provider = _settings.Provider;
-        model = _settings.Model;
+        model = settingService.GetUpgradeModel(_settings.Model);
Evidence
Compliance ID 3 requires null/empty guards at system boundaries. The change introduces a boundary
call (GetUpgradeModel) using _settings.Model without validating the input or ensuring a
non-empty output before later use.

src/Infrastructure/BotSharp.Core.Realtime/Services/RealtimeHub.cs[196-204]
Best Practice: Learned patterns

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`_settings.Model` can be null/empty at the configuration boundary, but the new code calls `GetUpgradeModel(_settings.Model)` without guarding inputs/outputs.

## Issue Context
This method is part of selecting provider/model for realtime hub connections; invalid model values can cause runtime failures or downstream provider errors.

## Fix Focus Areas
- src/Infrastructure/BotSharp.Core.Realtime/Services/RealtimeHub.cs[196-204]

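A minimal sketch of the guarded boundary call, using the names from the diff above; keeping the configured value as the fallback is one reasonable choice, not necessarily the maintainers' intent:

```csharp
// Sketch of a null/empty-guarded model upgrade at the configuration boundary.
provider = _settings.Provider;

var configured = _settings.Model;
var upgraded = !string.IsNullOrEmpty(configured)
    ? settingService.GetUpgradeModel(configured)
    : null;

// Keep the configured value when the upgrade returns nothing usable.
model = !string.IsNullOrEmpty(upgraded) ? upgraded : configured;
```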


5. SetModelName upgrade lacks fallback 📘 Rule violation ⛯ Reliability
Description
SetModelName is now fed by GetUpgradeModel(...) without validating the upgraded value, which can
propagate null/empty into the transcriber implementation. Add a safe fallback to a known-good model
string when upgrade returns null/empty.
Code

src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[R145-148]

+        var settingService = services.GetRequiredService<ISettingService>();
        var completions = services.GetServices<IAudioTranscription>();
        var completer = completions.FirstOrDefault(x => x.Provider == (provider ?? "openai"));
        if (completer == null)
Evidence
Compliance ID 3 requires safe fallbacks at boundaries. The updated code replaces a guaranteed
non-null literal with the output of GetUpgradeModel(...) and passes it directly to SetModelName
without ensuring it is non-empty.

src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[145-156]
Best Practice: Learned patterns

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution

## Issue description
`GetUpgradeModel(...)` output is passed into `SetModelName` without checking for null/empty, risking invalid model assignment.

## Issue Context
This is a system boundary into provider implementations (`IAudioTranscription`) where invalid configuration should not cause crashes.

## Fix Focus Areas
- src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[145-156]

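The safe-fallback pattern the finding describes could look like the sketch below; treating `"gpt-4o-transcribe"` as the known-good fallback is an assumption for illustration only:

```csharp
// Sketch: validate the upgraded name before it reaches the transcriber implementation.
var upgraded = settingService.GetUpgradeModel(model);

// Fall back to a known-good model string (assumed value) when upgrade yields null/empty.
completer.SetModelName(!string.IsNullOrEmpty(upgraded) ? upgraded : "gpt-4o-transcribe");
```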



Comment on lines 377 to 381

             sessionUpdate.session.InputAudioTranscription = new InputAudioTranscription
             {
-                Model = realtimeModelSettings.InputAudioTranscription.Model,
+                Model = settingService.GetUpgradeModel(realtimeModelSettings.InputAudioTranscription.Model),
                 Language = realtimeModelSettings.InputAudioTranscription.Language,

Action required

1. inputaudiotranscription null dereference 📘 Rule violation ⛯ Reliability

realtimeModelSettings.InputAudioTranscription.Model is dereferenced and passed into
GetUpgradeModel without checking whether InputAudioTranscription (or its Model) is null. This
can throw at runtime when that settings section is missing/partial.
Agent Prompt
## Issue description
The realtime session update dereferences `realtimeModelSettings.InputAudioTranscription` without null checks, then passes the model into `GetUpgradeModel`.

## Issue Context
Configuration sections for realtime input audio transcription can be absent or partially configured; the code should behave predictably (e.g., fall back to a known default model).

## Fix Focus Areas
- src/Plugins/BotSharp.Plugin.OpenAI/Providers/Realtime/RealTimeCompletionProvider.cs[377-381]


     "LlmConfig": {
       "Provider": "openai",
-      "Model": "gpt-4.1-nano"
+      "Model": "gpt-5-nano"

Action required

2. Unknown default model 🐞 Bug ⛯ Reliability

src/WebStarter/appsettings.json sets Agent:LlmConfig:Model to "gpt-5-nano", but the configured
OpenAI model list in the same file only includes gpt-4o-* and gpt-5 / gpt-5.1 / gpt-5.2 entries.
When a flow uses CompletionProvider.GetCompletion (e.g., InstructService.Execute),
ILlmProviderService.GetSetting can return null for this model and CompletionProvider dereferences
settings.Type, causing a NullReferenceException.
Agent Prompt
### Issue description
`Agent:LlmConfig:Model` is set to `gpt-5-nano`, but the OpenAI provider’s configured model list does not define `gpt-5-nano`. When code paths call `CompletionProvider.GetCompletion`, `ILlmProviderService.GetSetting(provider, model)` can return `null` and `CompletionProvider` dereferences `settings.Type`, causing a runtime `NullReferenceException`.

### Issue Context
The default agent model in `WebStarter` config is used broadly across instruction/execution flows (e.g., `InstructService.Execute`). The provider-model registry (`LlmProviders`) must include any model used by `GetCompletion`.

### Fix Focus Areas
- src/WebStarter/appsettings.json[643-651]
- src/WebStarter/appsettings.json[151-568]
- src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[16-26]
- src/Infrastructure/BotSharp.Core/Infrastructures/LlmProviderService.cs[77-95]


     "LlmConfig": {
       "Provider": "azure-openai",
-      "Model": "gpt-4o-mini"
+      "Model": "gpt-5-mini"

Action required

3. Broken test model config 🐞 Bug ✓ Correctness

tests/BotSharp.LLM.Tests/appsettings.json sets Agent:LlmConfig Provider=azure-openai and
Model=gpt-5-mini, but the azure-openai provider models listed in the same config only include
gpt-35-* entries. Any test path that uses CompletionProvider.GetCompletion (via AgentSettings) can
fail model lookup and then crash due to dereferencing a null model setting.
Agent Prompt
### Issue description
The test `Agent:LlmConfig` points to an azure-openai model (`gpt-5-mini`) that is not present in the configured azure-openai `LlmProviders` model list. This can break any path that relies on `ILlmProviderService.GetSetting()` and `CompletionProvider.GetCompletion()`.

### Issue Context
`LlmProviderService.GetSetting` returns `null` when the model name is missing, while `CompletionProvider.GetCompletion` dereferences `settings.Type` without checking for `null`.

### Fix Focus Areas
- tests/BotSharp.LLM.Tests/appsettings.json[45-64]
- tests/BotSharp.LLM.Tests/appsettings.json[164-172]
- src/Infrastructure/BotSharp.Core/Infrastructures/CompletionProvider.cs[16-26]


@JackJiang1234
Contributor

reviewed

yileicn merged commit f2577c8 into SciSharp:master on Mar 11, 2026
4 checks passed


3 participants