Integrate LLM Enhanced Commands Implementation for Query Generation#318
Conversation
Pull Request Overview
This PR integrates LLM-enhanced query generation functionality into the DocumentDB extension, enabling AI-powered MongoDB query creation from natural language prompts. The implementation leverages GitHub Copilot's language models to generate queries based on collection schemas and user requests.
Key Changes:
- Replaced mock AI query generation with real LLM integration using GitHub Copilot
- Added comprehensive infrastructure for schema inference and prompt template management
- Implemented MongoDB client APIs for gathering collection metadata to support LLM context
Reviewed Changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 3 comments.

Summary per file:
| File | Description |
|---|---|
| src/webviews/documentdb/collectionView/collectionViewRouter.ts | Replaces mock AI logic with actual LLM query generation, including telemetry and error handling |
| src/utils/schemaInference.ts | Implements schema inference from MongoDB documents, including type detection and vector array support |
| src/services/promptTemplateService.ts | Provides a template-loading service with caching and custom template file support |
| src/services/copilotService.ts | Wraps VS Code's language model API for GitHub Copilot interaction |
| src/documentdb/LlmEnhancedFeatureApis.ts | Implements MongoDB APIs for explain plans, index operations, and sample document retrieval |
| src/documentdb/ClustersClient.ts | Exposes LLM-enhanced feature APIs through the cluster client and adds metadata caching |
| src/commands/llmEnhancedCommands/queryGenerationCommands.ts | Orchestrates query generation from natural language using schema inference and LLM calls |
| src/commands/llmEnhancedCommands/promptTemplates.ts | Contains prompt templates for different query types and model configuration |
| package.json | Adds configuration settings for custom prompt templates and caching |
| l10n/bundle.l10n.json | Adds localized strings for new error messages and user-facing text |
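To make the schema-inference step in the table above concrete, here is a minimal sketch of how field types could be inferred from sample documents, including the vector-array detection mentioned for `src/utils/schemaInference.ts`. All names here (`inferType`, `inferSchema`) are illustrative assumptions, not the actual API of this PR.

```typescript
// Hypothetical sketch of schema inference from sample documents.
// Names and shapes are illustrative; the real implementation lives in
// src/utils/schemaInference.ts and may differ.

function inferType(value: unknown): string {
    if (value === null) return "null";
    if (Array.isArray(value)) {
        // Treat non-empty, uniformly numeric arrays as vectors (e.g. embeddings).
        const isVector = value.length > 0 && value.every((v) => typeof v === "number");
        return isVector ? "vector" : "array";
    }
    return typeof value;
}

// Collect the set of observed types per field name across all sampled documents.
function inferSchema(docs: Record<string, unknown>[]): Map<string, Set<string>> {
    const schema = new Map<string, Set<string>>();
    for (const doc of docs) {
        for (const [key, value] of Object.entries(doc)) {
            const types = schema.get(key) ?? new Set<string>();
            types.add(inferType(value));
            schema.set(key, types);
        }
    }
    return schema;
}
```

A summary of this map (field names plus observed types) is the kind of compact collection metadata that can be handed to the LLM as prompt context.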
- Let's pretty-print our queries when more than one field is in use.
- LLM config: I wonder why, for the prompt "find the top 10 stores, look at store features", it added a long list of project fields instead of just leaving the projection empty.
- We need to add cancel support.
- We need to add more trace messages to the output channel.
- Let's address the metadata topic before merging; otherwise we'll forget.

The other items can be scheduled for 0.6.1 and 0.6.2.
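The pretty-printing suggestion in the first bullet could be as simple as switching the indentation of the serialized query based on the number of fields. A minimal sketch, where `prettyPrintQuery` is a hypothetical helper and not part of this PR:

```typescript
// Illustrative only: keep single-field queries on one line,
// indent queries that use more than one field.
function prettyPrintQuery(query: Record<string, unknown>): string {
    const multiField = Object.keys(query).length > 1;
    // JSON.stringify with indent 0 produces the compact one-line form.
    return JSON.stringify(query, null, multiField ? 2 : 0);
}
```

For real MongoDB filter documents the implementation would also need to handle extended-JSON types (ObjectId, Date, etc.), which plain `JSON.stringify` does not.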
package.json
Outdated
```json
    "description": "The batch size to be used when querying working with the shell.",
    "default": 50
},
"documentDB.llm.findQueryPromptPath": {
```
Let's find a better name for this category than "LLM". @khelanmodi, this is about the settings the user can configure. How about "DocumentDB Copilot", "DocumentDB AI Assistant", or "AI Assistant"?
@xingfan-git FYI: I changed it to AiAssistant. Your test environment may break.
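With the rename to the AiAssistant category, the setting key in `package.json` would presumably change accordingly. A hedged sketch of what the renamed contribution could look like (the exact key name and description are assumptions, not confirmed by this PR):

```json
"documentDB.aiAssistant.findQueryPromptPath": {
    "type": "string",
    "description": "Path to a custom prompt template file for find-query generation.",
    "default": ""
}
```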
tnaum-ms
left a comment
Requested changes were implemented.
Follow-up tickets/issues were created.