diff --git a/docs/get-started/tutorials/inference-ollama.mdx b/docs/get-started/tutorials/inference-ollama.mdx
index 4fcc018a7..af84aa0ba 100644
--- a/docs/get-started/tutorials/inference-ollama.mdx
+++ b/docs/get-started/tutorials/inference-ollama.mdx
@@ -206,6 +206,6 @@ openshell provider get ollama
 
 ## Next Steps
 
-- To learn more about managed inference, refer to [Index](/inference/about).
+- To learn more about managed inference, refer to [About Inference Routing](/inference/about).
 - To configure a different self-hosted backend, refer to [Configure](/inference/configure).
 - To explore more community sandboxes, refer to [Community Sandboxes](/sandboxes/community-sandboxes).
diff --git a/docs/get-started/tutorials/local-inference-lmstudio.mdx b/docs/get-started/tutorials/local-inference-lmstudio.mdx
index 976770d1f..c3d1f94a8 100644
--- a/docs/get-started/tutorials/local-inference-lmstudio.mdx
+++ b/docs/get-started/tutorials/local-inference-lmstudio.mdx
@@ -210,5 +210,5 @@ openshell provider get lmstudio-anthropic
 ## Next Steps
 
 - To learn more about using the LM Studio CLI, refer to [LM Studio docs](https://lmstudio.ai/docs/cli)
-- To learn more about managed inference, refer to [Index](/inference/about).
+- To learn more about managed inference, refer to [About Inference Routing](/inference/about).
 - To configure a different self-hosted backend, refer to [Configure](/inference/configure).
diff --git a/docs/inference/configure.mdx b/docs/inference/configure.mdx
index 6e4332905..62c70aac6 100644
--- a/docs/inference/configure.mdx
+++ b/docs/inference/configure.mdx
@@ -189,7 +189,7 @@ A successful response confirms the privacy router can reach the configured backe
 
 Explore related topics:
 
-- To understand the inference routing flow and supported API patterns, refer to [Index](/inference/about).
+- To understand the inference routing flow and supported API patterns, refer to [About Inference Routing](/inference/about).
 - To follow a complete Ollama-based local setup, refer to [Inference Ollama](/get-started/tutorials/inference-ollama).
 - To follow a complete LM Studio-based local setup, refer to [Local Inference Lmstudio](/get-started/tutorials/local-inference-lmstudio).
 - To control external endpoints, refer to [Policies](/sandboxes/policies).
diff --git a/docs/sandboxes/policies.mdx b/docs/sandboxes/policies.mdx
index 14876331c..95732912f 100644
--- a/docs/sandboxes/policies.mdx
+++ b/docs/sandboxes/policies.mdx
@@ -589,6 +589,6 @@ GraphQL field names are application-specific, so treat these as starting shapes
 
 Explore related topics:
 
-- To learn about network access rules and sandbox isolation layers, refer to [Index](/sandboxes/about).
+- To learn about network access rules and sandbox isolation layers, refer to [About Gateways and Sandboxes](/sandboxes/about).
 - To view the full field-by-field YAML definition, refer to the [Policy Schema Reference](/reference/policy-schema).
 - To review the default policy breakdown, refer to [Default Policy](/reference/default-policy).