Friday, 10 April 2026

Azure MCP - Building AI-Powered Tools with Model Context Protocol

 Introduction to Model Context Protocol (MCP)

Artificial Intelligence is rapidly transforming how we build software. As LLMs (Large Language Models) become more capable, developers need a standardized way to connect AI models to real-world tools, APIs, and data sources. This is exactly the problem that the Model Context Protocol (MCP) solves. 


MCP is an open protocol introduced by Anthropic that defines a standard interface between AI models and external tools. Think of it as a USB standard: just as USB allows any device to connect to any computer, MCP allows any AI model to connect to any tool or data source that implements the protocol.


Why MCP Matters

Before MCP, integrating AI with external tools required custom, one-off implementations for every combination of model and tool. This created a fragmented ecosystem where:
  • Each AI provider had its own proprietary function-calling format
  • Tool implementations couldn't be reused across different AI models
  • There was no standardized way to discover, authenticate, or invoke tools
  • Developers had to rebuild integrations every time they switched AI providers
 
MCP solves all of this by providing a universal, open standard that any AI model and any tool can implement.


MCP Core Concepts

Concept     | Description                                      | Example
------------|--------------------------------------------------|------------------------------------------
MCP Server  | Exposes tools and resources via the MCP protocol | Weather API, Math Engine, Database
MCP Client  | Connects to MCP servers and invokes tools        | Azure OpenAI integration, Claude Desktop
Tool        | A callable function exposed by the server        | GetWeather(), Add(), SearchDB()
Transport   | Communication channel between client and server  | stdio, HTTP/SSE, WebSocket
Resource    | Data exposed by the server for the AI context    | Files, database records, API responses


Azure and MCP: The Perfect Combination

Microsoft Azure provides an ideal cloud platform for hosting and scaling MCP servers. With services like Azure Container Apps, Azure API Management, and Azure OpenAI, you can build enterprise-grade AI tool ecosystems that are secure, scalable, and manageable.


Building an MCP Server in .NET

The MCP SDK for .NET makes it straightforward to build a production-ready MCP server. In this section, we walk through a complete implementation with a set of math tools.

Project Setup

Create a new ASP.NET Core Web API project and add the required NuGet packages:
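The MCP C# SDK is distributed as NuGet packages. At the time of writing the packages below are in preview and the exact names may change between versions, so verify them on nuget.org before installing:

```shell
# Core MCP SDK plus the ASP.NET Core integration (HTTP transport).
dotnet add package ModelContextProtocol --prerelease
dotnet add package ModelContextProtocol.AspNetCore --prerelease
```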





Defining MCP Tools

Tools are defined as C# methods decorated with MCP attributes. The SDK automatically discovers and registers them:

using System.ComponentModel;
using ModelContextProtocol.Server;

namespace AzureMcpServerPOC
{
    [McpServerToolType]
    public static class MathTools
    {
        [McpServerTool, Description("Adds two numbers together")]
        public static double Add(
            [Description("First number")] double a,
            [Description("Second number")] double b)
        {
            return a + b;
        }

        [McpServerTool, Description("Multiplies two numbers")]
        public static double Multiply(
            [Description("First number")] double a,
            [Description("Second number")] double b)
        {
            return a * b;
        }


        [McpServerTool, Description("Calculates the square root of a number")]
        public static double SquareRoot(
            [Description("The number to find square root of")] double number)
        {
            if (number < 0)
                throw new ArgumentException("Cannot calculate the square root of a negative number", nameof(number));
            return Math.Sqrt(number);
        }
    }
}
Configuring the HTTP Transport in Program.cs

var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddMcpServer()
    .WithHttpTransport(options =>
    {
        options.Stateless = true;
    })
    .WithToolsFromAssembly();

// Instead of WithToolsFromAssembly() you can also use .WithTools<MathTools>();

var app = builder.Build();

app.MapMcp("/mcp");

app.Run();

Run the MCP Server Locally with Postman

Run your project locally and note the port number, then use the same port in Postman.
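If you prefer the command line, the same request can be made with curl. The port below is an assumption: use whatever port your project prints at startup. MCP's streamable HTTP transport expects the client to accept both JSON and server-sent events:

```shell
# Port 5000 is a placeholder; substitute your app's actual port.
curl -X POST http://localhost:5000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -d '{ "jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {} }'
```

The tools/list call returns the registered tools with their descriptions and input schemas, which is a quick way to confirm the server discovered your MathTools class.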

Multiply is the method name used below; you can call Add the same way. Note that the tool name must be lower-case when calling it.

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "multiply",
    "arguments": { "a": 4, "b": 3 }
  }
}
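A successful call returns a JSON-RPC result; for tools/call the tool output comes back inside a content array. The exact envelope can vary slightly between SDK versions, but it looks roughly like this:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "content": [ { "type": "text", "text": "12" } ],
    "isError": false
  }
}
```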

Choosing the Right Azure Hosting Option

Option                | Best Scenario                        | Cost        | Complexity | Scale
----------------------|--------------------------------------|-------------|------------|-------------
Azure Container Apps  | Production, scalable MCP server      | Pay-per-use | Low        | Auto
Azure App Service     | Simple hosting, familiar PaaS        | Fixed tier  | Very Low   | Manual
Container Apps + APIM | Enterprise, multi-tenant, public API | Medium      | Medium     | Auto
AKS                   | Very large-scale, custom networking  | Higher      | High       | Full control


Why is Azure OpenAI needed to call an MCP Server?

The short answer is: Azure OpenAI is the "brain"; the MCP Server is the "hands".

The Core Problem MCP Solves

An MCP Server is just a collection of tools (functions).

It sits there waiting to be called.
It has no intelligence of its own. It cannot:

  • Understand natural language ("What's 144 squared?")
  • Decide which tool to call
  • Interpret what arguments to pass
  • Know when it has enough information to stop calling tools
This is where Azure OpenAI (GPT-4o) comes in.

You (natural language)
        │
        ▼
Azure OpenAI ──► "I need to call Add(a:10, b:20)"
        │
        ▼
MCP Server executes it and returns the result: 30
        │
        ▼
Azure OpenAI ◄── 30
        │
        ▼
"10 plus 20 equals 30" ──► You




The Three Roles Explained

Azure OpenAI (The Decision Maker)

  • Reads the user's natural language prompt
  • Reviews the list of available MCP tools and their descriptions
  • Decides which tool(s) to call and with what arguments
  • Interprets the tool results and forms a natural language response

MCP Server (The Executor)

  • Holds the actual business logic (math, weather, database, etc.)
  • Executes tools when called with specific inputs
  • Returns raw results — no interpretation, no language, just data

MCP Client (The Coordinator)

  • Sits between Azure OpenAI and the MCP Server
  • Fetches the tool list and converts it to OpenAI's format
  • Passes tool calls from OpenAI to the MCP Server
  • Returns results back to OpenAI to continue the conversation

Without Azure OpenAI, you'd have to manually figure out which tool to call, format the exact JSON arguments yourself, and know when to stop, essentially doing the AI's job yourself. 


Summary

MCP represents a significant shift in how AI models interact with the world. By standardizing the interface between models and tools, it enables a new generation of AI applications that are modular, reusable, and interoperable.

 

In this article, we covered:
  • The core concepts of MCP and why it matters for enterprise AI development
  • Building a .NET MCP server with math tools
  • Exposing the server via HTTP transport and calling it from Postman


Next, we will learn:
  • Connecting Azure OpenAI to your hosted MCP server

Thursday, 2 April 2026

Building a Natural Language to SQL Query API Using Azure OpenAI, Entity Framework, and .NET 8

Modern applications are rapidly evolving toward more intuitive user experiences. One powerful capability is allowing users to interact with systems using natural language instead of complex queries.

In this blog, we’ll build a backend API using .NET 8, Entity Framework Core, and Azure OpenAI that converts user prompts into SQL queries and executes them against a database.

Imagine asking:

“Get all customers who placed orders in the last 30 days.”
…and your system automatically converts that into SQL and returns results.

What is Azure OpenAI?

Azure OpenAI is a cloud service by Microsoft that provides access to powerful AI models like GPT in a secure and scalable environment.

It enables developers to:

  • Generate text and code
  • Build chat-based applications
  • Automate workflows
  • Create intelligent APIs (like our SQL generator)

Step 1: Create an Azure OpenAI Resource

In the Azure Portal:

  1. Click Create a resource
  2. Search for Azure OpenAI
  3. Fill in:
    • Resource group
    • Region (e.g., East US)
    • Name
  4. Click Create


Step 2: Deploy a Model

After resource creation:

  1. Go to Model deployments
  2. Click Create
  3. Choose model:
    • gpt-4o (recommended)
  4. Set deployment name (e.g., gpt-4o)
  5. Deploy

Step 3: Install Required Packages

dotnet add package Azure.AI.OpenAI
dotnet add package Microsoft.EntityFrameworkCore
dotnet add package Microsoft.EntityFrameworkCore.SqlServer

Step 4: Configure Azure OpenAI

{
  "AzureOpenAI": {
    "Endpoint": "https://your-resource.openai.azure.com/",
    "ApiKey": "your-api-key",
    "DeploymentName": "gpt-4o"
  }
}

Step 5: Create OpenAI Service

using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

public class OpenAiService
{
    private readonly ChatClient _chatClient;

    public OpenAiService(IConfiguration config)
    {
        var client = new AzureOpenAIClient(
            new Uri(config["AzureOpenAI:Endpoint"]),
            new AzureKeyCredential(config["AzureOpenAI:ApiKey"])
        );

        _chatClient = client.GetChatClient(config["AzureOpenAI:DeploymentName"]);
    }

    public async Task<string> GenerateSqlAsync(string prompt, string schema)
    {
        var systemPrompt = $@"
                             You are a SQL generator.
                             Only generate SELECT queries.
                             Do not modify data.

                           Schema:
                            {schema}";

        var response = await _chatClient.CompleteChatAsync(
            new ChatMessage[]
            {
                new SystemChatMessage(systemPrompt),
                new UserChatMessage(prompt)
            });

        return response.Value.Content[0].Text;
    }
}

Step 6: Provide Database Schema

AI needs context to generate accurate SQL:

var schema = @"
Table: Customers(Id, Name, Email)
Table: Orders(Id, CustomerId, OrderDate, TotalAmount)
";
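Given that schema, the prompt from the introduction ("Get all customers who placed orders in the last 30 days") might come back as SQL along these lines; treat this as an illustration, since the model's exact output will vary:

```sql
SELECT DISTINCT c.Id, c.Name, c.Email
FROM Customers c
JOIN Orders o ON o.CustomerId = c.Id
WHERE o.OrderDate >= DATEADD(DAY, -30, GETDATE());
```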

Step 7: Validate SQL (Critical Step)

Never execute AI-generated SQL directly.

using System.Linq;
using System.Text.RegularExpressions;

public bool IsSafeQuery(string sql)
{
    // Match on word boundaries so column names like "Updated" or "OrderItems"
    // don't trigger false positives.
    var forbidden = new[] { "INSERT", "UPDATE", "DELETE", "DROP", "ALTER", "TRUNCATE", "EXEC", "MERGE" };
    return forbidden.All(f => !Regex.IsMatch(sql, $@"\b{f}\b", RegexOptions.IgnoreCase));
}

Step 8: Execute Query

// EF Core cannot materialize query results into 'dynamic'; map the query to a
// concrete type whose properties match the selected columns (CustomerDto here
// is a hypothetical example).
var result = await context
    .Database
    .SqlQueryRaw<CustomerDto>(sql)
    .ToListAsync();


Step 9: Build API Endpoint

[HttpPost]
public async Task<IActionResult> ExecuteQuery([FromBody] string prompt)
{
    // Provide the schema as a string here, or build it from the EF model:
    var schemaTables = GetSchemaFromDb();
    var schema = string.Join("\n",
        schemaTables.Select(t => $"Table: {t.Key}({string.Join(", ", t.Value)})"));

    var sql = await _aiService.GenerateSqlAsync(prompt, schema);

    if (!_queryService.IsSafeQuery(sql))
        return BadRequest("Unsafe query");

    var result = await _queryService.ExecuteQueryAsync(sql);
    return Ok(new { sql, result });
}

private Dictionary<string, List<string>> GetSchemaFromDb()
{
    var schema = new Dictionary<string, List<string>>();

    foreach (var entity in _appDbContext.Model.GetEntityTypes())
    {
        var tableName = entity.GetTableName() ?? string.Empty;
        var columns = entity.GetProperties().Select(p => p.Name).ToList();
        schema[tableName] = columns;
    }

    return schema;
}

Conclusion

By combining .NET 8, Entity Framework Core, and Azure OpenAI, you can build intelligent APIs that translate human language into actionable database queries.

However, with great power comes responsibility: always validate and secure AI-generated output before execution.

Tuesday, 10 March 2026

Push Docker Image to Create Azure Container App

Push your Docker image to Azure Container Registry (ACR), or use a public image from Docker Hub directly.

Step 1: Create Azure Container App

  1. In Azure Portal → Create a Resource → Container Apps.
  2. Fill in: Name, Resource Group, Region
  3. Environment: Create or select an existing one
  4. Under Container, select Single container → Azure Container Registry/Docker Image.
  5. Choose your registry, repository, and tag (e.g., latest).
  6. Set CPU/Memory limits (e.g., 0.5 CPU, 1 GB RAM).
  7. Configure Ingress:
    • Enable External if you want public access.
    • Set the port (e.g., 8000) to match your Dockerfile.
  8. Review + Create
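The portal steps above can also be scripted with the Azure CLI (all names below are placeholders; the command requires the containerapp CLI extension):

```shell
az containerapp create \
  --name my-mcp-app \
  --resource-group my-rg \
  --environment my-env \
  --image myregistry.azurecr.io/my-image:latest \
  --cpu 0.5 --memory 1Gi \
  --ingress external \
  --target-port 8000
```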

You can change the same settings in the ingress as well.



You can update your image type and select "Image and Tag" and specify your image here.

You can update your variable under the container.

Click on Overview to find the application URL.

Monday, 9 March 2026

404 when you refresh React Page on Static Web App

This happens because React uses client-side routing, while Azure Static Web Apps serves static files. When you refresh /login, Azure tries to find a physical /login file, which doesn’t exist, so it returns 404.

This is common with apps using React Router.


Step 1: Add a staticwebapp.config.json file to your project

You need to tell Azure to redirect all routes to index.html so React can handle routing.

Create this file in your build output folder or project root:

staticwebapp.config.json

Step 2: Add this configuration


{
  "navigationFallback": {
    "rewrite": "/index.html",
    "exclude": ["/images/*", "/css/*", "/js/*", "/assets/*"]
  }
}

or 

{
  "navigationFallback": {
    "rewrite": "/index.html"
  }
}

Then rebuild and deploy again.

swa deploy ./dist --deployment-token YourTokenHere --env production

After Deployment

Now these will work on refresh:

/login

/dashboard

/profile

/settings

Deploy Angular/React Application to Azure Static Web App from Visual Studio Code

Step 1: Build your React project

Open the terminal in VS Code and run:

npm install
npm run build

Step 2: Get Deployment Token

  • Go to Microsoft Azure Portal.
  • Open your Azure Static Web App
  • Click Deployment Token
  • Copy the token

Step 3: Install Azure Static Web Apps CLI

Install the CLI tool:

npm install -g @azure/static-web-apps-cli

This installs SWA CLI.

Step 4: Deploy the build folder

Run this command inside your project folder.

swa deploy ./dist --deployment-token ABC123XYZ

Your URL changed because the deployment was created as a preview environment instead of production.

 If you want to deploy on a real URL, then run the command.

swa deploy ./dist --deployment-token TokenHere --env production

Your site will be available at:

https://your-app-name.azurestaticapps.net

Wednesday, 31 December 2025

Azure Function - In-process model vs Isolated worker model

1. In-process model

  • Your function code runs inside the same process as the Azure Functions runtime
  • Uses the WebJobs SDK.

[FunctionName("TimetriggerFunction")]

What this means

  • Used in:

    • Azure Functions v1–v4 (in-process)

  • Namespace: Microsoft.Azure.WebJobs

  • Runs inside the same process as the Azure Functions runtime

  • Uses:

    • ILogger for logging

    • TimerInfo from Microsoft.Azure.WebJobs

Characteristics

  • Tight coupling to the Functions runtime

  • Faster startup (historically)

  • More magic / implicit behavior

  • Not recommended for new apps going forward


2. Isolated worker model

  • Your function runs in a separate .NET process from the Azure Functions runtime.
  • Communicates with the runtime over gRPC.

[Function("TimetriggerFunction")]

What this means

  • Used in:

    • Azure Functions v4+ isolated worker

  • Namespace: Microsoft.Azure.Functions.Worker

  • Runs in a separate .NET process from the runtime

  • Uses:

    • FunctionContext instead of ILogger

    • Logging via context.GetLogger(...)


Characteristics

  • Decoupled from the Functions runtime

  • Better:

    • Dependency injection

    • Versioning control

    • Middleware support

  • Required for:

    • .NET 8

    • Future Azure Functions development
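To make the contrast concrete, a minimal isolated-worker timer function looks roughly like this (a sketch; the schedule and names are illustrative):

```csharp
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class TimetriggerFunction
{
    // NCRONTAB schedule: runs every 5 minutes.
    [Function("TimetriggerFunction")]
    public void Run([TimerTrigger("0 */5 * * * *")] TimerInfo timerInfo,
        FunctionContext context)
    {
        // Logging goes through the FunctionContext, not an injected ILogger.
        var logger = context.GetLogger("TimetriggerFunction");
        logger.LogInformation("Timer fired at: {time}", DateTime.UtcNow);
    }
}
```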

Key Differences at a Glance

Feature                  | [FunctionName]  | [Function]
-------------------------|-----------------|-----------------
Hosting model            | In-process      | Isolated worker
Runtime coupling         | Tight           | Loose
Logging                  | ILogger         | FunctionContext
Namespace                | WebJobs         | Functions.Worker
.NET 8 support           | No              | Yes
Recommended for new apps | No              | Yes


Summary

In-process = older, tightly coupled, simpler
Isolated worker = modern, decoupled, flexible, future-proof

Wednesday, 2 July 2025

How to get PowerBI Embed Token

 Here's a step-by-step guide to help you through the process.

Step 1: Register Your App in Azure

  1. Go to Azure Portal → App registrations
  2. Register a new app
  3. Provide the necessary API permissions
  4. Go to Certificates & secrets → generate a Client Secret

Step 2: Assign the App to the Power BI Workspace

1. Go to Power BI Service
2. Click Workspace access
3. Add the App's Service Principal (the App Registration's Client ID) as an Admin or Member

Step 3: Get Access Token 

Request:

 Method: POST
 URL: https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token

 

Headers:
Content-Type: application/x-www-form-urlencoded 

Body: (x-www-form-urlencoded):

grant_type=client_credentials
client_id={your-client-id}
client_secret={your-client-secret}
scope=https://analysis.windows.net/powerbi/api/.default
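Put together as a single curl request (the placeholders are kept from the steps above):

```shell
curl -X POST "https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  --data-urlencode "grant_type=client_credentials" \
  --data-urlencode "client_id={your-client-id}" \
  --data-urlencode "client_secret={your-client-secret}" \
  --data-urlencode "scope=https://analysis.windows.net/powerbi/api/.default"
```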

The endpoint will respond with an access token.

Step 4: Generate the Embed Token


URL: Post Endpoint 

https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/GenerateToken


Headers

Authorization: Bearer <access_token>
Content-Type: application/json

Body

{
  "accessLevel": "View"
}
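As a curl call (substitute the IDs and the access token from Step 3):

```shell
curl -X POST "https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/GenerateToken" \
  -H "Authorization: Bearer <access_token>" \
  -H "Content-Type: application/json" \
  -d '{ "accessLevel": "View" }'
```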

Sample Code to embed the PowerBI Report in an HTML Page



<script src="https://cdn.jsdelivr.net/npm/powerbi-client@2.21.1/dist/powerbi.min.js"></script>

<style>
    #reportContainer {
      width: 100%;
      height: 800px;
      border: 1px solid #ccc;
    }
  </style>

<div id="reportContainer"></div>

<script>
  
    const embedToken = "Call the APIs and pass the token here";
    
    const embedUrl = "https://app.powerbi.com/reportEmbed?reportId=report id here&groupId=group id here";

    const reportId = "your report Id here";

    // Embed configuration
    const config = {
        type: 'report',
        id: reportId,
        embedUrl: embedUrl,
        accessToken: embedToken,
        tokenType: window['powerbi-client'].models.TokenType.Embed,
        settings: {
            filterPaneEnabled: false,
            navContentPaneEnabled: true
        }
    };

    const reportContainer = document.getElementById('reportContainer');

    // Embed the report
    const powerbi = new window['powerbi-client'].service.Service(
        window['powerbi-client'].factories.hpmFactory,
        window['powerbi-client'].factories.wpmpFactory,
        window['powerbi-client'].factories.routerFactory
    );

    powerbi.embed(reportContainer, config);
</script>

