Model Context Protocol Integration with Microsoft Semantic Kernel

The Model Context Protocol (MCP) aims to standardize connections between AI systems and data sources. This post demonstrates integrating mcp-playwright with Semantic Kernel and phi4-mini (via Ollama) for browser automation.
Setting up the Playwright MCP Server
- Install the MCP Playwright package:
npm install @playwright/mcp
- Add a script to package.json:
{
  "scripts": {
    "server": "npx @playwright/mcp --port 8931"
  }
}
- Start the server:
npm run server
This will launch the Playwright MCP server, displaying the port and endpoints in the console.
Running phi4-mini with Ollama for Function Calling
For reliable function calling, phi4-mini:latest (as of March 27, 2025) requires a custom Modelfile.
- Create a custom Modelfile (a rough sketch is shown after this list).
- Create the model in Ollama (the command below assumes the Modelfile is saved as ./Modelfile):
ollama create phi4-mini:latest -f ./Modelfile
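The full example Modelfile isn't reproduced here. As a rough, illustrative sketch only (the TEMPLATE body is a placeholder rather than phi-4-mini's actual chat and tool-call format, and the parameter choice is an assumption), a custom Modelfile has this general shape:
# Illustrative sketch of a custom Modelfile; adapt the TEMPLATE to phi-4-mini's real chat format
FROM phi4-mini:latest

# Assumption: a low temperature tends to make tool selection more deterministic
PARAMETER temperature 0

# The TEMPLATE must render the system prompt, the available tool definitions,
# and the model's tool-call markers in the format phi-4-mini expects.
TEMPLATE """
{{- /* render the system message, tool definitions, and conversation turns here */ -}}
"""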
Implementing the MCP Client in Semantic Kernel
- Install the MCP client NuGet package:
dotnet add package ModelContextProtocol --prerelease
- Connect to the Playwright MCP server and retrieve tools:
var mcpClient = await McpClientFactory.CreateAsync(
    new McpServerConfig
    {
        Id = "playwright",
        Name = "Playwright",
        TransportType = TransportTypes.Sse,
        Location = "http://localhost:8931"
    });

var tools = await mcpClient.ListToolsAsync();
- Configure Semantic Kernel with the MCP tools and invoke a prompt:
var kernelBuilder = Kernel.CreateBuilder();
kernelBuilder.AddOllamaChatCompletion(modelId: "phi4-mini");
kernelBuilder.Plugins.AddFromFunctions(
    pluginName: "playwright",
    functions: tools.Select(x => x.AsKernelFunction()));
var kernel = kernelBuilder.Build();

var executionSettings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(
        options: new() { RetainArgumentTypes = true }),
    ExtensionData = new Dictionary<string, object> { { "temperature", 0 } }
};

var result = await kernel.InvokePromptAsync(
    "open browser and navigate to https://www.google.com",
    new KernelArguments(executionSettings));
This code snippet connects to the MCP server, retrieves available tools, and integrates them into Semantic Kernel as functions. The prompt instructs the model to open a browser and navigate to Google, demonstrating the integration.
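In addition to the ModelContextProtocol package, the snippets above assume the Semantic Kernel core package and its Ollama connector (still prerelease) are installed:
dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.SemanticKernel.Connectors.Ollama --prerelease
As a quick sanity check, you can also print the tools the Playwright MCP server exposes and the final result. This is a minimal sketch; it assumes the tool objects returned by ListToolsAsync expose Name and Description properties, which may differ between preview versions of the SDK:
// List the Playwright tools retrieved from the MCP server (e.g. browser_navigate)
foreach (var tool in tools)
{
    Console.WriteLine($"{tool.Name}: {tool.Description}");
}

// Print the model's final answer after the tool calls complete
Console.WriteLine(result);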
Please feel free to reach out on Twitter: @roamingcode.