Generative AI with Local LLMs

Protocol

A collection of 2 posts

Model Context Protocol (MCP): Connecting Local LLMs to Various Data Sources

Anthropic launched the Model Context Protocol (MCP) in November 2024, an open standard for data exchange between LLMs and various data sources. The protocol provides a simplified way for LLMs to integrate with tools and services to perform tasks such as searching files on local systems or accessing GitHub repositories…
17 Mar 2025 6 min read
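As a quick taste of the full post, here is a minimal sketch of what MCP traffic looks like on the wire. MCP messages follow JSON-RPC 2.0, and the `tools/list` and `tools/call` methods come from the protocol itself; the `search_files` tool name and its arguments are hypothetical placeholders for whatever a real server exposes.

```python
import json

def jsonrpc_request(req_id: int, method: str, params: dict | None = None) -> str:
    """Serialize a JSON-RPC 2.0 request, the message format MCP clients send to servers."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# 1) Ask the server which tools it exposes.
print(jsonrpc_request(1, "tools/list"))

# 2) Invoke one of those tools on behalf of the local LLM.
print(jsonrpc_request(2, "tools/call", {
    "name": "search_files",                      # hypothetical tool name
    "arguments": {"query": "quarterly report"},  # hypothetical arguments
}))
```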

Minion(s): A Simple Protocol for Communicating with Local and Cloud LLMs

Recently, HazyResearch introduced a simple communication protocol for integrating local and cloud LLMs. The core idea behind the protocol is to maximize the use of local LLMs on local data, minimizing cloud API costs while maintaining high-quality outputs. The protocol comes in two flavors: Minion, where a local LLM…
16 Mar 2025 18 min read
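As a rough illustration of that division of labor (a sketch of the idea, not the authors' implementation), the snippet below pairs a stub cloud model with a stub local model: only the local call ever sees the document, and the two sides exchange short messages until the cloud side declares a final answer. The `chat_local`/`chat_cloud` helpers and the `FINAL:` convention are assumptions made for this example, not part of the Minion(s) codebase.

```python
def chat_cloud(prompt: str) -> str:
    # Stub: a real implementation would call a cloud LLM API here.
    return "FINAL: (the cloud model's answer would go here)"

def chat_local(prompt: str) -> str:
    # Stub: a real implementation would call a local LLM runtime here.
    return "(the local model's answer, grounded in the document, would go here)"

def minion(question: str, local_document: str, max_turns: int = 3) -> str:
    """Run a short supervisor/worker exchange and return the final answer."""
    to_cloud = (
        f"Question: {question}\n"
        "You cannot see the document. Ask the local worker one short question, "
        "or reply 'FINAL: <answer>' once you can answer."
    )
    answer = ""
    for _ in range(max_turns):
        cloud_reply = chat_cloud(to_cloud)
        if cloud_reply.startswith("FINAL:"):
            return cloud_reply.removeprefix("FINAL:").strip()
        # Only the local model receives the (potentially very long) document.
        answer = chat_local(f"{cloud_reply}\n\nContext:\n{local_document}")
        to_cloud = f"The worker replied: {answer}"
    return answer  # best effort if the cloud model never finalized

print(minion("What does the contract say about termination?", "…local document text…"))
```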