Built with

Every piece of mcnoaa-tides stands on the shoulders of open source projects built by people who decided to share their work. This page is a thank-you to all of them.

NOAA CO-OPS

The National Oceanic and Atmospheric Administration runs the Center for Operational Oceanographic Products and Services — about 300 tide stations along every U.S. coast, measuring water levels every six minutes. Wind, pressure, temperature too. All of it published as free, public data through a REST API.
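The API is a single query-string endpoint. As a sketch of what a request looks like, here is a URL built with nothing but the standard library — station 9414290 (San Francisco) is just an illustrative example, and the parameter names follow the public CO-OPS API documentation:

```python
from urllib.parse import urlencode

# NOAA CO-OPS public "datagetter" endpoint.
BASE = "https://api.tidesandcurrents.noaa.gov/api/prod/datagetter"

params = {
    "station": "9414290",      # tide station ID (San Francisco, as an example)
    "product": "predictions",  # tide predictions
    "datum": "MLLW",           # mean lower low water reference
    "date": "today",
    "time_zone": "lst_ldt",    # local station time
    "interval": "hilo",        # only the highs and lows
    "units": "english",
    "format": "json",
}

url = f"{BASE}?{urlencode(params)}"
print(url)
```

Swap the `product` parameter (`water_level`, `wind`, `air_pressure`, …) and the same endpoint serves the other data types.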

Without NOAA, there’s no mcnoaa-tides. Full stop.

These are the projects that make the MCP server run — the Python code that talks to NOAA’s API and translates it into something your assistant can use.

Model Context Protocol

MCP is the open standard that lets assistants use tools, read resources, and follow prompts from external servers. Think of it as a USB port for assistants — plug in a server, and the assistant gains new capabilities.

mcnoaa-tides is an MCP server. That’s how your assistant knows how to find tide stations, pull predictions, and generate charts.

FastMCP

FastMCP is the Python framework that makes building MCP servers feel like writing a regular Python app. Define a function, add type hints, and FastMCP handles the JSON-RPC transport, parameter validation, and error handling.

It’s the reason mcnoaa-tides can declare 14 tools, 4 prompts, and 3 resources in clean, readable code.

Python

Python — the language everything is written in. Specifically Python 3.12+, which brought cleaner generic type syntax and incremental interpreter speedups.

Pydantic

Pydantic validates data using Python type annotations. Every station record, tide prediction, and weather observation passes through a Pydantic model before it reaches your assistant. If the data is wrong, you find out immediately — not three tool calls later.
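A minimal sketch of that pattern — the field names mirror the shape of CO-OPS prediction records (`t` for time, `v` for value, `type` for high/low), but this model is illustrative, not mcnoaa-tides' actual schema:

```python
from pydantic import BaseModel, ValidationError

class TidePrediction(BaseModel):
    t: str       # timestamp, e.g. "2024-06-01 04:12"
    v: float     # water level in feet (coerced from the API's string)
    type: str    # "H" for high tide, "L" for low tide

# Good data: the string "5.31" is coerced to a float automatically.
good = TidePrediction.model_validate({"t": "2024-06-01 04:12", "v": "5.31", "type": "H"})
print(good.v)

# Bad data fails loudly, right at the boundary.
try:
    TidePrediction.model_validate({"t": "2024-06-01", "v": "not-a-number", "type": "H"})
except ValidationError as exc:
    print(f"caught bad data: {exc.error_count()} error(s)")
```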

httpx

httpx is an async HTTP client for Python. It’s what actually makes the network requests to NOAA’s API endpoints — fetching tide predictions, water levels, and meteorological data. Fast, well-tested, and supports connection pooling out of the box.

When you ask for a tide chart or conditions dashboard, one of these two libraries does the rendering. They’re optional — the server works fine without them, but the visuals are worth installing.

Matplotlib

Matplotlib renders the PNG charts — the ones that show up inline in your conversation. It draws the tide curves, marks the highs and lows, overlays observed water levels. The classic scientific plotting library, used everywhere from research papers to satellite missions.
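A toy version of that chart — a sine wave standing in for real predictions, with the high and low marked and the result rendered straight to PNG bytes:

```python
import io
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend: render to a buffer, no display needed
import matplotlib.pyplot as plt

# Fake a roughly semidiurnal tide (~12.4-hour period) over one day.
hours = np.linspace(0, 24, 200)
level = 3.0 + 2.5 * np.sin(2 * np.pi * hours / 12.4)

fig, ax = plt.subplots(figsize=(8, 3))
ax.plot(hours, level, color="teal")
ax.plot(hours[level.argmax()], level.max(), "^", color="navy", label="High")
ax.plot(hours[level.argmin()], level.min(), "v", color="maroon", label="Low")
ax.set_xlabel("Hour")
ax.set_ylabel("Water level (ft)")
ax.legend()

buf = io.BytesIO()
fig.savefig(buf, format="png")
print(len(buf.getvalue()))  # size of the rendered PNG in bytes
```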

Plotly

Plotly generates interactive HTML charts — the ones you can pan, zoom, and hover over for exact values. Charts save as standalone .html files you can open in any browser. Great for detailed analysis or sharing with someone who isn’t in a chat window.

You’re reading this on a site built with these tools. The same maritime-teal theme, the animated wave hero, the conversation-style guides — it’s all thanks to this stack.

Astro

Astro is the web framework that builds this entire documentation site. It ships zero JavaScript by default — pages are static HTML until you need interactivity. Fast to load, fast to build, and designed for content-heavy sites like this one.

Starlight

Starlight is Astro’s documentation theme. Sidebar navigation, search, dark/light mode, responsive layout, component library — Starlight provides all of it. The Diátaxis structure you see (Getting Started, How-To, Reference, Understanding) comes from Starlight’s conventions.

Lucide

Lucide is the open-source icon library used throughout the site. Every little icon you see in cards, navigation, and badges is a Lucide SVG — lightweight, consistent, and beautifully drawn. Community-maintained fork of Feather Icons with 1,500+ icons.

Inter & JetBrains Mono

Inter is the sans-serif typeface for body text — designed for screens, optimized for readability at small sizes. JetBrains Mono is the monospace font for code blocks — ligatures, clear symbol distinction, and easy on the eyes during long reading sessions. Both served locally via Fontsource.

The infrastructure that builds, packages, and serves everything.

Docker

Docker packages the server and docs site into containers — self-contained environments that run the same way everywhere. The production docs site is a multi-stage build: Node.js compiles the static HTML, then Caddy serves it. No Node.js in production, just a tiny Alpine-based web server.
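The shape of that multi-stage build, as a hypothetical sketch — stage names and paths here are illustrative, not the project's actual Dockerfile:

```dockerfile
# Stage 1: build the static site with Node.js
FROM node:20-alpine AS build
WORKDIR /site
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build          # Astro emits static HTML into dist/

# Stage 2: serve with Caddy — no Node.js in the final image
FROM caddy:2-alpine
COPY --from=build /site/dist /usr/share/caddy
```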

Caddy

Caddy is the web server that serves this site. It handles HTTPS automatically — gets certificates from Let’s Encrypt, renews them, and redirects HTTP to HTTPS. Zero configuration for TLS. The production Caddyfile is 9 lines long.
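A Caddyfile in that spirit — a hypothetical sketch, not the project's actual config; the bare domain on the first line is all Caddy needs to provision and renew certificates:

```caddyfile
docs.example.com {
    root * /usr/share/caddy
    encode gzip
    file_server
}
```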

uv

uv is a fast Python package manager from Astral. It installs dependencies, resolves versions, and runs scripts — like pip but dramatically faster. The uvx mcnoaa-tides command that launches the server without a permanent install? That’s uv.

PyPI

PyPI — the Python Package Index — is where mcnoaa-tides gets published. When you run uvx mcnoaa-tides or pip install mcnoaa-tides, PyPI is the registry that serves the package.

Not visible in the final product, but essential for keeping the code clean and the tests passing.

Ruff

Ruff is an extremely fast Python linter and formatter — also from Astral. It replaces flake8, isort, black, and a dozen other tools with one binary that runs in milliseconds. Keeps the codebase consistent without slowing anyone down.
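Configuration lives in pyproject.toml. A hypothetical fragment — the rule selection below is illustrative, not the project's actual settings:

```toml
[tool.ruff]
line-length = 100
target-version = "py312"

[tool.ruff.lint]
select = ["E", "F", "I"]  # pycodestyle, pyflakes, import sorting (replaces isort)
```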

pytest

pytest runs the test suite. Combined with pytest-asyncio for testing the async MCP tools, it validates that station discovery works, tide predictions parse correctly, and the visualization pipeline doesn’t break.
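A sketch of that async test pattern — the fetcher here is a stub for illustration, not mcnoaa-tides' real code. Under pytest with pytest-asyncio installed, the marker lets pytest run the coroutine; outside pytest, `asyncio.run` exercises it directly:

```python
import asyncio
import pytest

async def lookup_station(station_id: str) -> dict:
    await asyncio.sleep(0)  # stand-in for a real async httpx call to NOAA
    return {"id": station_id, "name": "San Francisco"}

@pytest.mark.asyncio
async def test_station_lookup():
    station = await lookup_station("9414290")
    assert station["id"] == "9414290"

# Outside a pytest run, the coroutine is still directly runnable:
asyncio.run(test_station_lookup())
print("ok")
```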

Git

Git — version control. Hosted on a Gitea instance. Every change tracked, every deployment traceable.


Open source isn’t just code. It’s people choosing to share solutions instead of hoarding them. Every project listed here represents thousands of hours of work by people who decided the world would be better if others could build on what they made.

If you use mcnoaa-tides and find it useful, consider contributing back to any of these projects — even a bug report, a documentation fix, or a thank-you in their issue tracker goes a long way.