Pydantic v2 shipped in June 2023 with a new core — same Python API, but the validation engine underneath was rewritten entirely in Rust. The result: 17x faster validation on the same data models, with zero changes to user code. Nobody had to learn Rust. Nobody had to change their imports. They ran pip install --upgrade pydantic and their app got 17x faster.
That's the pattern. And in 2026, it's everywhere.
The Inventory
Here's a partial list of Python tools that are now Rust under the hood:
| Tool | What It Does | What It Replaced | Speedup |
|---|---|---|---|
| uv | Package management | pip, Poetry, virtualenv, pyenv | 10-100x |
| ruff | Linting + formatting | flake8, black, isort | 10-100x |
| ty | Type checking | mypy, pyright | 10x+ (early benchmarks) |
| Polars | DataFrames | pandas | 5-9x |
| Pydantic v2 | Data validation | Pydantic v1 (Python) | 17x |
| orjson | JSON serialization | stdlib json | 10x dumps, 2x loads |
| tokenizers | NLP tokenization | Python tokenizers | 10-100x |
| tiktoken | BPE tokenization | Python BPE | Significantly faster |
| cryptography | Crypto primitives | Pure Python crypto | Memory-safe + fast |
| Granian | HTTP server | Gunicorn, Uvicorn | 2x throughput |
| Robyn | Web framework | Flask, FastAPI | Multi-core by default |
| candle | ML inference | PyTorch (inference) | Minimal binary, fast boot |
| mistral.rs | LLM inference | vLLM, text-gen | Pure Rust, flexible |
That's 13 tools across the Python ecosystem — package management, code quality, data processing, web serving, AI inference, and cryptography. Every single one written in Rust. Every single one faster than what it replaced.
Why Rust? Why Not C, C++, Go, or Zig?
This is the question I keep hearing. Python extensions have been written in C forever — NumPy, CPython itself, everything in the scientific stack. Why is Rust specifically winning?
Memory Safety Without a GC
C extensions are fast but dangerous. A buffer overflow in a C extension can corrupt Python's memory, crash the interpreter, or worse — create security vulnerabilities. The cryptography library switched its backend from C to Rust specifically because memory safety matters for crypto code. One use-after-free bug in a crypto library can compromise every application that uses it.
Rust's ownership system prevents these bugs at compile time. No garbage collector overhead (unlike Go or Java). No manual memory management (unlike C or C++). You get C-level speed with compile-time safety guarantees.
PyO3 Made It Trivial
PyO3 is the Rust library that provides bindings to the Python interpreter. Combined with maturin, a build tool for Rust-based Python packages, you can write Rust code and expose it to Python with minimal boilerplate.
Here's what a simple Rust-Python extension looks like:
```python
# This is what the Python user sees
import my_fast_lib

result = my_fast_lib.process(data)
# They never know it's Rust underneath
```
The Rust side uses #[pyfunction] and #[pymodule] macros from PyO3. Maturin handles compilation, wheel building, and publishing to PyPI. The end user runs pip install my_fast_lib and gets a binary wheel — no Rust compiler needed on their machine.
Three things make PyO3 the right choice in 2026: Rust's memory safety eliminates whole categories of bugs that plague Cython and C extensions, py.allow_threads() integrates naturally with Python 3.14's free-threaded build, and maturin makes distribution trivial.
True Parallelism
This one matters more in 2026 than ever. With Python 3.14's free-threaded build officially supported, Rust extensions can release the GIL and run truly parallel code across all CPU cores. PyO3's py.allow_threads() makes this a one-line operation.
Polars already does this — every operation parallelizes across cores by default. Ruff does it too — linting your codebase uses all available threads. The combination of Rust's fearless concurrency and Python's free-threaded mode means Rust extensions can finally use every core on your machine without the GIL getting in the way.
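To make that concrete, here's a minimal stdlib-only sketch (busy_sum is a hypothetical stand-in for a compute kernel, not anything from Polars or Ruff). On the standard GIL build these threads take turns; on the free-threaded build, or when the work happens inside a GIL-releasing Rust extension, they can genuinely occupy separate cores.

```python
# CPU-bound work fanned out across threads. With the GIL, the threads
# serialize; without it (free-threaded 3.14, or a Rust extension that
# releases the GIL), they run in parallel across cores.
from concurrent.futures import ThreadPoolExecutor

def busy_sum(n: int) -> int:
    # Pure-Python hot loop standing in for a compute kernel.
    return sum(i * i for i in range(n))

def parallel_sums(chunks: list[int]) -> list[int]:
    # Same code either way; only the interpreter build changes the scaling.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(busy_sum, chunks))
```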
Go Doesn't Fit
Go has a garbage collector, which creates pauses and unpredictable latency — fine for servers, bad for tight Python extension loops. Go's cgo bridge for calling C (and, by extension, Python) carries significant per-call overhead. And there's no PyO3 equivalent for Go that makes Python bindings ergonomic.
Zig Is Promising But Early
Zig is a compelling language for this use case — it compiles to C ABI, has no hidden allocations, and is designed for interop. But its ecosystem in 2026 is still early. No production-quality Python binding layer exists yet. Rust has PyO3, maturin, thousands of production deployments, and a mature standard library. Zig might be the future. Rust is the present.
The Pattern: 95/5 Rewrite
Here's what most "Rust vs Python" articles get wrong: nobody is rewriting entire Python applications in Rust. The pattern is surgical.
In a typical Python application, about 5% of the code is responsible for 95% of the execution time — the inner loops, the parsers, the serializers, the compute kernels. The Rust-Python pattern is:
- Profile to find the hot path
- Rewrite that 5% in Rust with PyO3
- Keep the other 95% in Python
- Ship as a normal pip install package
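Step 1 is the part people skip. Here's a minimal sketch of finding the hot path with the stdlib profiler (slow_parse is a hypothetical stand-in for whatever your real bottleneck turns out to be):

```python
# Profile first, rewrite second: let cProfile tell you which 5% is hot.
import cProfile
import io
import pstats

def slow_parse(lines):
    # Hypothetical hot loop: the candidate for a Rust rewrite.
    return [tuple(line.split(",")) for line in lines]

def main():
    data = [f"{i},{i * i}" for i in range(50_000)]
    return slow_parse(data)

profiler = cProfile.Profile()
profiler.enable()
result = main()
profiler.disable()

# Top 5 functions by cumulative time; slow_parse should dominate here.
stats = pstats.Stats(profiler, stream=io.StringIO())
stats.sort_stats("cumulative").print_stats(5)
```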
This is exactly what every tool in the inventory table did. Pydantic didn't rewrite its Python API — it rewrote the validation engine underneath. orjson didn't invent a new serialization interface — it kept the familiar dumps/loads shape (returning bytes, for speed) and reimplemented the logic in Rust. Ruff didn't create a new linting paradigm — it reimplemented flake8's rules in a faster engine.
The Python API stays. The Python ecosystem stays. The Python developer experience stays. Only the hot path moves to Rust.
This is fundamentally different from the "rewrite it in Rust" meme. Nobody is telling you to rewrite your Django app. They're saying: the tools your Django app depends on — the package manager, the linter, the JSON parser, the data validation layer — those are better in Rust. And you don't even need to know.
The AI Inference Layer
The newest frontier for Rust-in-Python is AI inference.
Hugging Face's candle is a minimalist ML framework written in Rust. It runs Transformer models, Whisper, LLaMA, and Mistral with binaries that are tiny and boot instantly. This matters for serverless deployments — a candle binary cold-starts in milliseconds where PyTorch takes seconds.
mistral.rs is a Rust-native LLM inference engine built on candle, supporting quantized models (GGUF, GPTQ, AWQ), vision models, and multiple model architectures. It provides both Rust and Python SDKs — so Python developers can use it without touching Rust.
OpenAI's tiktoken — the BPE tokenizer used in all GPT models — is written in Rust with Python bindings. Hugging Face's tokenizers library, used by every Transformers model, is the same — Rust core, Python interface.
The pattern extends to the serving layer. Granian, a Rust HTTP server for Python WSGI/ASGI apps, delivers 2x the throughput of Gunicorn and handles thousands of simultaneous requests across multiple cores. For AI API servers that need to handle high concurrency, this is the difference between needing 4 servers and needing 2.
The Astral Story (And What OpenAI's Acquisition Means)
No discussion of Rust-in-Python is complete without Astral.
Charlie Marsh founded Astral and released ruff in 2022. It became the fastest-growing Python tool in history. Then came uv in 2024 — and it grew even faster. By early 2026, uv had over 126 million monthly PyPI downloads and 82,600+ GitHub stars. Astral then announced ty, a Rust-based Python type checker.
Three tools. All Rust. All 10-100x faster than what they replaced. All configured through pyproject.toml. All installed via pip.
On March 19, 2026, OpenAI announced it would acquire Astral. The team joins OpenAI's Codex division. The reason? Every Codex session requires installing Python dependencies and running code. By replacing pip with uv, Codex saves approximately 1 million minutes of compute time every week.
The community reaction was overwhelmingly anxious. The Hacker News thread hit 757 points and 475 comments. The worry: OpenAI now controls the package manager and linter that most of the Python ecosystem depends on.
The tools are MIT-licensed. They can be forked. But forking Rust code requires Rust engineers — and that's the structural problem.
The Contribution Problem
This is the legitimate criticism of the Rust-in-Python trend, and I don't think proponents take it seriously enough.
Python's strength has always been accessibility. A data scientist can read CPython source code. They can contribute to pandas. They can write a flake8 plugin. The ecosystem is self-sustaining because the people who use the tools can also improve them.
When the tools move to Rust, that feedback loop breaks. The Python community is effectively locked out of contributing to the tools they depend on most. Filing issues? Yes. Suggesting features? Sure. But writing code? You need to know Rust.
Look at the numbers: the 2025 Stack Overflow survey shows Rust at 72% admiration — the most admired language for the tenth consecutive year. But admiration doesn't equal usage. Python has millions more active developers than Rust. The pool of people who can maintain a Python tool written in Rust is a tiny fraction of the people who use that tool.
This creates a bus-factor problem. If the small team maintaining a critical Rust-based Python tool burns out, gets acqui-hired, or pivots (sound familiar, Astral?), who picks it up? Not the Python community — they can't read the code.
Where Rust Doesn't Make Sense
Not everything should be rewritten in Rust. The 95/5 rule applies in reverse too — if a tool isn't in the hot path, Rust adds complexity without meaningful benefit.
| Category | Rust Makes Sense | Rust Doesn't Make Sense |
|---|---|---|
| Parsers and serializers | JSON, YAML, TOML, tokenization | Config file parsing (runs once) |
| Package management | Dependency resolution, downloads | Package metadata (rarely bottleneck) |
| Linting/formatting | AST traversal, rule matching | Rule definitions (logic, not speed) |
| Data processing | Inner loops, aggregations | Data cleaning scripts (one-off) |
| Web servers | HTTP handling, connection management | Route handlers (business logic) |
| AI inference | Model execution, tensor ops | Model config, prompt management |
| Cryptography | Hash functions, encryption | Key management APIs |
| CLI tools | File watching, process management | Argument parsing (runs once) |
The pattern is consistent: Rust wins when the operation runs millions of times per second or processes large amounts of data. It doesn't win when the bottleneck is I/O, network calls, or human-readable business logic.
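A back-of-envelope sketch makes the cutoff concrete (the numbers are illustrative assumptions, not measurements):

```python
# Why call frequency, not per-call cost, decides whether a Rust
# rewrite pays off. Both operations cost the same per call.
per_call_us = 1.0          # assume 1 microsecond per call
calls_hot = 10_000_000     # serializer inside a request loop
calls_cold = 1             # config parse at startup

hot_seconds = per_call_us * calls_hot / 1_000_000
cold_seconds = per_call_us * calls_cold / 1_000_000

print(f"hot path:  {hot_seconds:.1f} s")   # worth moving to Rust
print(f"cold path: {cold_seconds:.6f} s")  # leave it in Python
```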
Django should not be rewritten in Rust. Your Flask API handlers should not be rewritten in Rust. Your data pipeline orchestration logic should not be rewritten in Rust. But the JSON parser your Flask API uses? The DataFrame library your pipeline depends on? The tokenizer your ML model calls? Those should absolutely be Rust.
How to Use Rust-Powered Python (Without Learning Rust)
Here's the practical guide. You don't need to learn Rust to benefit from this trend.
Step 1: Replace the Dev Tools

```shell
# Replace pip with uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Replace flake8 + black + isort with ruff
uv add --dev ruff

# You're already using Rust-powered tools
```
Step 2: Swap the Hot-Path Libraries
```python
# Instead of json
import orjson  # 10x faster dumps

# Instead of pandas (for large data)
import polars as pl  # 5-9x faster, parallel by default

# Pydantic v2 — already Rust if you're using it
from pydantic import BaseModel  # 17x faster validation
```
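These swaps can be adopted incrementally. One caveat worth knowing: orjson.dumps returns bytes rather than str, so a thin wrapper (json_dumps is my naming for this sketch, not a library API) gives you a uniform interface with a stdlib fallback when the Rust wheel isn't installed:

```python
# Optional-dependency pattern: prefer Rust-backed orjson when available,
# fall back to stdlib json otherwise. orjson.dumps returns bytes, so we
# normalize to str for a uniform interface.
try:
    import orjson

    def json_dumps(obj) -> str:
        return orjson.dumps(obj).decode("utf-8")
except ImportError:
    import json

    def json_dumps(obj) -> str:
        return json.dumps(obj)
```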
Step 3: Consider Rust-Powered Serving
```shell
# Instead of gunicorn
pip install granian
granian --interface asgi myapp:app  # 2x throughput
```
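The application side stays ordinary Python, because Granian speaks standard ASGI. A minimal sketch (the myapp module name matches the command above; the handler body is illustrative):

```python
# myapp.py — a minimal ASGI application. Any ASGI server (Granian,
# Uvicorn, Hypercorn) can serve it; the server is a deployment choice.
async def app(scope, receive, send):
    # Only handle plain HTTP requests in this sketch.
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello from asgi"})
```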
Step 4: Evaluate for AI Workloads
```shell
# For LLM tokenization — already Rust
pip install tiktoken    # OpenAI tokenizer
pip install tokenizers  # Hugging Face tokenizer

# For LLM inference — if you need fast local inference
pip install mistralrs   # Rust-native, Python SDK
```
None of these steps require writing Rust. You're installing Python packages. The Rust is hidden behind pip install.
What I Actually Think
Rust eating Python's tooling is the best thing that's happened to Python in a decade. And I say that as someone who can barely write Rust.
Here's the thing: Python's value was never performance. Python's value is expressiveness, ecosystem, and accessibility. The language is glue — it connects ideas, libraries, and systems with minimal friction. The worst parts of Python were always the parts where you needed it to be fast: serialization, validation, data processing, package resolution.
Rust is filling exactly those gaps. Not by replacing Python the language, but by replacing the slow parts of the Python ecosystem. The result is that Python programs get faster without Python programs changing. That's the dream.
I'm less sanguine about the concentration risk. Astral's three tools (uv, ruff, ty) handle package management, code quality, and type checking — three of the four most important parts of the Python development workflow. Those tools now belong to OpenAI. Meanwhile, Polars is a VC-backed company with $25 million in funding. The open source safety net exists (MIT licenses), but the practical ability to fork is limited by the Rust expertise requirement.
My prediction: by 2028, the default Python experience will be Rust under the hood everywhere performance matters. uv for packages. ruff for quality. ty for types. Polars for data. orjson for serialization. Granian or similar for serving. The Python developer won't need to know any of this — they'll just notice things are faster.
The contribution barrier is real and concerning. But the performance benefits are undeniable. And honestly? If the choice is between slow tools that the community can contribute to and fast tools that the community can't contribute to, the market has already decided. Speed wins. The community adapted to NumPy being C code. They adapted to PyTorch being C++ code. They'll adapt to ruff being Rust code.
Rust isn't replacing Python. Rust is making Python better. Those are very different things.
Sources
- PyO3 v0.28 and maturin: Writing Python Extensions in Rust — Nandann
- PyO3/pyo3 — GitHub
- astral-sh/uv — GitHub
- astral-sh/ruff — GitHub
- astral-sh/ty — GitHub
- Ruff documentation
- Astral Python 2026: Ruff, uv, ty — w3resource
- Astral to join OpenAI
- OpenAI to acquire Astral
- Simon Willison — Thoughts on OpenAI acquiring Astral
- pola-rs/polars — GitHub
- DuckDB vs Polars vs Pandas: Benchmark — codecentric
- Python in 2026: Why I Replaced pip with uv — DEV Community
- orjson — GitHub
- Want 500% Faster JSON in Python? Try orjson — Medium
- Granian: Rust-Powered Python HTTP Server — Substack
- huggingface/candle — GitHub
- EricLBuehler/mistral.rs — GitHub
- 2025 Stack Overflow Developer Survey — Technology
- A year of uv: pros, cons, and should you migrate — Bite Code
- Polars Series A: $21M from Accel — TechCrunch
- Rust: Python's New Performance Engine — The New Stack
- Why Rust Is Winning for AI Tooling in 2026 — dasroot.net
- awesome-python-rs — GitHub
- Python 3.14 Free-Threading — Official Documentation