The Smallest Search Protocol
The Wikipedia corpus was embedded by Cohere embed-v3 (1024D). Your query is embedded by MiniLM-L6-v2 (384D). Two neural networks that have never met — different architectures, different training data, different dimensions — agreeing on what words mean, in 82 dimensions.
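The post doesn't say how a 384-dimensional query vector gets compared to 1024-dimensional document vectors, but one common way to reconcile mismatched embedding spaces is a pair of learned linear projections into a small shared space. The sketch below is an assumption, not the documented mechanism: the 82-dimensional shared space matches the number quoted above, and the random matrices stand in for trained projection weights.

```python
import math
import random

# Hypothetical sketch: two projection matrices map each model's native
# space into one shared 82-d space where cosine similarity is meaningful.
# Random weights here stand in for learned ones.
SHARED = 82
rng = random.Random(0)

def random_matrix(rows, cols):
    return [[rng.gauss(0, 1) for _ in range(cols)] for _ in range(rows)]

W_query = random_matrix(SHARED, 384)   # MiniLM space  -> shared space
W_doc = random_matrix(SHARED, 1024)    # Cohere space  -> shared space

def project(W, vec):
    # Plain matrix-vector product: one shared-space coordinate per row.
    return [sum(w * x for w, x in zip(row, vec)) for row in W]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

q = project(W_query, [rng.gauss(0, 1) for _ in range(384)])
d = project(W_doc, [rng.gauss(0, 1) for _ in range(1024)])
score = cosine(q, d)  # both vectors now live in the same 82-d space
```

Once both sides land in the same space, the corpus never needs re-embedding when the query model changes; only the query-side projection does.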
That's Managed Query. You don't run the index. You don't store 87GB of embeddings. You don't manage a GPU. You send 328 bytes and get answers from the entire English Wikipedia.
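Where does 328 come from? 82 float32 components at 4 bytes each is exactly 328 bytes, which suggests the request body is just the projected query vector. A sketch under that assumption; the packing format is illustrative, not a documented wire protocol:

```python
import struct

# Assumed payload layout: the 82-d query vector as little-endian float32s.
# 82 components * 4 bytes = 328 bytes, matching the figure in the text.
SHARED_DIM = 82

def pack_query(vec):
    assert len(vec) == SHARED_DIM
    return struct.pack(f"<{SHARED_DIM}f", *vec)

payload = pack_query([0.0] * SHARED_DIM)
print(len(payload))  # 328 -- the entire request body
```

Everything heavy stays on the server: the 87GB of document embeddings, the index, the GPU. The client ships one small vector and reads back results.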