graphiti-llm-config-drift-openai-proxy-not-gemini

Memory docs stated that Graphiti used Gemini 3 Pro as its LLM backend, but live `docker exec` inspection revealed it was actually configured with OpenAI + claude-haiku-4-5 via an internal proxy. This is a canonical example of why live state verification (Law #4) is mandatory: the documented config and the actual running config had diverged. The doc was corrected to record OpenAI-via-proxy as the actual configuration.
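
A minimal sketch of the verification step described above. The container name (`graphiti`) and the environment variable (`GRAPHITI_LLM_PROVIDER`) are assumptions for illustration, not confirmed names from the actual deployment; the point is comparing the documented value against what the running container reports rather than trusting the doc:

```shell
# Hedged sketch: check the live container's LLM config instead of trusting docs.
# "graphiti" and GRAPHITI_LLM_PROVIDER are hypothetical names, not verified.
DOCUMENTED="gemini-3-pro"   # what the memory doc claimed

if command -v docker >/dev/null 2>&1; then
  # Read the env var from the running container; fall back if it fails.
  LIVE=$(docker exec graphiti printenv GRAPHITI_LLM_PROVIDER 2>/dev/null || echo "unknown")
else
  LIVE="unknown"            # docker not available; live state cannot be verified
fi

if [ "$LIVE" != "$DOCUMENTED" ]; then
  echo "drift: docs say $DOCUMENTED, live container reports $LIVE"
fi
```

In this incident the live value would have come back as the OpenAI-via-proxy setup rather than Gemini, surfacing the drift immediately.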