Dev Proxy tracks LLM use, estimates costs
Dev Proxy v0.28 introduces telemetry features that let developers track LLM usage and estimate costs for OpenAI and Azure OpenAI requests, enabling better monitoring of AI-related expenses. The new OpenAITelemetryPlugin records which models are used, how many tokens each request consumes, and the estimated cost per request. The plugin also integrates with Microsoft's Foundry Local, so developers can run local AI models to cut costs. The release further includes improvements for .NET Aspire users, expanded OpenAI payload support, and enhanced TypeSpec generation, and the Dev Proxy Toolkit for Visual Studio Code has been updated with new features.
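To illustrate, enabling the plugin in a `devproxyrc.json` might look roughly like the sketch below. This is an assumed example based on Dev Proxy's usual plugin-registration shape; the `pluginPath`, `urlsToWatch` values, and any plugin-specific config section are illustrative and may differ from the actual v0.28 schema:

```json
{
  "plugins": [
    {
      // Hypothetical registration of the telemetry plugin shipped in v0.28;
      // check the official docs for the exact plugin name and assembly path.
      "name": "OpenAITelemetryPlugin",
      "enabled": true,
      "pluginPath": "~appFolder/plugins/DevProxy.Plugins.dll",
      "urlsToWatch": [
        "https://api.openai.com/*"
      ]
    }
  ]
}
```

With a configuration like this, Dev Proxy would intercept matching OpenAI requests and report model usage, token counts, and cost estimates as the article describes.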