JetBrains has introduced Tracy, an AI tracing library for the Kotlin and Java languages.
Announced March 11 and accessible from GitHub, Tracy helps developers trace, monitor, and evaluate AI-powered features directly from their Kotlin or Java projects, JetBrains said. The open-source Kotlin library provides a unified API to capture structured traces and helps developers debug failures, measure execution time, and track large language model (LLM) usage across model calls, tool calls, and custom application logic.
Tracy implements the OpenTelemetry Generative AI Semantic Conventions for span attributes and event naming, thus ensuring traces remain compatible with any OpenTelemetry-compliant back end. JetBrains noted the following specific uses for Tracy:
- Tracing AI clients to capture messages, cost, token usage, and execution time.
- Tracing any function to record inputs, outputs, and execution duration.
- Creating and managing spans manually.
- Exporting traces to supported back ends (currently Langfuse and Weave).
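To make the idea concrete, the sketch below shows the kind of data such a span carries. The class and method names (`Span`, `traced`) are hypothetical illustrations, not Tracy's actual API; the attribute keys (`gen_ai.operation.name`, `gen_ai.request.model`, `gen_ai.usage.input_tokens`), however, come from the OpenTelemetry Generative AI Semantic Conventions that Tracy implements.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Supplier;

public class SpanSketch {
    // A minimal span: a name, semantic-convention attributes, and timing.
    static final class Span {
        final String name;
        final Map<String, Object> attributes = new LinkedHashMap<>();
        long startNanos;
        long endNanos;

        Span(String name) { this.name = name; }

        long durationNanos() { return endNanos - startNanos; }
    }

    // Trace any function: record its execution duration around the call.
    static <T> T traced(Span span, Supplier<T> body) {
        span.startNanos = System.nanoTime();
        try {
            return body.get();
        } finally {
            span.endNanos = System.nanoTime();
        }
    }

    public static void main(String[] args) {
        Span span = new Span("chat gpt-4o");
        // Attribute keys from the OpenTelemetry GenAI Semantic Conventions;
        // the values here are stand-ins for what a real model call records.
        span.attributes.put("gen_ai.operation.name", "chat");
        span.attributes.put("gen_ai.request.model", "gpt-4o");
        span.attributes.put("gen_ai.usage.input_tokens", 42);

        String reply = traced(span, () -> "stubbed model response");

        System.out.println(span.name + " took " + span.durationNanos() + " ns");
        System.out.println("reply: " + reply);
    }
}
```

In a real setup the span would be exported to an OpenTelemetry-compliant back end such as Langfuse or Weave rather than printed.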
Licensed under the Apache License 2.0, Tracy supports Kotlin 2.0.0 and later and Java 17 and later. It integrates with the OpenAI, Anthropic, and Gemini SDKs and works with common Kotlin LLM stacks, including OkHttp and Ktor clients, JetBrains said.