Upload from any agent
Publish traces from 10+ supported agents directly in the CLI.
Share and collaborate on your coding agent sessions. Made for teams, and free to get started.
Download the CLI to get started
Works with your favourite agents


Start sharing traces from any agent, in minutes.
Publish traces from 10+ supported agents directly in the CLI.
Copy the link & share your trace with your team or the world.
Download a full trace or continue working on someone else's.
Download a trace, or continue it in Claude Code. The open agent menu lists OpenCode, Pi, Claude Code, Codex, Cursor, Gemini CLI, Amp, Cline, OpenClaw, GitHub Copilot, Hermes, with Claude Code selected.
Traces for Teams
Traces gives teams one place to see work in progress, share context on finished work, and see which agents people are using.
Create your Team

“GitHub increasingly doesn't feel like the best place to understand the work done on a codebase. Agent traces provide a much more human-readable overview. Just started using traces.com. Feels quite nice.”
Millin Gabani, CEO of Workers
See original post
Share agent conversations without worrying about sensitive data.
Share your traces privately, directly, or publicly, so only the right people see them.
maujim · 102 messages
Diagnose and Enhance Voltage Error Reporting
Set team-level policies to control how your team can share traces.
Traces in Workers can be published as:
We automatically strip sensitive data like API keys, emails & database credentials from traces on publish.
Shared the rollout trace after scrubbing [REDACTED], [REDACTED], and [REDACTED] from the assistant reply before sending the link to the team.
The published run keeps the reasoning intact while replacing keys, customer emails, and database URLs with clear [REDACTED] markers anyone can spot immediately.
Reviewers still understand what happened, but the sensitive values stay hidden behind [REDACTED] in every shared view.
Start simple & integrate more as you go.
Publish & manage traces directly from your terminal.
Let your agent share traces as you work.
Publish traces via API with your own tools, schedulers & workflows.
Automatically share traces on every commit from your CI/CD pipelines.
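The CI/CD path above can be sketched as a single pipeline step. A minimal sketch, not an official integration: only the `traces share --trace-id … --json` invocation is taken from the terminal example shown on this page; the `TRACE_ID` environment variable and the surrounding wiring are assumptions.

```shell
# Illustrative CI step (hypothetical wiring): publish the session trace
# after a successful build. Only the `traces share` invocation mirrors
# the terminal example on this page; TRACE_ID is an assumed variable
# your pipeline would need to provide.
set -eu
traces share --trace-id "$TRACE_ID" --json
```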
Free for individuals & small teams, flexible at scale.
Core
Free
Custom
Get in touch
See what ships, loop in the right agents, and share traces with git hooks & custom skills.
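A git hook along these lines could look like the following. A minimal sketch, assuming a `.git/hooks/post-commit` file; `TRACE_ID` is a hypothetical placeholder for the current session's trace id, and the CLI call mirrors the `traces share --trace-id … --json` example shown elsewhere on this page.

```shell
#!/usr/bin/env sh
# .git/hooks/post-commit (illustrative sketch): publish the session trace
# after each commit, if the Traces CLI is installed. TRACE_ID is an
# assumed placeholder; only the `traces share` invocation comes from
# this page's own terminal example.
if command -v traces >/dev/null 2>&1 && [ -n "${TRACE_ID:-}" ]; then
  traces share --trace-id "$TRACE_ID" --json
fi
```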
Team analytics preview showing Claude Code, Codex, Cursor, Gemini CLI, and Amp as the top agents, with an average session length of 47 minutes and 82.0 percent AI output.
A GitHub-style pull request timeline shows Maya Chen committing a docs update, a Traces bot comment with pull request trace links, two preview deployments, and a pull request mention.
Traces found for this PR:
Team member list showing Maya Chen as admin, Theo Brooks as member, Alice as the invited agent, Lina Park as member, and Ari Singh as member.

A terminal conversation showing a request to share a trace, the Traces skill that ran, and the private share link confirmation.
Great, I added back the function.
Share this trace
Skill Traces
Ran traces share --trace-id w9g0svsjx839ngt --json
Shared your trace to https://traces.com/s/w9g0svsjx839ngt as private.
Supported agents include Claude Code, Cursor, OpenCode, Codex, Gemini CLI, Pi, Amp, Cline, OpenClaw, GitHub Copilot, Hermes.
Install the CLI to get started or make an account first.
Download the CLI to get started
or Create an account
Create your account
Browse traces & learn how people are using agents. Contribute by sharing your own traces.
Browse public traces
The user wanted to know if the policy document mentioned anything about open-source or openness in AI. The document did not include any references to open-source concepts but generally emphasized removing barriers to innovation and making federal datasets accessible. The user received confirmation that open-source AI development was not specifically addressed in the framework.
The user wanted to fix their Reachy Mini Mac app crashing due to motor detection issues. The assistant resolved the software crash related to Python but found the motor bus was unresponsive, indicating a hardware communication problem. Despite multiple scans and attempts to reflash the motor, no motors were detected, and the assistant provided a detailed support ticket for hardware-level troubleshooting.
The user wanted to find out if their conversation history is saved on their device. The assistant confirmed that the conversations are indeed stored locally and provided details about the number of sessions available. The user’s past conversation transcripts are accessible directly from their computer, not stored online.
The user wanted to know the most successful collaborations in Hugging Face's history. The assistant explained key partnerships, highlighting the BigScience project that involved a large international team creating a major multilingual language model. The user then asked for the command to run a related model, and the assistant provided instructions to launch and onboard the model server locally.
The user wanted a deep, fundamental understanding of a text rendering project by analyzing its entire codebase and history from first principles. The investigation revealed that the project is not a traditional text renderer but a system that separates text measurement from layout, enabling efficient and flexible text handling independent of browser layout. This approach unlocks advanced possibilities like complex text flow and composition without relying on browser quirks.
The user wanted to switch the AI model they were using to a specific version called Qwen/Qwen3.5-9B on Hugging Face. The assistant updated the configuration to reflect this change and informed the user that they need to restart the session for it to take effect.
The user wanted to understand what Hugging Face is. The assistant explained that Hugging Face is a company and platform focused on machine learning and AI models, especially for language processing. It highlighted their popular tools and the large collection of shared models available to developers and researchers.
The user wanted to get details about a specific tweet discussing the need for a public, free repository of coding agent sessions. They then asked if their current project aligns with the tweet's request. The assistant confirmed that the project exactly matches the tweet's vision and later shared the conversation trace as requested.