Breaking it down into concrete work items:
CLI changes (needed first)
1. traces active --dir <path> command -- query SQLite for traces whose directory matches the given path, return their external IDs
2. traces sync <externalId> command -- read the trace from the agent datastore on disk, upload it to the API, return quickly (already partially exists as the upload flow, but needs to work as a standalone command)
3. traces import <externalId|file> command -- fetch a trace by external ID from the API, or load it from a local JSON file, and store it in the local SQLite database
4. Git-linked trace discovery on startup -- if in a git repo, read refs/notes/traces from recent refs, fetch any unknown trace IDs from the API, show them in the list
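A rough sketch of how items 1-3 compose on the command line. The subcommand names come from the plan above; the stub function stands in for the not-yet-built binary so the flow is runnable, and its output shapes (one ID per line, the status messages) are assumptions:

```shell
# Stub standing in for the planned `traces` CLI. The real commands would
# query SQLite and talk to the API; the stub just echoes plausible output.
traces() {
  case "$1" in
    active) echo tr_abc123 ;;        # external IDs, one per line (assumed)
    sync)   echo "synced $2" ;;
    import) echo "imported $2" ;;
  esac
}

ids=$(traces active --dir .)   # which traces were recorded for this directory?
for id in $ids; do
  traces sync "$id"            # upload each one to the API
done
traces import tr_abc123        # later, on another machine, pull one back down
```

The point of the split is that `active` is a pure local query, `sync` is the only command that needs network access, and `import` is its inverse.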
API changes
5. GET /v1/traces/:externalId/export -- return full trace as JSON
6. GET /v1/traces/:externalId/export?format=md -- return trace as markdown
7. Visibility: make sure the existing GET /v1/traces/:externalId works for traces the requester didn't create (respecting visibility rules) -- may already work, needs checking
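For items 5-6, consumers would just fetch a URL; a minimal sketch of the URL shapes, where the base host, token, and external ID are all placeholders:

```shell
base="https://api.example.com"   # placeholder: real API host is not decided here
id="tr_abc123"                   # placeholder external ID

echo "$base/v1/traces/$id/export"            # item 5: full trace as JSON
echo "$base/v1/traces/$id/export?format=md"  # item 6: same trace as markdown

# A client would fetch with something like:
#   curl -H "Authorization: Bearer $TOKEN" "$base/v1/traces/$id/export" -o trace.json
```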
Git hook
8. Post-commit hook script -- calls traces active --dir ., writes the returned IDs as a git note (refs/notes/traces) on the new commit, kicks off a background sync
9. Notes push -- git has no post-push hook, so push refs/notes/traces to the remote from the pre-push hook (or via a configured push refspec)
10. Notes fetch -- git has no post-fetch hook either; configure a fetch refspec (+refs/notes/traces:refs/notes/traces) so notes come down with every git fetch
11. Installer -- sets core.hooksPath, configures refspecs
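Item 8 in runnable form: a minimal sketch of the hook's body, exercised against a throwaway repo. The `traces` stub stands in for the planned CLI (its output is an assumption); the git-notes plumbing is real:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev

traces() {                 # stub: the real CLI would query SQLite / the API
  case "$1" in
    active) echo tr_abc123 ;;
    sync)   : ;;           # the real CLI would upload the trace here
  esac
}

echo hello > file.txt
git add file.txt
git commit -qm "initial commit"

# --- what the post-commit hook would run ---
ids=$(traces active --dir .)
[ -n "$ids" ] && git notes --ref=traces add -f -m "$ids" HEAD
for id in $ids; do traces sync "$id" & done
wait

# The trace IDs are now attached to the commit:
git notes --ref=traces show HEAD

# The installer (item 11) would wire this up with, roughly:
#   git config core.hooksPath .githooks
#   git config --add remote.origin.fetch '+refs/notes/traces:refs/notes/traces'
```

Using `add -f` makes the hook idempotent if a commit is amended and re-hooked; whether to overwrite or append (`git notes append`) is an open design choice.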
GitHub App (separate project, depends on everything above)
12. App that listens for PR webhooks, reads notes from repo, loads traces by external ID, posts comment
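The note-reading step of item 12 can be sketched end to end. The repo setup below is only scaffolding so the sketch runs; the commit-range logic (base..head of the PR) is an assumption about how the App would scope its lookup:

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email app@example.com
git config user.name App

echo a > a; git add a; git commit -qm base
base=$(git rev-parse HEAD)          # stands in for the PR's base commit
echo b > b; git add b; git commit -qm feature
git notes --ref=traces add -m tr_feat1 HEAD   # note left by the post-commit hook

# --- what the App would run after fetching refs/notes/traces ---
for sha in $(git rev-list "$base"..HEAD); do
  git notes --ref=traces show "$sha" 2>/dev/null || true
done | sort -u    # unique external trace IDs referenced by the PR's commits
```

Each resulting ID would then be loaded via the export endpoint and rendered into the PR comment.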
Where do you want to start? Items 1 and 2 are the foundation everything else depends on, and they're scoped to the CLI.