Context
Surfaced during Gemini review of #756 (the develop→main release PR for v0.3.0-rc.1). All findings are in code shipped via PR #743 to main last night — not introduced by #756. Listed here as a single consolidated hardening track.
rc.1 ships to @next for burn-in without blocking on these. Fixing the P1 finding (the hardcoded telemetry key) is a blocker before promoting rc.1 → 0.3.0 stable under @latest.
Release sequencing
- 0.3.0-rc.1 → @next (ship now, burn-in)
- Fix P1 (telemetry key) → 0.3.0-rc.2 or 0.3.0 → @latest
- Bundle P2 items into 0.3.1
- P3 when telemetry or user reports indicate scale is triggering it
Findings
P1 — blocks @latest promotion
Hardcoded telemetry key — `src/lib/telemetry.ts:34`

Write-only key stored in source, violating previously-stated engineering guidelines. Even write-only keys can be abused (flooding, polluting analytics) and can't be rotated without a code change. Move to an env var (`process.env.SQUADS_TELEMETRY_KEY`), with a build-time inject if needed for the public binary.

P2 — fix in v0.3.1

Broad catch block swallows auth errors — `src/commands/credentials.ts:149`

The current catch swallows `gcloud` auth-expired errors, leading to confusing downstream failures when the tool assumes the API is enabled. Re-throw auth errors explicitly.

Aggressive YAML comment stripping — `src/lib/config.ts:75`

Strips at any `#` character, truncating URLs with fragments or string values like `"C#"`. Per the YAML spec, a comment must be preceded by a space, so use `/\s+#/` to locate the comment start.

Brittle LLM task-assignment parsing — `src/lib/workflow.ts:559`

Regex + `includes()` matching on agent names fails when names share substrings (e.g. `lead` matches `team-lead`). Switch the LLM prompt to request JSON-structured task assignments for reliable parsing.

P3 — fix when it becomes a problem

`Promise.all` over squads — `src/commands/run.ts:240`

Running many squads in parallel without concurrency control risks 429 rate limits and local resource exhaustion. Add a concurrency limiter (p-limit or similar) when squad count > 8.
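For the P1 telemetry-key finding, the env-var read could look like the sketch below. Only the `SQUADS_TELEMETRY_KEY` name comes from the finding; the function name and the unset-key behavior (telemetry disabled) are assumptions.

```typescript
// Sketch: read the telemetry key from the environment instead of
// hard-coding it in src/lib/telemetry.ts. getTelemetryKey is a
// hypothetical helper; returning undefined disables telemetry.
function getTelemetryKey(): string | undefined {
  const key = process.env.SQUADS_TELEMETRY_KEY;
  return key && key.length > 0 ? key : undefined;
}
```

For the public binary, any bundler with compile-time substitution can bake the value in at build time, e.g. esbuild's `define` option mapping `process.env.SQUADS_TELEMETRY_KEY` to a string literal (assuming esbuild is the build tool).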
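For the swallowed-auth-errors finding, one shape of the explicit re-throw is sketched below. The `gcloud` error-message patterns and the `checkApiEnabled`/`probe` names are assumptions for illustration, not the actual code in `src/commands/credentials.ts`.

```typescript
// Sketch: classify auth failures and re-throw them instead of
// swallowing everything in a broad catch.
function isAuthError(err: unknown): boolean {
  const msg = err instanceof Error ? err.message : String(err);
  // Assumed patterns for expired/missing gcloud credentials.
  return /reauthentication|invalid_grant|not logged in/i.test(msg);
}

async function checkApiEnabled(probe: () => Promise<boolean>): Promise<boolean> {
  try {
    return await probe();
  } catch (err) {
    if (isAuthError(err)) throw err; // surface expired auth to the caller
    return false; // other failures: treat the API as not enabled, as before
  }
}
```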
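The YAML fix from the `config.ts` finding can be sketched as a single-line stripper. This is a minimal illustration of the `/\s+#/` rule (quoted-scalar handling is deliberately out of scope); the function name is hypothetical.

```typescript
// Sketch: only strip a '#' that actually starts a comment. Per the
// YAML spec, an inline comment must be preceded by whitespace, so
// '#' inside URLs or values like "C#" is left alone.
function stripInlineComment(line: string): string {
  if (line.trimStart().startsWith('#')) return ''; // whole-line comment
  const m = line.match(/\s+#/);
  return m ? line.slice(0, m.index).trimEnd() : line;
}
```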
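For the task-assignment finding, the JSON-structured approach could look like the sketch below. The `{ assignments: [{ agent, task }] }` shape is an assumed prompt contract, and `parseAssignments` is a hypothetical helper; the point is exact-match validation against known agent names instead of substring matching.

```typescript
// Sketch: parse a JSON task-assignment response from the LLM and keep
// only exact agent-name matches ('lead' no longer matches 'team-lead').
interface Assignment { agent: string; task: string; }

function parseAssignments(raw: string, knownAgents: string[]): Assignment[] {
  // Tolerate a markdown code fence around the JSON payload.
  const body = raw.replace(/^```(?:json)?\s*|\s*```$/g, '').trim();
  const data = JSON.parse(body) as { assignments?: Assignment[] };
  const agents = new Set(knownAgents);
  return (data.assignments ?? []).filter(a => agents.has(a.agent));
}
```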
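The P3 limiter can be the p-limit package; for illustration, a minimal self-contained equivalent is sketched below. The limit of 8 mirrors the "squad count > 8" threshold from the finding, and `runSquad` in the usage comment is hypothetical.

```typescript
// Sketch: a minimal concurrency limiter in place of unbounded Promise.all.
function pLimit(concurrency: number) {
  let active = 0;
  const queue: (() => void)[] = [];
  const release = () => {
    active--;
    queue.shift()?.(); // wake the next queued task, if any
  };
  return async function limit<T>(fn: () => Promise<T>): Promise<T> {
    if (active >= concurrency) {
      await new Promise<void>(res => queue.push(res)); // wait for a free slot
    }
    active++;
    try {
      return await fn();
    } finally {
      release();
    }
  };
}

// Usage (hypothetical runSquad):
//   const limit = pLimit(8);
//   await Promise.all(squads.map(s => limit(() => runSquad(s))));
```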