Add: GeoJSON + email support #2
📝 Walkthrough

Adds pluggable email verification (SMTP/Postmark) with environment configuration and templates, removes the Alert↔DataSource relation (sourceId/source removed), adds per-location geographic coordinates and a GeoJSON scalar with resolvers and SQL helpers, expands seed data, and introduces messaging providers, a provider registry, and a nodemailer dependency.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant AuthResolver as Auth Resolver
    participant DB as Database
    participant Registry as Provider Registry
    participant EmailProv as Email Provider (SMTP/Postmark)
    Client->>AuthResolver: requestEmailVerification(email)
    AuthResolver->>DB: check recent verification token (throttle)
    alt rate limited
        AuthResolver-->>Client: RATE_LIMITED error
    else proceed
        AuthResolver->>DB: delete existing tokens
        AuthResolver->>DB: create verification token (24h)
        AuthResolver->>AuthResolver: build verification URL (FRONTEND_URL + token)
        AuthResolver->>AuthResolver: render email template
        AuthResolver->>Registry: getEmailProvider()
        Registry-->>EmailProv: load and return provider instance
        AuthResolver->>EmailProv: send(email, subject, body/html)
        alt send success
            EmailProv-->>AuthResolver: true
            AuthResolver-->>Client: true
        else send failure
            EmailProv-->>AuthResolver: error/false
            AuthResolver-->>Client: INTERNAL_SERVER_ERROR
        end
    end
```
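The flow in the diagram above can be sketched in TypeScript. Everything here (`TokenStore`, the one-minute throttle window, the URL shape) is an illustrative assumption, not the PR's actual resolver:

```typescript
// Rough sketch of the verification flow, not the real resolver:
// `TokenStore`, `EmailSender`, and the throttle window are assumptions.
type TokenStore = {
  findRecent(email: string, withinMs: number): { token: string } | null;
  deleteAll(email: string): void;
  create(email: string, ttlMs: number): string;
};

type EmailSender = (to: string, subject: string, html: string) => boolean;

const DAY_MS = 24 * 60 * 60 * 1000;

function requestEmailVerification(
  email: string,
  store: TokenStore,
  send: EmailSender,
  frontendUrl: string,
): true | "RATE_LIMITED" | "INTERNAL_SERVER_ERROR" {
  // Throttle: refuse if a token was issued very recently.
  if (store.findRecent(email, 60_000)) return "RATE_LIMITED";

  // Replace any outstanding tokens with a fresh 24h one.
  store.deleteAll(email);
  const token = store.create(email, DAY_MS);

  const url = `${frontendUrl}/verify-email?token=${token}`;
  const html = `<p>Click <a href="${url}">here</a> to verify your email.</p>`;
  return send(email, "Verify your email", html) ? true : "INTERNAL_SERVER_ERROR";
}
```

Swapping the in-memory store for the Prisma token queries and the sender for the registry's provider yields the shape the diagram describes.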
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 8
🧹 Nitpick comments (6)
src/resolvers/scalars.resolver.ts (1)

30-32: Consider handling object/list literals in `GeoJSON.parseLiteral` for future input usage.

`GeoJSON` currently appears only as output fields on the `Location` type and is not yet used in any input positions. However, the current `parseLiteral` implementation only accepts `Kind.STRING` and returns `null` for other literal kinds. If `GeoJSON` is later added to input types or mutation arguments, clients will be unable to provide inline object literals like `{ type: "Point", coordinates: [...] }`, and the scalar will silently return `null`. To prevent this issue when the schema evolves, consider either:

- Walking `Kind.OBJECT` and `Kind.LIST` to recursively parse nested structures, or
- Throwing an explicit error for unsupported literal kinds instead of returning `null`

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/resolvers/scalars.resolver.ts` around lines 30 - 32, The GeoJSON.parseLiteral currently only accepts Kind.STRING and returns null for other AST kinds; update GeoJSON.parseLiteral to either recursively handle Kind.OBJECT and Kind.LIST (walk the AST to reconstruct objects, arrays, numbers, booleans, and null) so inline object/list literals like { type: "Point", coordinates: [...] } are parsed correctly, or explicitly throw an error for unsupported literal kinds; locate the parseLiteral implementation in the GeoJSON scalar and implement an AST-to-JS conversion (handling Kind.OBJECT, Kind.LIST, Kind.STRING, Kind.INT/FLOAT, Kind.BOOLEAN, Kind.NULL) or replace the null return with a descriptive throw to avoid silent failures.

src/utils/env.ts (1)
14-15: Consider using `z.enum()` for `EMAIL_PROVIDER` to catch typos early.

Using `z.enum(["smtp", "postmark"])` instead of `z.string()` would provide compile-time type safety and fail-fast validation on startup if an invalid provider is configured.

♻️ Proposed fix

```diff
- EMAIL_PROVIDER: z.string().default("smtp"),
+ EMAIL_PROVIDER: z.enum(["smtp", "postmark"]).default("smtp"),
```

As per coding guidelines: Validate environment variables using Zod in `src/utils/env.ts`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/utils/env.ts` around lines 14 - 15, The EMAIL_PROVIDER schema currently uses z.string().default("smtp"), which won't validate allowed values; replace that with z.enum(["smtp","postmark"]).default("smtp") (i.e., swap z.string() for z.enum([...]) while keeping the default) and update any usages or TypeScript types that expect the old string (references: EMAIL_PROVIDER, z.string()) so the codebase benefits from fail-fast validation and stricter typing.

src/resolvers/auth.resolver.ts (1)
77-79: Consider structured logging and PII awareness.

Logging the user's email address is useful for operational debugging, but consider using structured logging (e.g., with a correlation ID) and ensure this aligns with your data retention policies for logs containing user identifiers.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/resolvers/auth.resolver.ts` around lines 77 - 79, Replace the plain console.log that prints user.email with structured, PII-aware logging: stop emitting raw user.email from the console.log in this resolver (the line using user.email) and instead use the application's structured logger (e.g., processLogger or the project logger) to emit a log object that includes a masked or hashed email (or a boolean indicating presence) and a correlation/request ID (if available) so you retain traceability without logging raw identifiers; update the log call site that currently reads `[AUTH] Verification email sent to ${user.email}` to use the structured logger, maskedEmail/emailHash and correlationId fields.

src/services/messaging/registry.ts (1)
23-23: Use the centralized `env` object for `EMAIL_PROVIDER`.

This reads `process.env.EMAIL_PROVIDER` directly, bypassing the Zod-validated `env` object in `src/utils/env.ts`. Use `env.EMAIL_PROVIDER` for consistency.

♻️ Proposed fix

```diff
+import { env } from "../../utils/env.js";
 import type { EmailProvider, SMSProvider } from "./types.js";
 // ...
 export async function getEmailProvider(): Promise<EmailProvider> {
   if (_emailProvider) return _emailProvider;
-  const providerName = (process.env.EMAIL_PROVIDER ?? "smtp").toLowerCase();
+  const providerName = env.EMAIL_PROVIDER.toLowerCase();
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/services/messaging/registry.ts` at line 23, The code reads process.env.EMAIL_PROVIDER directly when creating providerName; replace that with the Zod-validated env object (env.EMAIL_PROVIDER) to ensure validated configuration. Update the top of the module to import the env object if not already imported, then compute providerName using (env.EMAIL_PROVIDER ?? "smtp").toLowerCase() and remove the direct process.env usage; keep the fallback and toLowerCase call as before.

src/services/messaging/providers/smtp.ts (1)
18-24: Use the centralized `env` object instead of reading `process.env` directly.

The SMTP configuration duplicates defaults that are already validated in `src/utils/env.ts`. Import and use the `env` object to ensure consistency and benefit from Zod validation.

♻️ Proposed fix

```diff
+import { env } from "../../../utils/env.js";
 import nodemailer from "nodemailer";
 import type { Transporter } from "nodemailer";
 import type { EmailProvider, SendEmailOptions } from "../types.js";

 export class SMTPEmailProvider implements EmailProvider {
   private transporter: Transporter;
   private defaultFrom: string;

   constructor() {
-    const host = process.env.SMTP_HOST ?? "";
-    const port = parseInt(process.env.SMTP_PORT ?? "587", 10);
-    const user = process.env.SMTP_USER ?? "";
-    const pass = process.env.SMTP_PASS ?? "";
-    this.defaultFrom =
-      process.env.SMTP_FROM ?? "noreply@clear-platform.org";
+    const host = env.SMTP_HOST ?? "";
+    const port = env.SMTP_PORT;
+    const user = env.SMTP_USER ?? "";
+    const pass = env.SMTP_PASS ?? "";
+    this.defaultFrom = env.SMTP_FROM;
```

As per coding guidelines: Validate environment variables using Zod in `src/utils/env.ts`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/services/messaging/providers/smtp.ts` around lines 18 - 24, Replace direct process.env access in the SMTP provider constructor with values from the centralized validated env object: import and use env.SMTP_HOST, env.SMTP_PORT, env.SMTP_USER, env.SMTP_PASS and env.SMTP_FROM (instead of process.env.SMTP_* and hardcoded defaults) and assign them to the existing variables (host, port, user, pass, this.defaultFrom) in the constructor; remove duplicate default logic since env.ts already provides validated defaults and convert env.SMTP_PORT to a number if necessary before assigning to port.

src/resolvers/location.resolver.ts (1)
35-36: Module-level cache with singleton PrismaClient will persist stale data across requests.The geoCache is defined at module scope and keyed on the singleton PrismaClient instance. Since the PrismaClient is never garbage collected (singleton pattern), the WeakMap will never trigger cleanup, and cached Promises will be reused across all requests. If Location geo data is queried in request A and then modified before request B, the cached Promise from A will incorrectly return stale data in B.
Either use a per-request cache (e.g., store in Context), or add explicit cache invalidation when Location geo data is mutated.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@src/resolvers/location.resolver.ts` around lines 35 - 36, The module-level geoCache (WeakMap<PrismaClient, Map<string, Promise<LocationGeoRow | null>>>) lives across the app because the singleton PrismaClient never gets GC'd, causing stale geo Promises to leak between requests; change this to a per-request cache (store a Map<string, Promise<LocationGeoRow | null>> on the GraphQL Context object and read/write that instead of the module-scoped geoCache) or, if you must keep a shared cache, add explicit invalidation in the Location mutation resolvers (where Location geo is updated) to delete relevant keys from geoCache; update all usages that currently reference geoCache to look up the per-request cache on context (and ensure the resolver functions creating/reading entries use contextCache.get/set rather than the module WeakMap).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.env.example:
- Around line 14-22: The example .env pairs EMAIL_PROVIDER=smtp with
SMTP_HOST=smtp.gmail.com but uses an unrelated SMTP_FROM
(noreply@clear-platform.org); update the example so SMTP_FROM is either blank or
matches the SMTP_USER/verified Gmail address (e.g., set SMTP_FROM to the same
address as SMTP_USER or to a neutral placeholder like your-email@example.com)
and add a short inline comment reminding users that SMTP_FROM must match a
verified mailbox/alias when using Gmail (referencing EMAIL_PROVIDER, SMTP_HOST,
SMTP_USER, SMTP_FROM).
In `@prisma/migrations/20260308114835_remove_alert_sourceid/migration.sql`:
- Around line 1-14: The migration currently drops the Alert table's sourceId
column (see DROP CONSTRAINT "Alert_sourceId_fkey", DROP INDEX
"Alert_sourceId_idx", and ALTER TABLE "Alert" DROP COLUMN "sourceId"), which
will irreversibly delete provenance; add a preflight guard or backfill step
before the DROP: either 1) add a SQL preflight check (e.g., a DO $$ BEGIN ... IF
EXISTS (SELECT 1 FROM "Alert" WHERE "sourceId" IS NOT NULL) THEN RAISE EXCEPTION
... END IF; END $$;) to abort the migration if any non-null sourceId exists, or
2) implement a backfill in this migration that migrates/archives non-null
sourceId values into the new schema/location before executing DROP
CONSTRAINT/DROP INDEX/DROP COLUMN, and only proceed to drop after verifying the
backfill completed successfully.
In `@prisma/seed.ts`:
- Line 173: Remove the unused det10 binding from the destructuring of the
Promise.all result (the const [det1, det2, ..., det10] = await
Promise.all([...]) line) — either drop the 10th variable entirely or rename it
to an ignored variable like _det10 so `@typescript-eslint/no-unused-vars` is no
longer triggered; update only the destructuring to match the number of promised
results returned by Promise.all.
In `@src/resolvers/location.resolver.ts`:
- Around line 200-209: The point and boundary field resolvers call
fetchLocationGeo and then JSON.parse on geo.point_geojson / geo.boundary_geojson
which can throw on malformed JSON; wrap each JSON.parse in a try-catch (or use a
shared safeParseJSON helper) inside the point and boundary resolver functions to
catch parsing errors, log the error (including parent.id and which field
failed), and return null on failure instead of letting the exception propagate.
- Around line 51-66: The implicit any error comes from the .then callback of
prisma.$queryRaw; explicitly type the promise and the .then parameter: declare
the promise with a concrete Promise type (e.g., Promise<LocationGeoRow | null>)
for the const promise and annotate the .then callback parameter as (rows:
LocationGeoRow[]) so TypeScript knows the row type; ensure cache.set(id,
promise) still receives that typed promise and adjust generics on
prisma.$queryRaw<LocationGeoRow[]> if necessary to match these annotations.
- Line 3: The import in location.resolver.ts currently pulls PrismaClient from
the generated gitignored path; change it to import the PrismaClient type from
your Prisma singleton module by exporting the type from src/lib/prisma.ts (add
`export type { PrismaClient }`) and then update the import in
src/resolvers/location.resolver.ts to import PrismaClient from that singleton
module; apply the same change to other files mentioned (src/context.ts,
notification.resolver.ts, detection.resolver.ts) so all PrismaClient types come
from the src/lib/prisma.ts singleton.
In `@src/services/messaging/providers/postmark.ts`:
- Around line 71-73: The console.log in the Postmark provider is printing PII
(recipient emails) — update the logging in the send routine (the console.log
that references options.to and any other logs in this file that print email
addresses, e.g., the blocks around lines currently showing 32-34, 92-94, 98-100)
to either omit the full address or mask it (e.g., show only domain or first/last
character) and use an appropriate log level for production (e.g., debug) instead
of unmasked info; locate the send function and replace references to
options.to/from in logs with a maskedEmail helper or remove them, and ensure any
error logs preserve non-PII context while not exposing full recipient addresses.
In `@src/services/messaging/registry.ts`:
- Line 54: The code reads process.env.SMS_PROVIDER via providerName in
registry.ts but SMS_PROVIDER is not declared in the Zod env schema; open the Zod
schema where EMAIL_PROVIDER is defined (the env schema variable that contains
EMAIL_PROVIDER) and add SMS_PROVIDER: z.string().default("twilio") to that
schema so the env validation supplies the default and TypeScript types include
SMS_PROVIDER; adjust any exported env/types usage if necessary to pick up the
new property.
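The masking suggested for the auth and Postmark log sites above could use a small helper along these lines (`maskEmail` is an illustrative name, not an existing utility in the codebase):

```typescript
// Keep the first character of the local part and the full domain,
// e.g. "alice@example.com" -> "a***@example.com".
function maskEmail(email: string): string {
  const at = email.indexOf("@");
  if (at <= 0) return "***"; // malformed or empty local part: hide everything
  return `${email[0]}***${email.slice(at)}`;
}
```

The log sites would then emit `maskEmail(user.email)` (or `maskEmail(options.to)`) instead of the raw address.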
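The AST walk proposed for the GeoJSON scalar might look like the following sketch. The `ValueNode` type here is a minimal stand-in mirroring graphql-js AST node shapes (real code would import `Kind` and `ValueNode` from the `graphql` package), and `parseGeoJSONLiteral` is a hypothetical name:

```typescript
// Minimal stand-ins for graphql-js AST value nodes.
type ValueNode =
  | { kind: "StringValue"; value: string }
  | { kind: "IntValue"; value: string }
  | { kind: "FloatValue"; value: string }
  | { kind: "BooleanValue"; value: boolean }
  | { kind: "NullValue" }
  | { kind: "ListValue"; values: ValueNode[] }
  | { kind: "ObjectValue"; fields: { name: { value: string }; value: ValueNode }[] };

// Recursively convert a literal AST node into a plain JS value.
function astToJs(ast: ValueNode): unknown {
  switch (ast.kind) {
    case "StringValue": return ast.value;
    case "IntValue":
    case "FloatValue": return Number(ast.value);
    case "BooleanValue": return ast.value;
    case "NullValue": return null;
    case "ListValue": return ast.values.map(astToJs);
    case "ObjectValue": {
      const obj: Record<string, unknown> = {};
      for (const f of ast.fields) obj[f.name.value] = astToJs(f.value);
      return obj;
    }
  }
}

// parseLiteral: keep accepting a JSON string at the top level, walk
// object/list literals, and throw (rather than return null) otherwise.
function parseGeoJSONLiteral(ast: ValueNode): unknown {
  if (ast.kind === "StringValue") return JSON.parse(ast.value);
  if (ast.kind === "ObjectValue" || ast.kind === "ListValue") return astToJs(ast);
  throw new TypeError(`GeoJSON cannot parse literal of kind ${ast.kind}`);
}
```

Note the split: a top-level string literal is still treated as serialized JSON, while strings nested inside an object literal stay plain strings.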
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: c3f9e0f9-f480-4eee-bf10-4cfa498412f8
⛔ Files ignored due to path filters (1)
`bun.lock` is excluded by `!**/*.lock`
📒 Files selected for processing (23)
- .env.example
- package.json
- prisma/migrations/20260308114835_remove_alert_sourceid/migration.sql
- prisma/schema.prisma
- prisma/seed.ts
- src/resolvers/alert.resolver.ts
- src/resolvers/auth.resolver.ts
- src/resolvers/dataSource.resolver.ts
- src/resolvers/location.resolver.ts
- src/resolvers/scalars.resolver.ts
- src/schema/typeDefs/mutation.ts
- src/schema/typeDefs/scalars.ts
- src/schema/typeDefs/types/alert.ts
- src/schema/typeDefs/types/dataSource.ts
- src/schema/typeDefs/types/location.ts
- src/services/messaging/index.ts
- src/services/messaging/providers/postmark.ts
- src/services/messaging/providers/smtp.ts
- src/services/messaging/providers/twilio-sms.ts
- src/services/messaging/registry.ts
- src/services/messaging/templates.ts
- src/services/messaging/types.ts
- src/utils/env.ts
💤 Files with no reviewable changes (5)
- src/schema/typeDefs/types/dataSource.ts
- src/resolvers/dataSource.resolver.ts
- src/schema/typeDefs/types/alert.ts
- src/resolvers/alert.resolver.ts
- prisma/schema.prisma
```dotenv
# Email Provider: "smtp" or "postmark"
EMAIL_PROVIDER=smtp

# SMTP Configuration (when EMAIL_PROVIDER=smtp)
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=
SMTP_PASS=
SMTP_FROM=noreply@clear-platform.org
```
Don't pair Gmail SMTP with an unrelated default SMTP_FROM.
With EMAIL_PROVIDER=smtp, this becomes the default copy/paste path, and smtp.gmail.com plus SMTP_FROM=noreply@clear-platform.org is a bad combination for most developers. Gmail usually requires the sender to match a verified mailbox or alias, so this example will often fail until they notice the mismatch.
📧 Safer example

```diff
 # SMTP Configuration (when EMAIL_PROVIDER=smtp)
 SMTP_HOST=smtp.gmail.com
 SMTP_PORT=587
 SMTP_USER=
 SMTP_PASS=
-SMTP_FROM=noreply@clear-platform.org
+# Must be a verified sender for your SMTP provider; with Gmail this usually matches SMTP_USER
+SMTP_FROM=your-account@gmail.com
```

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

```dotenv
# Email Provider: "smtp" or "postmark"
EMAIL_PROVIDER=smtp
# SMTP Configuration (when EMAIL_PROVIDER=smtp)
SMTP_HOST=smtp.gmail.com
SMTP_PORT=587
SMTP_USER=
SMTP_PASS=
# Must be a verified sender for your SMTP provider; with Gmail this usually matches SMTP_USER
SMTP_FROM=your-account@gmail.com
```
🧰 Tools
🪛 dotenv-linter (4.0.0)
[warning] 21-21: [UnorderedKey] The SMTP_PASS key should go before the SMTP_PORT key
(UnorderedKey)
[warning] 22-22: [UnorderedKey] The SMTP_FROM key should go before the SMTP_HOST key
(UnorderedKey)
```sql
/*
  Warnings:

  - You are about to drop the column `sourceId` on the `Alert` table. All the data in the column will be lost.

*/
-- DropForeignKey
ALTER TABLE "Alert" DROP CONSTRAINT "Alert_sourceId_fkey";

-- DropIndex
DROP INDEX "Alert_sourceId_idx";

-- AlterTable
ALTER TABLE "Alert" DROP COLUMN "sourceId";
```
Backfill or guard existing Alert.sourceId data before dropping it.
This migration permanently deletes alert-source provenance. If any deployed environment still has non-null sourceId values, rollout becomes destructive with no preservation step here. Please add a backfill path or at least an explicit preflight abort until the old data is safely migrated.
🛑 Example preflight guard if a backfill is not ready yet

```sql
DO $$
BEGIN
  IF EXISTS (SELECT 1 FROM "Alert" WHERE "sourceId" IS NOT NULL) THEN
    RAISE EXCEPTION 'Backfill Alert.sourceId before applying this migration';
  END IF;
END $$;
```
```ts
// ─── Detections ──────────────────────────────────────────────────────────
// Each detection is an independent raw signal from a data source.
// The pipeline promotes them: Detection → Signal → Event → Alert
const [det1, det2, det3, det4, det5, det6, det7, det8, det9, det10] = await Promise.all([
```
Drop the unused det10 binding.
Line 173 still destructures det10, but nothing below reads it, so @typescript-eslint/no-unused-vars will keep flagging this file. Either stop binding the 10th result or rename it to an ignored _-prefixed variable.
✂️ Minimal fix

```diff
- const [det1, det2, det3, det4, det5, det6, det7, det8, det9, det10] = await Promise.all([
+ const [det1, det2, det3, det4, det5, det6, det7, det8, det9] = await Promise.all([
```

📝 Committable suggestion

```ts
const [det1, det2, det3, det4, det5, det6, det7, det8, det9] = await Promise.all([
```
🧰 Tools
🪛 ESLint
[error] 173-173: 'det10' is assigned a value but never used. Allowed unused vars must match /^_/u.
(@typescript-eslint/no-unused-vars)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@prisma/seed.ts` at line 173, Remove the unused det10 binding from the
destructuring of the Promise.all result (the const [det1, det2, ..., det10] =
await Promise.all([...]) line) — either drop the 10th variable entirely or
rename it to an ignored variable like _det10 so
`@typescript-eslint/no-unused-vars` is no longer triggered; update only the
destructuring to match the number of promised results returned by Promise.all.
```diff
@@ -1,5 +1,6 @@
 import { GraphQLError } from "graphql";
 import type { Context } from "../context.js";
+import type { PrismaClient } from "../generated/prisma/client.js";
```
🧩 Analysis chain

🏁 Script executed:

```bash
#!/bin/bash
# Check how PrismaClient is imported/exported in the codebase
echo "=== Checking PrismaClient imports in other resolvers ==="
rg -n "import.*PrismaClient" --type ts
echo ""
echo "=== Checking src/lib/prisma.ts exports ==="
cat src/lib/prisma.ts 2>/dev/null || echo "File not found"
```

Repository: CLEAR-Initiative/clear-api
Length of output: 950
🏁 Script executed:

```bash
#!/bin/bash
echo "=== How is PrismaClient type used in location.resolver.ts? ==="
rg -A 5 -B 2 "PrismaClient" src/resolvers/location.resolver.ts
echo ""
echo "=== Check for other imports from \`@prisma/client\` ==="
rg -n "@prisma/client" --type ts
echo ""
echo "=== Check if src/lib/prisma.ts exports PrismaClient type ==="
rg -n "export.*PrismaClient" src/lib/prisma.ts
echo ""
echo "=== Verify the directory structure around generated/ ==="
ls -la src/ | grep -E "(generated|lib)"
```

Repository: CLEAR-Initiative/clear-api
Length of output: 1205
🏁 Script executed:

```bash
#!/bin/bash
echo "=== Check src/context.ts ==="
cat -n src/context.ts
echo ""
echo "=== Search for type imports in src/ ==="
rg -n "import type" src/ --type ts | head -20
```

Repository: CLEAR-Initiative/clear-api
Length of output: 4618
Fix the Prisma Client type import to use the singleton module.
The import path ../generated/prisma/client.js references a gitignored directory that doesn't exist in CI, causing the pipeline failure. Per coding guidelines, Prisma imports should go through the singleton at src/lib/prisma.ts.
The fix is to export the PrismaClient type from src/lib/prisma.ts and import it from there:
Required changes

- Update `src/lib/prisma.ts` to export the type:

```diff
 import { PrismaPg } from "@prisma/adapter-pg";
 import { PrismaClient } from "../generated/prisma/client.js";
+export type { PrismaClient };

 const globalForPrisma = globalThis as unknown as {
```

- Update `src/resolvers/location.resolver.ts`:

```diff
-import type { PrismaClient } from "../generated/prisma/client.js";
+import type { PrismaClient } from "../lib/prisma.js";
```

Note: This pattern also applies to src/context.ts (line 2) and other resolvers (notification.resolver.ts, detection.resolver.ts) that import types from the gitignored generated path.
📝 Committable suggestion

```ts
import type { PrismaClient } from "../lib/prisma.js";
```
🧰 Tools
🪛 GitHub Actions: CI
[error] 3-3: Cannot find module '../generated/prisma/client.js' or its corresponding type declarations.
```ts
const promise = prisma
  .$queryRaw<LocationGeoRow[]>`
    SELECT
      ST_Y("point"::geometry) as lat,
      ST_X("point"::geometry) as lng,
      ST_AsGeoJSON("point"::geometry) as point_geojson,
      ST_AsGeoJSON("boundary") as boundary_geojson,
      "pointType" as point_type
    FROM "Location"
    WHERE "id" = ${id}
  `
  .then((rows) => rows[0] ?? null);

cache.set(id, promise);
return promise;
}
```
Add explicit type annotation to fix implicit any error.
The pipeline reports: Parameter 'rows' implicitly has an 'any' type at line 62. The $queryRaw generic type isn't being properly inferred in the .then() callback.
🔧 Proposed fix

```diff
 const promise = prisma
   .$queryRaw<LocationGeoRow[]>`
     SELECT
       ST_Y("point"::geometry) as lat,
       ST_X("point"::geometry) as lng,
       ST_AsGeoJSON("point"::geometry) as point_geojson,
       ST_AsGeoJSON("boundary") as boundary_geojson,
       "pointType" as point_type
     FROM "Location"
     WHERE "id" = ${id}
   `
-  .then((rows) => rows[0] ?? null);
+  .then((rows: LocationGeoRow[]) => rows[0] ?? null);
```

📝 Committable suggestion

```ts
const promise = prisma
  .$queryRaw<LocationGeoRow[]>`
    SELECT
      ST_Y("point"::geometry) as lat,
      ST_X("point"::geometry) as lng,
      ST_AsGeoJSON("point"::geometry) as point_geojson,
      ST_AsGeoJSON("boundary") as boundary_geojson,
      "pointType" as point_type
    FROM "Location"
    WHERE "id" = ${id}
  `
  .then((rows: LocationGeoRow[]) => rows[0] ?? null);

cache.set(id, promise);
return promise;
}
```
🧰 Tools
🪛 GitHub Actions: CI
[error] 62-62: Parameter 'rows' implicitly has an 'any' type.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/resolvers/location.resolver.ts` around lines 51 - 66, The implicit any
error comes from the .then callback of prisma.$queryRaw; explicitly type the
promise and the .then parameter: declare the promise with a concrete Promise
type (e.g., Promise<LocationGeoRow | null>) for the const promise and annotate
the .then callback parameter as (rows: LocationGeoRow[]) so TypeScript knows the
row type; ensure cache.set(id, promise) still receives that typed promise and
adjust generics on prisma.$queryRaw<LocationGeoRow[]> if necessary to match
these annotations.
```ts
point: async (parent: { id: string }, _args: unknown, { prisma }: Context) => {
  const geo = await fetchLocationGeo(prisma, parent.id);
  if (!geo?.point_geojson) return null;
  return JSON.parse(geo.point_geojson) as unknown;
},
boundary: async (parent: { id: string }, _args: unknown, { prisma }: Context) => {
  const geo = await fetchLocationGeo(prisma, parent.id);
  if (!geo?.boundary_geojson) return null;
  return JSON.parse(geo.boundary_geojson) as unknown;
},
```
Wrap JSON.parse in try-catch for malformed GeoJSON.
If the database contains invalid JSON in `point_geojson` or `boundary_geojson`, `JSON.parse` will throw and crash the resolver. Consider defensive parsing.
🛡️ Suggested defensive approach

```diff
 point: async (parent: { id: string }, _args: unknown, { prisma }: Context) => {
   const geo = await fetchLocationGeo(prisma, parent.id);
   if (!geo?.point_geojson) return null;
-  return JSON.parse(geo.point_geojson) as unknown;
+  try {
+    return JSON.parse(geo.point_geojson) as unknown;
+  } catch {
+    console.error(`[Location] Invalid point GeoJSON for id=${parent.id}`);
+    return null;
+  }
 },
 boundary: async (parent: { id: string }, _args: unknown, { prisma }: Context) => {
   const geo = await fetchLocationGeo(prisma, parent.id);
   if (!geo?.boundary_geojson) return null;
-  return JSON.parse(geo.boundary_geojson) as unknown;
+  try {
+    return JSON.parse(geo.boundary_geojson) as unknown;
+  } catch {
+    console.error(`[Location] Invalid boundary GeoJSON for id=${parent.id}`);
+    return null;
+  }
 },
```

📝 Committable suggestion
```ts
point: async (parent: { id: string }, _args: unknown, { prisma }: Context) => {
  const geo = await fetchLocationGeo(prisma, parent.id);
  if (!geo?.point_geojson) return null;
  try {
    return JSON.parse(geo.point_geojson) as unknown;
  } catch {
    console.error(`[Location] Invalid point GeoJSON for id=${parent.id}`);
    return null;
  }
},
boundary: async (parent: { id: string }, _args: unknown, { prisma }: Context) => {
  const geo = await fetchLocationGeo(prisma, parent.id);
  if (!geo?.boundary_geojson) return null;
  try {
    return JSON.parse(geo.boundary_geojson) as unknown;
  } catch {
    console.error(`[Location] Invalid boundary GeoJSON for id=${parent.id}`);
    return null;
  }
},
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/resolvers/location.resolver.ts` around lines 200 - 209, The point and
boundary field resolvers call fetchLocationGeo and then JSON.parse on
geo.point_geojson / geo.boundary_geojson which can throw on malformed JSON; wrap
each JSON.parse in a try-catch (or use a shared safeParseJSON helper) inside the
point and boundary resolver functions to catch parsing errors, log the error
(including parent.id and which field failed), and return null on failure instead
of letting the exception propagate.
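The shared helper the prompt mentions could look like the sketch below. `safeParseJSON` is a hypothetical name taken from the review suggestion, not existing code; the resolvers would call it in place of the bare `JSON.parse`:

```typescript
// Hypothetical shared helper: parse a GeoJSON string defensively,
// returning null instead of throwing on malformed or empty input.
function safeParseJSON(raw: string | null | undefined): unknown {
  if (!raw) return null;
  try {
    return JSON.parse(raw);
  } catch {
    return null;
  }
}

// A resolver body would then be: return safeParseJSON(geo.point_geojson);
const point = safeParseJSON('{"type":"Point","coordinates":[30.52,50.45]}');
const broken = safeParseJSON('{"type":"Point",'); // malformed input, yields null
console.log(point, broken);
```

Logging of `parent.id` and the failing field, as the prompt asks, would go inside the resolver's null branch rather than the helper, keeping the helper reusable.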
```ts
console.log(
  `[POSTMARK] Sending email: to=${options.to}, from=${from}, subject="${options.subject}"`,
);
```
Avoid logging PII (recipient email addresses).
Logging `options.to` exposes recipient email addresses in logs, which creates compliance/privacy risks (GDPR, CCPA). Consider masking or omitting email addresses from production logs.
Similarly, lines 32-34, 92-94, and 98-100 log email addresses, which should be reviewed for production log levels.
🛡️ Suggested approach

```diff
-console.log(
-  `[POSTMARK] Sending email: to=${options.to}, from=${from}, subject="${options.subject}"`,
-);
+console.log(
+  `[POSTMARK] Sending email: subject="${options.subject}"`,
+);
```

📝 Committable suggestion
```ts
console.log(
  `[POSTMARK] Sending email: subject="${options.subject}"`,
);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/services/messaging/providers/postmark.ts` around lines 71 - 73, The
console.log in the Postmark provider is printing PII (recipient emails) — update
the logging in the send routine (the console.log that references options.to and
any other logs in this file that print email addresses, e.g., the blocks around
lines currently showing 32-34, 92-94, 98-100) to either omit the full address or
mask it (e.g., show only domain or first/last character) and use an appropriate
log level for production (e.g., debug) instead of unmasked info; locate the send
function and replace references to options.to/from in logs with a maskedEmail
helper or remove them, and ensure any error logs preserve non-PII context while
not exposing full recipient addresses.
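A masking helper of the kind the prompt suggests might look like this. The `maskEmail` name and the exact masking scheme (first character plus domain) are assumptions, not existing code:

```typescript
// Hypothetical helper: keep the first character and the domain,
// mask the rest of the local part; fall back to "***" for non-addresses.
function maskEmail(address: string): string {
  const at = address.indexOf("@");
  if (at <= 0) return "***";
  return `${address[0]}***${address.slice(at)}`;
}

// The provider log line would then avoid raw PII, e.g.:
// console.log(`[POSTMARK] Sending email: to=${maskEmail(options.to)}, subject="${options.subject}"`);
console.log(maskEmail("alice@example.com")); // "a***@example.com"
```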
```ts
export async function getSMSProvider(): Promise<SMSProvider> {
  if (_smsProvider) return _smsProvider;

  const providerName = (process.env.SMS_PROVIDER ?? "twilio").toLowerCase();
```
🧩 Analysis chain
🏁 Script executed:

```bash
#!/bin/bash
# Description: Check if SMS_PROVIDER is defined in env.ts
rg -n 'SMS_PROVIDER' src/utils/env.ts
```

Repository: CLEAR-Initiative/clear-api
Length of output: 52

🏁 Script executed:

```bash
cat -n src/utils/env.ts
```

Repository: CLEAR-Initiative/clear-api
Length of output: 1290

🏁 Script executed:

```bash
rg -n 'EMAIL_PROVIDER|SMS_PROVIDER' src/ -A 2 -B 2
```

Repository: CLEAR-Initiative/clear-api
Length of output: 1714

🏁 Script executed:

```bash
git ls-files src/services/messaging/ | head -20
```

Repository: CLEAR-Initiative/clear-api
Length of output: 337
Add `SMS_PROVIDER` to the environment validation schema in `src/utils/env.ts`.
`SMS_PROVIDER` is accessed at line 54 but is missing from the Zod validation schema. `EMAIL_PROVIDER` is properly validated (line 15), so `SMS_PROVIDER` should follow the same pattern with a `"twilio"` default to maintain consistency:

```ts
SMS_PROVIDER: z.string().default("twilio"),
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@src/services/messaging/registry.ts` at line 54, The code reads
process.env.SMS_PROVIDER via providerName in registry.ts but SMS_PROVIDER is not
declared in the Zod env schema; open the Zod schema where EMAIL_PROVIDER is
defined (the env schema variable that contains EMAIL_PROVIDER) and add
SMS_PROVIDER: z.string().default("twilio") to that schema so the env validation
supplies the default and TypeScript types include SMS_PROVIDER; adjust any
exported env/types usage if necessary to pick up the new property.
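The fallback the registry currently relies on can be isolated and exercised on its own. `resolveSmsProvider` is illustrative only; the actual fix belongs in the Zod schema as described above:

```typescript
// Mirrors the registry's current fallback: default to "twilio", normalize case.
// Accepts any env-shaped record so it can be tested without touching process.env.
function resolveSmsProvider(env: Record<string, string | undefined>): string {
  return (env.SMS_PROVIDER ?? "twilio").toLowerCase();
}

console.log(resolveSmsProvider({})); // "twilio"
console.log(resolveSmsProvider({ SMS_PROVIDER: "Vonage" })); // "vonage"
```

Once the schema supplies the default, the registry can read the validated env object instead of `process.env` and drop the inline `?? "twilio"`.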
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/ci.yml:
- Line 21: Restore the commented-out test step by uncommenting the "- run: bun
run test" entry in the CI workflow so tests run as part of the pipeline; if the
test suite requires extra setup or secrets, wrap the test step with a
conditional check on an appropriate env/config (e.g., using if: env.RUN_TESTS ==
'true' or adding required service steps) so tests are executed in CI only when
the necessary environment is available; ensure the step remains named/described
clearly (the "- run: bun run test" step) and that any setup steps required by
tests are included earlier in the workflow.
```diff
       - run: bun run lint
       - run: bun run typecheck
-      - run: bun run test
+      # - run: bun run test
```
Re-enable the test step in CI.
Commenting this out lets PRs merge with only lint/typecheck, so regressions in the new email/GeoJSON paths can slip through untested. If tests currently need extra setup, keep them in CI behind the right env/config instead of disabling them wholesale.
💡 Suggested workflow fix

```diff
-      # - run: bun run test
+      - run: bun run test
```

📝 Committable suggestion
```yaml
      - run: bun run test
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In @.github/workflows/ci.yml at line 21, Restore the commented-out test step by
uncommenting the "- run: bun run test" entry in the CI workflow so tests run as
part of the pipeline; if the test suite requires extra setup or secrets, wrap
the test step with a conditional check on an appropriate env/config (e.g., using
if: env.RUN_TESTS == 'true' or adding required service steps) so tests are
executed in CI only when the necessary environment is available; ensure the step
remains named/described clearly (the "- run: bun run test" step) and that any
setup steps required by tests are included earlier in the workflow.
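If the suite genuinely needs extra setup, one way to keep it in CI behind a flag looks like the fragment below. `RUN_TESTS` is a hypothetical repository variable, not something the workflow defines today:

```yaml
      - run: bun run test
        if: ${{ vars.RUN_TESTS != 'false' }}
```

With the `!= 'false'` comparison the step runs by default (including when the variable is unset), so tests only go dark when someone explicitly opts out.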
Summary by CodeRabbit
New Features
Improvements