Debugging a User Issue with PostHog and Claude Code
A user of LumifyHub reported that their onboarding data wasn’t saving. I couldn’t reproduce it. Their account looked fine in the database. The code looked correct.
I needed to see what actually happened during their session.
The Problem
The user signed up via Google OAuth, went through onboarding, but their preferences weren’t saved. The onboarding API returned success, but the database had no record.
I checked the code. The PATCH endpoint was doing an UPDATE on the user_onboarding table. Straightforward.
const { data, error } = await supabase
  .from("user_onboarding")
  .update(updateData)
  .eq("user_id", userId);
The problem: UPDATE requires the row to exist. For OAuth users who skip the profile creation step, there’s no row to update. The query succeeds but affects zero rows.
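One way to surface this failure mode: in supabase-js v2 you can chain `.select()` onto the update so the matched rows come back, then treat an empty result as an error. The guard below is a hypothetical helper, not part of LumifyHub's code; the name and error message are illustrative.

```typescript
// Hypothetical guard: Supabase reports success even when an UPDATE matches
// nothing. With .select() chained onto the update, `data` holds the affected
// rows, so an empty array means the silent no-op happened.
function assertRowsUpdated<T>(rows: T[] | null, userId: string): T[] {
  if (!rows || rows.length === 0) {
    throw new Error(`user_onboarding update matched no rows for user ${userId}`);
  }
  return rows;
}
```

Called as `assertRowsUpdated(data, userId)` right after the query, this would have turned the invisible bug into a logged error on day one.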
I didn’t figure this out by staring at the code. I figured it out by watching the user’s session.
Building the Analysis Tools
I had PostHog set up with basic pageviews. That wasn’t enough. I needed to see:
- What the user did step by step
- What API calls were made
- What errors (if any) were logged to console
I asked Claude Code to help me build scripts to pull this data from the PostHog API.
First, a user analysis script that combines database and PostHog data:
./scripts/analytics/analyze-user.sh user@example.com
This shows:
- User info from the database (created_at, provider, onboarding status)
- Their onboarding record (or lack thereof)
- Recent PostHog events in chronological order
- Links to their session recordings
Second, a recording analysis script:
./scripts/analytics/analyze-recording.sh <recording-id>
This pulls:
- Session duration, activity metrics
- Console logs, warnings, errors
- Page flow with timestamps
- Detected issues (rage clicks, errors, dead clicks)
Claude Code wrote the bash scripts, handled the PostHog API pagination, formatted the output with colors and sections. I described what I wanted, iterated on the format, and had working tools in under an hour.
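The pagination handling boils down to following PostHog's cursor-style `next` links until they run out. A minimal sketch of that loop (the page fetcher is injected so the logic is testable without network access; the names here are illustrative, not the actual script):

```typescript
// PostHog list endpoints return { results, next } pages, where `next` is
// either a full URL for the following page or null. Collecting everything
// is a simple walk over those links.
type Page<T> = { results: T[]; next: string | null };

async function fetchAll<T>(
  firstUrl: string,
  getPage: (url: string) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let url: string | null = firstUrl;
  while (url) {
    const page = await getPage(url);
    all.push(...page.results); // accumulate this page's events
    url = page.next;           // null terminates the loop
  }
  return all;
}
```

The real scripts do the same thing in bash with `curl` and `jq`, looping while `.next` is non-null.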
Finding the Bug
I ran the user analysis script. No onboarding record in the database, but PostHog showed they completed onboarding. The events were there: onboarding_step_completed, onboarding_finished.
I watched their session recording. They clicked through onboarding, hit save, saw the success state, moved on. From their perspective, everything worked.
Then I enabled console log capture in PostHog and looked at a similar user’s session. No errors. The API returned 200. But in the database, no row was created.
That’s when it clicked: OAuth users don’t go through email verification, which is where the initial user_onboarding row gets created. The UPDATE was returning success while matching zero rows.
The fix was a one-line change from UPDATE to UPSERT:

const { data, error } = await supabase
  .from("user_onboarding")
  .upsert(updateData, {
    onConflict: "user_id",
    ignoreDuplicates: false,
  });
What I Learned
Session recordings are underrated for debugging. I’ve had PostHog for months and never used them. Watching a user’s actual session showed me exactly where the flow broke down.
Claude Code is good at API integration scripts. I gave it the PostHog API docs and described the output format I wanted. It handled auth, pagination, error cases, and colored terminal output. These are throwaway scripts, but they’re useful enough that I’ll keep them.
Console log capture in session recordings is powerful. Enable it. When a user reports an issue, you can see exactly what logged to their console during that session.
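Capture can be toggled per-project in PostHog's recording settings; posthog-js also exposes it as an init option. A sketch, assuming the option name from the posthog-js docs (verify against your SDK version; the key and host are placeholders):

```typescript
import posthog from "posthog-js";

posthog.init("<project-api-key>", {
  api_host: "https://us.posthog.com",
  enable_recording_console_log: true, // include console output in session recordings
});
```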
The bug was in the happy path. OAuth signup is the happy path - no email verification, fewer steps. But that’s exactly where the bug was. The code assumed a database state that only existed for email signups.
The Scripts
If you want to build similar tools, the approach is:
- Get your PostHog API key from Project Settings
- Use the /api/projects/{id}/events endpoint for event queries
- Use /api/projects/{id}/session_recordings for recordings
- Use /api/projects/{id}/session_recordings/{id} for recording details
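As a starting point, a single events call can be sketched directly. The endpoint path and the personal-API-key Bearer header follow PostHog's API docs; the host, project id, and function names are placeholders, and the URL builder is split out so it can be checked without hitting the API:

```typescript
// Build the events URL for one user's distinct_id.
function eventsUrl(host: string, projectId: string, distinctId: string): string {
  return `${host}/api/projects/${projectId}/events?distinct_id=${encodeURIComponent(distinctId)}`;
}

// Fetch that user's recent events with a personal API key
// (not the public project key used by the client SDK).
async function fetchEvents(apiKey: string, distinctId: string) {
  const res = await fetch(eventsUrl("https://us.posthog.com", "12345", distinctId), {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`PostHog API returned ${res.status}`);
  return (await res.json()).results;
}
```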
The scripts are bash + curl + jq. Claude Code can write them for your specific use case. Just describe what data you need and how you want it formatted.
For the database side, I used psql directly with the connection string from my .env.local. The script combines both sources to give a complete picture of the user’s journey.
The whole debugging session - from “I can’t reproduce this” to “deployed fix” - took about two hours. Most of that was building the tooling. Next time it’ll be faster.