Logging and Debugging Cursor Hooks

John Lindquist

Debugging hooks can be confusing at first. Using console.log doesn't work because stdout is reserved for communicating with the AI. You need a different approach for both quick debugging and production-ready logging.

This lesson takes you from quick debugging with console.error to building a robust, persistent logging system using JSONL files. You'll learn why certain outputs are silent, how to see immediate debug information, and how to create a production logging solution with automatic log rotation—all with the AI's help.

The Immediate Fix: Use console.error

The simplest way to see debug output from your hooks is to use console.error instead of console.log. This directs your log messages to standard error (stderr), which Cursor displays in the "Hooks" output panel.

// This will be silent in the output panel
console.log(JSON.stringify(output, null, 2));

// This will appear in the STDERR section of the output panel
console.error(JSON.stringify(output, null, 2));

While effective for quick checks, this method's output is transient and can be lost among other messages. For more reliable debugging, a persistent logging solution is necessary.

Building a Persistent Log File

For production hooks, you need a complete history of what your hooks have done. Writing logs to a dedicated file gives you this history, making complex debugging much easier.

The JSON Lines (.jsonl) format is ideal for hook logging. Each line is a complete, valid JSON object, and you can append new entries indefinitely. This format is both human-readable and easy to process with tools.
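To illustrate the format, here is a quick sketch of reading a JSONL string back into objects. The sample entries are hypothetical; the point is that each line is parsed independently with JSON.parse:

```typescript
// Two hypothetical log entries, one per line (the JSONL format)
const raw = [
  '{"timestamp":"2024-01-01T00:00:00.000Z","exitCode":0}',
  '{"timestamp":"2024-01-01T00:05:00.000Z","exitCode":1}',
].join("\n");

// Each non-empty line is a complete, independent JSON document
const entries = raw
  .split("\n")
  .filter((line) => line.trim().length > 0)
  .map((line) => JSON.parse(line));

console.log(entries.length); // 2
console.log(entries[1].exitCode); // 1
```

Because entries are independent lines, you can append forever without re-parsing or rewriting the whole file, which is exactly what hook logging needs.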

The implementation is straightforward:

  • Create a logs directory in your project
  • Use Node.js's fs/promises and path modules to append data
  • Enrich each log entry with timestamps and the full input payload for context

import { appendFile } from "node:fs/promises";
import { join } from "node:path";

// ... inside your hook logic
const output = {
    timestamp: new Date().toISOString(),
    ...input, // Include the full input payload for context
    stdout: result.stdout.toString(),
    stderr: result.stderr.toString(),
    exitCode: result.exitCode,
};

const logFilePath = join(input.workspace_roots[0]!, "logs", "after-file-edit.jsonl");
await appendFile(logFilePath, JSON.stringify(output) + '\n');

Automating Log Management with AI

Log files grow forever if left unchecked, consuming disk space and becoming unwieldy. Instead of manually implementing log rotation, you can ask the AI to build it for you.

Give the AI your logging code and a clear instruction:

Once there are over 500 JSON objects in this file, please remove the first 100 to keep this log rolling.

The AI will generate code to count entries, trim old ones when the threshold is reached, and keep your logs at a manageable size. You can iterate with the AI to refine the logic—like ensuring the counting method correctly handles only top-level objects, not nested ones.
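The exact code the agent produces will vary, but the core rotation logic can be expressed as a small pure function. This is a sketch using the thresholds from the prompt; in the real hook you would wrap it with `readFile` and `writeFile`:

```typescript
const MAX_ENTRIES = 500; // threshold from the prompt
const TRIM_COUNT = 100; // how many of the oldest entries to drop

// Given the raw JSONL content, drop the oldest entries once the cap is exceeded
function rotateLog(content: string): string {
  const lines = content.split("\n").filter((line) => line.trim().length > 0);
  if (lines.length <= MAX_ENTRIES) return content; // under the cap: leave untouched
  return lines.slice(TRIM_COUNT).join("\n") + "\n"; // keep everything after the first 100
}
```

Splitting on newlines is enough here because JSONL guarantees one object per line; a brace-matching approach is only needed if entries might span multiple lines.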

This demonstrates a powerful workflow: using Cursor's AI to build the very automation that manages Cursor's hooks. The AI becomes your co-pilot for creating production-ready hook systems.

Don't Forget: .gitignore

Always add your logs directory to .gitignore to prevent committing large or sensitive log files:

# .gitignore
logs/

With this complete logging strategy, you have both immediate debugging visibility and a persistent, manageable history. Your hooks become production-ready—reliable, observable, and easy to troubleshoot when issues arise.


Prompts

@index.ts Please add some comments.
Add some more comments
remove all the comments
Add the comments back
remove the comments again
Once there are over 500 JSON objects in this file, please remove the first 100 to keep this log rolling.
Make sure this is only checking an opening brace on a new line to avoid matching nested objects.

Code Snippets

// Use console.error for debugging output in hooks
console.error(JSON.stringify(output, null, 2));
// Import Node.js modules for file operations
import { appendFile, readFile, writeFile } from "node:fs/promises";
import { join } from "node:path";
// Setup for logging to a JSON Lines file
const output = {
    timestamp: new Date().toISOString(),
    ...input,
    stdout: result.stdout.toString(),
    stderr: result.stderr.toString(),
    exitCode: result.exitCode,
};

const logFilePath = join(input.workspace_roots[0]!, "logs", "after-file-edit.jsonl");
await appendFile(logFilePath, JSON.stringify(output) + '\n');
// Refined regex to count top-level JSON objects at the start of a line
const entryCount = (logContent.match(/^\{/gm) || []).length;

[00:00] Now you'll notice in the output, if you use console.log, and we're just going to log out essentially what the result gave us back. I'll open an agent, point to our index, and say "please add some comments", hit enter. You'll notice that the output is empty and there's no log of what happened, even though we are using console.log. That's because console.log is essentially reserved for standard out, which would typically be a way of pushing messages back to the agent, but that's not supported yet in after-file-edit hooks, even though I hope that someday it will be. So what you need to do is switch this over to console.error, and I'll say "add some more comments". Once this runs, you'll see, if we scroll down, that we have standard error in the logs, where we have the entirety of standard out.

[00:49] Standard error for this command was empty, with an exit code of 0, meaning it passed successfully. Now, what I recommend doing is setting up a logs directory somewhere in your project or somewhere on your system. We're going to take the most basic approach and create a logs directory here. Then we're just going to use Node's file system methods: we'll import from node:fs/promises and grab appendFile so that we can set up a log file path.

[01:19] And this can be the input's workspace roots, and we'll grab the first root. Then, to properly resolve this, let's also import some path utils: we'll grab join from node:path and refactor this. And since we know workspace_roots[0] is always defined, we'll just go ahead and force it with a non-null assertion.

[01:39] And the format I'm going to use here: instead of biome.log, we'll just call this after-file-edit. And instead of .log, we're going to use a format called JSONL, which is simply lines of JSON objects. So instead of one large JSON file, every line in it is a separate JSON object. When we append to this file, we're just appending JSON objects, which are the result of our formatter. So now if I say "remove all the comments" and hit enter, we can check our logs.

[02:10] You'll see an after-file-edit.jsonl file, and every time this runs we get a new JSON object. I'll say "add the comments back", and you see we get JSON objects every single time. Now I'm also going to add a timestamp here, just so we can fully capture the history of these, and it certainly makes sense to grab the information off of the input as well. Then I'll say "remove the comments again". And now we have a proper log: every time our after-file-edit hook runs, we get one of these objects. If something goes wrong, we could copy and paste one into an agent and tell it to fix or follow up on whatever went wrong, and we have enough context and data to work with.

[02:53] Now, I fully understand that logs are always more complicated than you think, because you have to worry about file size limits and retention, like whether you should keep old logs, compress them, and store them somewhere. There are a lot of popular logging libraries out there, like Pino. It really depends on what your goals are for the log. For our simple scenario, I'm going to select these lines and ask the agent: once there are over 500 JSON objects in this file, please remove the first 100 to keep this log rolling. Hit enter.

[03:25] Let the agent do all the work for us, and I'll check the work myself. We need to import readFile. Looks like it imported the wrong readFile, so we need readFile from node:fs/promises. Then, back down, we'll fix the appendFile. This looks pretty legit to me as far as counting the JSON objects goes, but I do think this needs to account for...

[03:53] Make sure this is only checking an opening brace on a new line to avoid matching nested objects. And this looks much better than the last time, but I'm not going to worry about it too much. Just be aware that once you start adding logs and agents are updating a lot of TypeScript files, your logs get long. So whether you roll your own log management or you use a library, whatever works best for you. And one last thing: please make sure to add your logs directory to your .gitignore.

[04:25] We haven't initialized this project yet, but once it's initialized, we do not want this being checked in and causing a billion logging conflicts.