log-analyzer

from eddiebe147/claude-settings

npx skills add https://github.com/eddiebe147/claude-settings --skill log-analyzer

SKILL.md

Log Analyzer Skill

Overview

This skill helps you analyze application logs to diagnose issues, track errors, and understand system behavior. It covers log searching, pattern detection, structured logging, and integration with monitoring tools.

Log Analysis Philosophy

Principles

  1. Structure over text: Structured logs are easier to analyze
  2. Context matters: Include relevant metadata
  3. Levels have meaning: Use appropriate severity levels
  4. Correlation is key: Link related events across services (see the sketch below)
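
A small sketch of principles 1 and 4 together, assuming a structured logger like the ones configured later in this skill; the IDs and field names are illustrative:

// Unstructured: hard to filter, and impossible to join with events
// emitted by other services for the same request
console.log('Order ord_42 failed for user usr_123');

// Structured and correlated: every service logs the same correlationId,
// so the whole flow can be reassembled later
logger.error('Order failed', {
  orderId: 'ord_42',
  userId: 'usr_123',
  correlationId: 'req_9f3a', // e.g. propagated via an x-correlation-id header
  service: 'checkout',
});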

Log Levels

Level | When to Use                       | Example
------|-----------------------------------|------------------------------
ERROR | Something failed, needs attention | Database connection failed
WARN  | Unexpected but handled            | Retry succeeded after failure
INFO  | Normal operation milestones       | User signed up
DEBUG | Detailed diagnostic info          | Query executed in 50ms
TRACE | Very detailed, usually disabled   | Function entered/exited
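
The rows above map directly onto logger calls; a brief illustration (messages and metadata are examples only, and Winston's default npm levels stop at debug, so trace is omitted here):

logger.error('Database connection failed', { host: 'db-primary', attempt: 3 });
logger.warn('Retry succeeded after failure', { attempts: 2 });
logger.info('User signed up', { userId: 'usr_123' });
logger.debug('Query executed', { durationMs: 50 });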

Structured Logging

Winston Configuration (Node.js)

// src/lib/logger.ts
import winston from 'winston';

const logger = winston.createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.errors({ stack: true }),
    winston.format.json()
  ),
  defaultMeta: {
    service: 'my-app',
    environment: process.env.NODE_ENV,
  },
  transports: [
    // Console for development
    new winston.transports.Console({
      format: process.env.NODE_ENV === 'development'
        ? winston.format.combine(
            winston.format.colorize(),
            winston.format.simple()
          )
        : winston.format.json(),
    }),
  ],
});

// Add request context
export function createRequestLogger(requestId: string, userId?: string) {
  return logger.child({
    requestId,
    userId,
  });
}

export { logger };
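
A minimal usage sketch for the request-scoped logger above; the import path, IDs, and field values are illustrative:

import { randomUUID } from 'crypto';
import { createRequestLogger } from './logger';

// One child logger per request: every entry carries requestId and userId
const log = createRequestLogger(randomUUID(), 'usr_123');

log.info('Handling checkout request', { cartId: 'cart_456' });
log.error('Payment provider timeout', { provider: 'stripe', timeoutMs: 5000 });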

Pino Logger (High Performance)

// src/lib/logger.ts
import pino from 'pino';

export const logger = pino({
  level: process.env.LOG_LEVEL || 'info',
  transport: process.env.NODE_ENV === 'development'
    ? {
        target: 'pino-pretty',
        options: {
          colorize: true,
          translateTime: 'SYS:standard',
        },
      }
    : undefined,
  base: {
    service: 'my-app',
    env: process.env.NODE_ENV,
  },
  redact: {
    paths: ['password', 'token', 'apiKey', '*.password', '*.token'],
    censor: '[REDACTED]',
  },
});

// Request-scoped logger
export function requestLogger(requestId: string, userId?: string) {
  return logger.child({ requestId, userId });
}
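
A quick check of the redact configuration above; with Pino, the merged object comes first and the message second (values are illustrative):

import { logger } from './logger';

// Fields matching the redact paths are masked in the emitted JSON
logger.info({ email: 'user@example.com', password: 'hunter2' }, 'Login attempt');
// => {"level":30,...,"email":"user@example.com","password":"[REDACTED]","msg":"Login attempt"}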

Logging Best Practices

// DO: Include context
logger.info('User signed up', {
  userId: user.id,
  email: user.email,
  plan: 'free',
  source: 'web',
});

// DO: Log errors with stack traces
logger.error('Payment failed', {
  error: err.message,
  stack: err.stack,
  userId: user.id,
  amount: 99.99,
  provider: 'stripe',
});

// DO: Use appropriate levels
logger.debug('Database query', {
  query: 'SELECT * FROM users WHERE id = ?',
  duration: 45,
  rows: 1,
});

// DON'T: Log sensitive data
// BAD: logger.info('Login', { password: user.password })

// DON'T: Use string concatenation
// BAD: logger.info('User ' + user.id + ' signed up')

Log Searching & Filtering

Command Line (grep, jq)

# Search JSON logs with jq
cat logs.json | jq 'select(.level == "error")'

# Search for specific user
cat logs.json | jq 'select(.userId == "usr_123")'

# Search by time range
cat logs.json | jq 'select(.timestamp >= "2024-01-15T10:00:00")'

# Count errors by type
cat logs.json | jq 'select(.level == "error") | .error.code' | sort | uniq -c

# Extract specific fields
cat logs.json | jq '{timestamp, level, message, userId}'

# Search text logs
grep -i "error" logs.txt
grep -E "error|warn" logs.txt
grep -A5 "ERROR" logs.txt  # 5 lines after match
grep -B3 "ERROR" logs.txt  # 3 lines before match

Vercel Log Analysis

# View live logs
vercel logs --follow

# Filter by level
vercel logs --level error

# Search specific timeframe
vercel logs --since 2h

# Filter by deployment
vercel logs --deployment-url https://myapp-abc123.vercel.app

# Output JSON for processing
vercel logs --output json | jq 'select(.level == "error")'

Supabase Log Analysis

-- Query Postgres logs
SELECT *
FROM postgres_logs
WHERE timestamp > now() - interval '1 hour'
  AND error_severity = 'ERROR'
ORDER BY timestamp DESC
LIMIT 100;

-- Query auth logs
SELECT *
FROM auth.audit_log_entries
WHERE created_at > now() - interval '24 hours'
  AND payload->>'action' = 'login'
ORDER BY created_at DESC;

-- Find slow queries
SELECT
  query,
  calls,
  mean_exec_time,
  total_exec_time
FROM pg_stat_statements
ORDER BY mean_exec_time DESC
LIMIT 20;

Pattern Detection

Common Error Patterns

// Error pattern analyzer
interface ErrorPattern {
  pattern: RegExp;
  category: string;
  severity: 'critical' | 'high' | 'medium' | 'low';
}
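
A minimal sketch of how such patterns might be applied to log lines; the regexes and categories below are examples, not the original implementation:

// Illustrative pattern list
const patterns: ErrorPattern[] = [
  { pattern: /ECONNREFUSED|ETIMEDOUT/, category: 'network', severity: 'high' },
  { pattern: /deadlock detected/i, category: 'database', severity: 'critical' },
  { pattern: /rate limit exceeded/i, category: 'throttling', severity: 'medium' },
];

// Returns the first matching pattern for a log line, or undefined
export function categorizeLogLine(line: string): ErrorPattern | undefined {
  return patterns.find((p) => p.pattern.test(line));
}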
