# Timeout Prevention Skill

Install the skill with:

```bash
npx skills add https://github.com/jackspace/claudeskillz --skill timeout-prevention
```
Expert strategies for preventing request timeouts in Claude Code by breaking down long operations into manageable chunks with progress tracking and recovery mechanisms.
## When to Use This Skill
Use when:
- Performing bulk operations (100+ files, repos, etc.)
- Running long commands (>60 seconds)
- Processing large datasets
- Downloading multiple files sequentially
- Executing complex multi-step workflows
- Working with slow external APIs
- Any operation that risks timing out
## Understanding Timeout Limits

### Claude Code Timeout Behavior

- Default tool timeout: 120 seconds (2 minutes)
- Request timeout: applies to the entire response cycle, not just a single tool call
- What causes timeouts:
  - Long-running bash commands
  - Large file operations
  - Sequential processing of many items
  - Network operations without explicit timeout settings
  - Blocking operations that produce no progress output
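Before committing to a long run, it helps to estimate whether the work fits under that budget at all. A minimal sketch, assuming a `process_file` command like the one used in the examples below: time one representative item, then extrapolate.

```bash
# Count the items and grab one sample to measure
ITEM_COUNT=$(find . -name "*.txt" | wc -l)
SAMPLE=$(find . -name "*.txt" | head -1)

# Time a single item using bash's built-in $SECONDS counter
START=$SECONDS
process_file "$SAMPLE"   # assumed helper; replace with your actual command
PER_ITEM=$(( SECONDS - START ))

# Extrapolate and warn if the total is too close to the ~120s default
ESTIMATE=$(( PER_ITEM * ITEM_COUNT ))
echo "Roughly ${ESTIMATE}s for $ITEM_COUNT items"
[ "$ESTIMATE" -gt 100 ] && echo "Too close to the limit: chunk this work into batches."
```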
## Core Prevention Strategies

### 1. Task Chunking

Break large operations into smaller batches:

```bash
# ❌ BAD: Process all 1000 files at once
for file in $(find . -name "*.txt"); do
    process_file "$file"
done

# ✅ GOOD: Process in batches of 10
find . -name "*.txt" | head -10 | while read file; do
    process_file "$file"
done
# Then process next 10, etc.
```
### 2. Progress Checkpoints

Save progress to resume if interrupted:

```bash
# Create checkpoint file
CHECKPOINT_FILE="progress.txt"

# Save completed items
echo "completed_item_1" >> "$CHECKPOINT_FILE"

# Skip already completed items
if grep -q "completed_item_1" "$CHECKPOINT_FILE"; then
    echo "Skipping already processed item"
fi
```
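Put together, the checkpoint pattern is usually a single loop that skips recorded items and appends each new item as soon as it succeeds. A minimal sketch, assuming a `process_item` command as in the later examples and placeholder item names:

```bash
CHECKPOINT_FILE="progress.txt"
touch "$CHECKPOINT_FILE"

for item in item_1 item_2 item_3; do   # placeholder item names
    # Skip anything already recorded (whole-line match)
    if grep -qx "$item" "$CHECKPOINT_FILE"; then
        echo "Skipping $item (already processed)"
        continue
    fi
    # Record the item only after it succeeds, so failures are retried next run
    process_item "$item" && echo "$item" >> "$CHECKPOINT_FILE"
done
```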
### 3. Background Processes

Run long operations in the background:

```bash
# Start long-running process in background
long_command > output.log 2>&1 &
echo $! > pid.txt

# Check status later
if ps -p $(cat pid.txt) > /dev/null; then
    echo "Still running"
else
    echo "Completed"
    cat output.log
fi
```
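If blocking on the process is itself a timeout risk, the status check can poll for a bounded time instead of waiting indefinitely. A small sketch using the `pid.txt` and `output.log` files from the example above:

```bash
# Poll for up to ~20 seconds, then return control either way
for _ in $(seq 1 10); do
    ps -p "$(cat pid.txt)" > /dev/null || break
    sleep 2
done

# Peek at the most recent output without waiting for completion
tail -n 20 output.log
```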
### 4. Incremental Processing

Process and report incrementally:

```bash
# Process items one at a time with status updates
TOTAL=100
for i in $(seq 1 10); do  # First 10 of 100
    process_item $i
    echo "Progress: $i/$TOTAL"
done
# Continue with next 10 in subsequent calls
```
### 5. Timeout Configuration

Set explicit timeouts for commands:

```bash
# Use the timeout command
timeout 60s long_command || echo "Command timed out"

# Use tool-specific timeout options where they exist
yt-dlp --socket-timeout 30 "URL"

# git clone has no --timeout option, so wrap it with timeout instead
timeout 60s git clone --depth 1 "URL"
```
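GNU `timeout` exits with status 124 when it kills the command, which makes it possible to tell a timeout apart from an ordinary failure and decide whether a smaller batch is worth retrying. A minimal sketch, reusing the `long_command` placeholder from above:

```bash
timeout 60s long_command
STATUS=$?

if [ "$STATUS" -eq 124 ]; then
    echo "Timed out after 60s: retry with a smaller batch"
elif [ "$STATUS" -ne 0 ]; then
    echo "Command failed with status $STATUS"
fi
```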
## Practical Patterns

### Pattern 1: Batch File Processing

```bash
# Define batch size
BATCH_SIZE=10
OFFSET=${1:-0}  # Accept offset as parameter

# Process batch
find . -name "*.jpg" | tail -n +$((OFFSET + 1)) | head -$BATCH_SIZE | while read file; do
    convert "$file" -resize 800x600 "thumbnails/$(basename "$file")"
    echo "Processed: $file"
done

# Report next offset
NEXT_OFFSET=$((OFFSET + BATCH_SIZE))
echo "Next batch: Run with offset $NEXT_OFFSET"
```
Usage:

```bash
# First batch
./process.sh 0

# Next batch
./process.sh 10

# Continue...
./process.sh 20
```
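To know when the offset-based loop is finished, the caller can compute the number of runs up front. A small sketch, assuming the same `*.jpg` glob and batch size as the script above:

```bash
TOTAL=$(find . -name "*.jpg" | wc -l)
BATCH_SIZE=10
BATCHES=$(( (TOTAL + BATCH_SIZE - 1) / BATCH_SIZE ))   # round up
echo "$TOTAL files -> $BATCHES runs of ./process.sh (offsets 0, 10, 20, ...)"
```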
### Pattern 2: Repository Cloning with Checkpoints

```bash
#!/bin/bash
REPOS_FILE="repos.txt"
CHECKPOINT="downloaded.txt"
BATCH_SIZE=5
COUNT=0

# Read repos, skip already downloaded, stop after BATCH_SIZE attempts per run
while read repo; do
    [ "$COUNT" -ge "$BATCH_SIZE" ] && break

    # Check if already downloaded
    if grep -q "$repo" "$CHECKPOINT" 2>/dev/null; then
        echo "✓ Skipping $repo (already downloaded)"
        continue
    fi

    # Clone with timeout
    echo "Downloading $repo..."
    if timeout 60s git clone --depth 1 "$repo" 2>/dev/null; then
        echo "$repo" >> "$CHECKPOINT"
        echo "✓ Downloaded $repo"
    else
        echo "✗ Failed: $repo"
    fi
    COUNT=$((COUNT + 1))
done < "$REPOS_FILE"

echo "Processed up to $BATCH_SIZE repos this run. Check $CHECKPOINT for status."
```
### Pattern 3: Progress-Tracked Downloads

```bash
#!/bin/bash
URLS_FILE="urls.txt"
PROGRESS_FILE="download_progress.json"

# Initialize progress
[ ! -f "$PROGRESS_FILE" ] && echo '{"completed": 0, "total": 0}' > "$PROGRESS_FILE"

# Get current progress
COMPLETED=$(jq -r '.completed' "$PROGRESS_FILE")
TOTAL=$(wc -l < "$URLS_FILE")

# Download next batch (5 at a time)
BATCH_SIZE=5
tail -n +$((COMPLETED + 1)) "$URLS_FILE" | head -$BATCH_SIZE | while read url; do
    echo "Downloading: $url"
    if yt-dlp --socket-timeout 30 "$url"; then
        COMPLETED=$((COMPLETED + 1))
        echo "{\"completed\": $COMPLETED, \"total\": $TOTAL}" > "$PROGRESS_FILE"
    fi
done

# Report progress
COMPLETED=$(jq -r '.completed' "$PROGRESS_FILE")
echo "Progress: $COMPLETED/$TOTAL"
[ $COMPLETED -lt $TOTAL ] && echo "Run again to continue."
```
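As a design note, the hand-built JSON string works but gets fragile once more fields are added; an alternative sketch lets `jq` update the counter in place, using the same files and variables as above:

```bash
# Increment .completed and refresh .total, then atomically replace the file
jq --argjson total "$TOTAL" '.completed += 1 | .total = $total' \
    "$PROGRESS_FILE" > progress.tmp && mv progress.tmp "$PROGRESS_FILE"
```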
### Pattern 4: Parallel Execution with Rate Limiting

```bash
#!/bin/bash
MAX_PARALLEL=3
COUNT=0

for item in $(seq 1 15); do
    # Start background job
    (
        process_item $item
        echo "Completed: $item"
    ) &
    COUNT=$((COUNT + 1))

    # Wait when reaching max parallel
    if [ $COUNT -ge $MAX_PARALLEL ]; then
        wait -n  # Wait for any one job to finish (requires bash 4.3+)
        COUNT=$((COUNT - 1))
    fi
done

# Wait for the remaining background jobs before exiting
wait
```