Master systematic debugging with Claude Code for faster problem resolution and learning
Transform debugging from frustration to systematic investigation: Use Claude's analytical capabilities to debug more efficiently, understand root causes, and prevent similar issues in the future.
- **Evidence-Based**: Gather comprehensive data before theorizing
- **Systematic Approach**: Use structured debugging methodologies
- **Root Cause Focus**: Fix underlying causes, not just symptoms
- **Learning Mindset**: Extract knowledge from every debugging session
- **Documentation**: Capture insights for future reference
```bash
# First 5 minutes: Quick diagnostic assessment
> "Help me quickly diagnose this issue:
**Problem Summary:**
- What should happen: [Expected behavior]
- What actually happens: [Actual behavior]
- When it started: [Timeline]
- Frequency: [Always/Sometimes/Specific conditions]
**Quick Checks:**
1. **Recent Changes**: What changed in the last 24-48 hours?
2. **Environment**: Development/staging/production differences?
3. **Error Messages**: Any error messages or stack traces?
4. **Reproducibility**: Can you reproduce it consistently?
5. **Scope**: Is it isolated or affecting multiple areas?
**Immediate Actions Needed:**
- Is this blocking other work? (Priority level)
- Do we need to rollback any changes?
- Should we notify stakeholders?
- What information do we need to collect?
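The "Recent Changes" check can be scripted so the evidence is in front of you before theorizing. A minimal sketch, assuming a git repository; the function name and default window are illustrative:

```shell
#!/usr/bin/env bash
# List commits and the files they touched from the suspect window
# (default: last 48 hours), so recent changes can be correlated with the issue.
recent_changes() {
  local window="${1:-48 hours ago}"
  git log --since="$window" --oneline --name-only
}
```

For example, `recent_changes "24 hours ago"` narrows the window when the issue started recently.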
Provide initial diagnostic direction and next steps."
```

```bash
# Critical issues requiring immediate attention
> "This is a critical production issue. Help me with emergency response:
**Critical Issue Protocol:**
1. **Assess Impact**: How many users/systems affected?
2. **Immediate Mitigation**: Can we quickly reduce impact?
3. **Communication**: Who needs to be notified immediately?
4. **Evidence Preservation**: What data should we collect now?
5. **Rollback Decision**: Should we rollback recent deployments?
**Current Status:**
- Issue description: [Detailed problem description]
- Systems affected: [List affected systems]
- Error evidence: [Stack traces, logs, metrics]
- Recent changes: [Deployments, configuration changes]
**Required Actions:**
- Immediate stabilization steps
- Evidence collection procedures
- Communication plan
- Investigation approach after stabilization
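Evidence preservation should happen before any rollback destroys state. A hedged sketch; the directory name, log path, and helper name are placeholders for your environment:

```shell
# Snapshot volatile evidence into a timestamped directory before rolling back.
snapshot_evidence() {
  local dir="incident-$(date +%Y%m%d-%H%M%S)"
  mkdir -p "$dir"
  # Record what is currently deployed (ignore errors outside a git repo)
  git rev-parse HEAD > "$dir/deployed-sha.txt" 2>/dev/null || true
  # Copy whatever logs exist; adjust paths to your stack
  cp logs/error.log "$dir/" 2>/dev/null || true
  date > "$dir/captured-at.txt"
  echo "$dir"
}
```

The function prints the snapshot directory so it can be referenced in the incident channel.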
Focus on stopping the bleeding first, then systematic investigation."
```

```bash
# Debug context-related problems
> "Claude seems confused about my project. Debug context issues:
**Context Diagnostic Questions:**
1. **Context Freshness**: When was CLAUDE.md last updated?
2. **Context Accuracy**: Does CLAUDE.md reflect current project state?
3. **Context Completeness**: Are there missing patterns or requirements?
4. **Context Conflicts**: Any contradictory information?
**Debug Steps:**
1. **Review CLAUDE.md**: Check for outdated or incorrect information
2. **Compare with Codebase**: Does context match actual code patterns?
3. **Test Context Loading**: Run `/init` and verify Claude understands the project
4. **Context Validation**: Ask Claude to summarize project understanding
**Common Context Issues:**
- Outdated technology stack information
- Missing or incorrect coding standards
- Conflicting architectural guidance
- Generic rather than project-specific context
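One of these causes, a stale CLAUDE.md, can be checked mechanically. A small sketch; the function name is illustrative, and both GNU and BSD `stat` flag styles are attempted:

```shell
# Print how many whole days have passed since a context file was last modified.
context_age_days() {
  local f="${1:-CLAUDE.md}"
  [ -f "$f" ] || { echo "missing" >&2; return 1; }
  local mtime
  # GNU stat uses -c %Y; BSD/macOS stat uses -f %m
  mtime=$(stat -c %Y "$f" 2>/dev/null || stat -f %m "$f")
  echo $(( ( $(date +%s) - mtime ) / 86400 ))
}
```

If the age is larger than your typical sprint length, refreshing the context is a cheap first fix.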
**Quick Fixes:**
- Update CLAUDE.md with current project state
- Add specific code examples and patterns
- Remove contradictory or outdated information
- Test context with sample prompts"
```

```bash
# Debug Claude Code configuration issues
> "Claude Code isn't working properly. Debug configuration:
**Configuration Diagnostics:**
```bash
# Check Claude Code installation
claude --version
claude --help
# Verify authentication
claude auth --status
# Check permissions
claude --show-permissions
# Validate project setup
ls -la .claude/
cat .claude/settings.json
```
**Common Configuration Issues:**
- **Authentication Problems:**
  - Expired or invalid API keys
  - Network connectivity issues
  - Proxy or firewall blocking
- **Permission Issues:**
  - Restrictive file access permissions
  - Missing commands in allowlist
  - Security policies blocking operations
- **Project Setup Issues:**
  - Missing .claude directory
  - Corrupted settings files
  - Incorrect file permissions
**Resolution Steps:**
- Re-authenticate with fresh API key
- Update permission settings appropriately
- Recreate project configuration if needed
- Test with minimal configuration first"
```
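"Test with minimal configuration first" can be made repeatable with a small reset helper. This is a sketch under assumptions: the `allowedCommands` key follows the settings example later in this guide, and `python3` is used only as a convenient JSON validator; verify the schema against current Claude Code documentation.

```shell
# Back up the current settings and write a minimal, syntactically valid config.
reset_minimal_config() {
  mkdir -p .claude
  if [ -f .claude/settings.json ]; then
    cp .claude/settings.json .claude/settings.json.bak
  fi
  printf '{\n  "allowedCommands": []\n}\n' > .claude/settings.json
  # Confirm the file parses as JSON before retrying Claude Code
  python3 -m json.tool .claude/settings.json > /dev/null && echo "minimal config written"
}
```

If Claude Code works with the minimal config, restore settings from the `.bak` file one section at a time to find the offending entry.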
### Performance and Timeout Issues
#### Slow Response Debugging
```bash
# Debug performance and timeout issues
> "Claude Code is responding very slowly. Debug performance issues:
**Performance Diagnostic Process:**
**1. Response Time Analysis:**
- Measure actual response times for different request types
- Compare with baseline performance metrics
- Identify if slowness is consistent or intermittent
- Check if specific operations are slower than others
**2. Context Size Investigation:**
```bash
# Check context size
wc -w CLAUDE.md # Word count
du -h CLAUDE.md # File size
# Analyze session context
> /compact # Try compacting session
> "How large is our current context?"
```
**3. Network and Connectivity:**
```bash
# Test network connectivity
ping api.anthropic.com
curl -I https://api.anthropic.com/v1/health
# Check for proxy or VPN issues
echo $HTTP_PROXY
echo $HTTPS_PROXY
```
**4. Model Selection Review:**
- Are you using appropriate model for task complexity?
- Opus 4 for complex tasks, Sonnet 4 for routine work
- Consider switching models for performance testing
**Performance Optimization Actions:**
- Reduce CLAUDE.md size by removing outdated content
- Use /compact more frequently in long sessions
- Switch to faster model for appropriate tasks
- Check network connectivity and proxy settings
- Break large requests into smaller parts"
```
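To make "measure actual response times" concrete, a tiny stopwatch can wrap any command so slow and fast cases can be compared with numbers rather than impressions. The helper name is illustrative, and the commented `claude` invocation is only an example:

```shell
# Run a command and print its wall-clock duration in whole seconds.
elapsed() {
  local t0=$SECONDS
  "$@" > /dev/null 2>&1
  echo $(( SECONDS - t0 ))
}
# Example (hypothetical): compare the same short prompt before and after /compact
# elapsed claude -p "Reply with OK"
```

Second-level granularity is enough to distinguish "network problem" from "context too large" in practice.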
### Code Quality Issues
#### Generated Code Problems
```bash
# Debug poor code quality from Claude
> "Claude generated code that works but has quality issues. Debug and improve:
**Code Quality Assessment:**
**1. Quality Issues Analysis:**
- What specific quality problems exist?
- Security vulnerabilities or concerns?
- Performance issues or inefficiencies?
- Maintainability and readability problems?
- Missing error handling or edge cases?
**2. Context Review:**
- Does CLAUDE.md include quality standards?
- Are there code examples showing quality patterns?
- Is the context specific enough about requirements?
- Are quality gates and validation mentioned?
**3. Prompt Analysis:**
- Was the prompt specific about quality requirements?
- Were examples provided of desired quality?
- Was the scope and complexity appropriate?
- Were constraints and standards mentioned?
**Improvement Strategy:**
```typescript
// Instead of vague requests
❌ "Create a user service"
// Use specific quality requirements
✅ "Create a user service with these quality requirements:
- TypeScript strict mode with explicit types
- Comprehensive error handling with structured errors
- Input validation using Joi schemas
- Unit tests with >90% coverage
- JSDoc documentation for all public methods
- Performance: all operations under 100ms
- Security: bcrypt for passwords, rate limiting for APIs
- Follow patterns from existing UserController.ts"
```
**Quality Improvement Actions:**
- Add specific quality requirements to all prompts
- Update CLAUDE.md with detailed quality standards
- Include code examples showing quality patterns
- Use iterative improvement approach
- Implement code review processes for AI-generated code"
```
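The review process for AI-generated code can be mechanized as a small gate runner that reports each check. The checks shown in the usage comment are examples; substitute your project's actual commands (e.g. `npx tsc --noEmit`, `npx eslint .`):

```shell
# Run a list of shell checks, report PASS/FAIL for each, exit nonzero if any fail.
quality_gate() {
  local failed=0 check
  for check in "$@"; do
    if eval "$check" > /dev/null 2>&1; then
      echo "PASS: $check"
    else
      echo "FAIL: $check"
      failed=1
    fi
  done
  return $failed
}
# Example: quality_gate "npx tsc --noEmit" "npx eslint ." "npm test"
```

Wiring this into a pre-commit hook or CI job makes the quality bar automatic rather than a matter of reviewer attention.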
### Integration and Workflow Issues
#### Git and Version Control Problems
```bash
# Debug git integration and workflow issues
> "Having problems with git integration in Claude Code. Debug workflow issues:
**Git Integration Diagnostics:**
**1. Permission Check:**
```json
// Check .claude/settings.json for git permissions
{
"allowedCommands": [
"Bash(git status)",
"Bash(git diff)",
"Bash(git add .)",
"Bash(git commit -m '*')",
"Bash(git push)",
"Bash(git pull)"
]
}
```
**2. Git Configuration:**
```bash
# Check git setup
git config --list
git status
git remote -v
# Test git operations
git log --oneline -5
git diff --stat
```
**3. Working Directory Issues:**
- Are you in the correct git repository?
- Are there uncommitted changes causing conflicts?
- Is the git repository in a clean state?
- Are there any git hooks causing issues?
**Common Git Issues:**
- **Permission Denied**: Git commands not in allowed list
- **Authentication**: Git credentials expired or incorrect
- **Repository State**: Merge conflicts or detached HEAD
- **File Permissions**: Files not accessible to git operations
**Resolution Steps:**
- Add necessary git commands to allowed permissions
- Update git credentials and authentication
- Resolve any git conflicts or issues manually
- Use Claude for git command generation when direct execution fails"
```
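The "Working Directory Issues" questions can be answered with one command before any git operation is delegated. A minimal sketch; the function name and its output strings are illustrative:

```shell
# Report repository health: not-a-repo, dirty (uncommitted changes), or clean.
repo_state() {
  git rev-parse --is-inside-work-tree > /dev/null 2>&1 || { echo "not-a-repo"; return; }
  if [ -n "$(git status --porcelain)" ]; then
    echo "dirty"
  else
    echo "clean"
  fi
}
```

Running this first avoids a whole class of confusing failures where git commands are correct but the working tree is not in the expected state.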
## 🔧 Systematic Debugging Process
### Structured Debugging Methodology
#### The CLAUDE Debugging Method
```bash
# Use systematic CLAUDE debugging methodology
> "Apply the CLAUDE debugging method to this issue:
**C - Collect Evidence:**
- Error messages and stack traces
- Log files and system output
- Reproduction steps and conditions
- System state and configuration
- Recent changes and timeline
**L - List Hypotheses:**
- Generate 3-5 potential root causes
- Rank by likelihood and testability
- Consider both obvious and subtle possibilities
- Include environmental and configuration factors
**A - Analyze Each Hypothesis:**
- Design specific tests to validate/invalidate
- Consider evidence supporting or contradicting
- Assess probability and impact of each cause
- Plan testing approach for each hypothesis
**U - Understand Root Cause:**
- Test hypotheses systematically
- Eliminate impossible causes
- Focus on most likely remaining causes
- Understand underlying mechanism
**D - Determine Solution:**
- Address root cause, not just symptoms
- Consider multiple solution approaches
- Plan implementation and testing
- Implement monitoring to prevent recurrence
- Document findings and solutions
**E - Evaluate and Learn:**
- Verify fix resolves original problem
- Assess for side effects or new issues
- Document lessons learned
- Improve processes to prevent similar issues
- Share knowledge with team
Apply this method systematically to understand and resolve the issue."
```

```bash
# Standardized evidence collection
> "Help me collect comprehensive debugging evidence:
**Error Evidence Collection:**
```bash
# Capture error details
echo "=== Error Information ===" > debug-report.txt
date >> debug-report.txt
echo "Issue: [DESCRIPTION]" >> debug-report.txt
echo >> debug-report.txt
# System information
echo "=== System Information ===" >> debug-report.txt
node --version >> debug-report.txt
npm --version >> debug-report.txt
git --version >> debug-report.txt
echo "OS: $(uname -a)" >> debug-report.txt
echo >> debug-report.txt
# Project information
echo "=== Project Information ===" >> debug-report.txt
pwd >> debug-report.txt
git status >> debug-report.txt
git log --oneline -5 >> debug-report.txt
echo >> debug-report.txt
# Error logs
echo "=== Error Logs ===" >> debug-report.txt
tail -50 logs/error.log >> debug-report.txt 2>/dev/null || echo "No error logs found" >> debug-report.txt
echo >> debug-report.txt
# Configuration
echo "=== Configuration ===" >> debug-report.txt
cat .claude/settings.json >> debug-report.txt 2>/dev/null || echo "No Claude settings found" >> debug-report.txt
```
**Environment Evidence:**
- Development vs production differences
- Environment variables and configuration
- Network connectivity and proxy settings
- Installed packages and versions
- System resources and performance metrics
**Code Evidence:**
- Recent code changes and commits
- Code patterns and structure
- Dependencies and integrations
- Test results and coverage
- Code quality metrics
**User Evidence:**
- Steps to reproduce the issue
- Expected vs actual behavior
- Frequency and conditions
- Impact on workflow and productivity
- Workarounds currently in use"
```
### Advanced Debugging Techniques
#### Interactive Debugging Sessions
```bash
# Conduct interactive debugging with Claude
> "Let's debug this issue interactively. Guide me through systematic investigation:
**Interactive Debugging Protocol:**
**Phase 1: Issue Isolation**
- Let's isolate the issue to the smallest reproducible case
- What's the minimal setup that demonstrates the problem?
- Can we eliminate external factors and dependencies?
- Is this issue related to specific inputs or conditions?
**Phase 2: Hypothesis Testing**
- Based on the evidence, what are your top 3 theories?
- Let's test the most likely hypothesis first
- What experiment can we run to validate/eliminate this theory?
- What would we expect to see if this hypothesis is correct?
**Phase 3: Progressive Investigation**
- If first hypothesis doesn't pan out, let's try the second
- Are there any new clues from our testing so far?
- Should we gather additional evidence or change our approach?
- Are we missing any important information or context?
**Phase 4: Solution Validation**
- Once we identify the root cause, let's design a fix
- How can we test that our solution works?
- What side effects or edge cases should we consider?
- How can we prevent this issue from recurring?
**Interactive Rules:**
- Ask clarifying questions when you need more information
- Suggest specific commands or tests to run
- Explain your reasoning for each hypothesis
- Guide me through logical debugging steps
- Help me learn debugging techniques for future issues
Let's start with Phase 1: Can you help me isolate this issue?"
```

```bash
# Debug with team collaboration
> "This is a complex issue that needs team collaboration. Help coordinate debugging effort:
**Team Debugging Coordination:**
**1. Role Assignment:**
- **Debug Lead**: Coordinates investigation and maintains timeline
- **Domain Expert**: Provides specialized knowledge of affected system
- **Evidence Collector**: Gathers logs, metrics, and reproduction steps
- **Test Engineer**: Designs and runs validation tests
- **Communication Lead**: Updates stakeholders and documents findings
**2. Information Sharing:**
- Central document for collecting all evidence and findings
- Real-time communication channel for coordination
- Shared screen sessions for collaborative investigation
- Regular status updates and progress reports
**3. Parallel Investigation:**
- Multiple team members can investigate different hypotheses
- Divide system components among team members
- Share findings and insights continuously
- Avoid duplicate work through coordination
**4. Knowledge Capture:**
- Document all hypotheses tested and results
- Capture debugging techniques that worked well
- Record lessons learned and improvement opportunities
- Create post-mortem report for future reference
**Collaboration Tools:**
- Shared debugging document template
- Screen sharing and remote access tools
- Chat or communication channels
- Issue tracking and progress management
- Knowledge base for similar issues
Help me set up effective team debugging coordination for this complex issue."
```

```bash
# Conduct debugging retrospective for learning
> "Help me conduct a debugging retrospective to extract maximum learning:
**Debugging Retrospective Framework:**
**1. Issue Analysis:**
- What was the root cause of the issue?
- How long did it take to identify and resolve?
- What was the impact on users/system/team?
- Could this issue have been prevented?
**2. Process Analysis:**
- What debugging approaches worked well?
- What techniques were ineffective or misleading?
- Where did we waste time or go down wrong paths?
- What information would have helped earlier?
**3. Tool and Technique Evaluation:**
- Which debugging tools were most helpful?
- What Claude Code features aided the investigation?
- Were there better tools or approaches available?
- What new techniques did we learn?
**4. Knowledge Gaps Identified:**
- What domain knowledge was missing?
- Which debugging skills need improvement?
- What system understanding gaps exist?
- Where do we need better documentation?
**5. Prevention Opportunities:**
- What monitoring could have detected this earlier?
- What tests could have prevented this issue?
- What documentation updates are needed?
- What process improvements should we make?
**6. Future Improvements:**
- Better debugging tools or setup needed?
- Team skills development priorities?
- System architecture improvements?
- Process or workflow enhancements?
**Action Items:**
- Specific improvements to implement
- Knowledge sharing sessions to conduct
- Documentation to update
- Monitoring or alerting to add
- Skills training to pursue
Help me analyze this debugging session systematically to maximize learning and improvement."
```

```bash
# Develop systematic debugging expertise with Claude
> "Help me build systematic debugging expertise using Claude Code:
**Debugging Skill Development Plan:**
**1. Fundamental Debugging Skills:**
- Evidence collection and analysis techniques
- Hypothesis generation and testing methods
- Root cause analysis methodologies
- Problem isolation and reproduction skills
- Tool usage and debugging environment setup
**2. Claude-Assisted Debugging Patterns:**
- Effective prompts for debugging assistance
- Using Claude for hypothesis generation
- Collaborative debugging with AI assistance
- Documentation and knowledge capture with Claude
- Learning from Claude's analytical approaches
**3. Domain-Specific Debugging:**
- Frontend debugging (React, JavaScript, CSS)
- Backend debugging (APIs, databases, services)
- Infrastructure debugging (deployment, networking)
- Performance debugging (optimization, profiling)
- Security debugging (vulnerabilities, compliance)
**4. Advanced Debugging Techniques:**
- Distributed system debugging across services
- Production debugging with limited access
- Performance debugging under load
- Intermittent issue investigation
- Legacy system debugging without documentation
**Skill Development Activities:**
- Practice debugging exercises with increasing complexity
- Document debugging patterns and successful techniques
- Review and learn from team debugging sessions
- Study debugging case studies and post-mortems
- Experiment with new debugging tools and approaches
**Success Metrics:**
- Reduced time to identify root causes
- Improved accuracy in hypothesis generation
- Better documentation and knowledge sharing
- Increased confidence in debugging complex issues
- Enhanced ability to prevent similar issues
Help me create a personalized debugging skill development plan."
```

```bash
# Create comprehensive debugging knowledge base
> "Help me build a team debugging knowledge base with Claude:
**Knowledge Base Structure:**
**1. Common Issues Database:**
- Categorized list of frequent issues and solutions
- Search functionality by symptoms or error messages
- Solution templates and step-by-step guides
- Links to related documentation and resources
**2. Debugging Techniques Library:**
- Systematic debugging methodologies
- Tool-specific debugging guides
- Domain-specific debugging approaches
- Troubleshooting decision trees and flowcharts
**3. Case Studies and Post-Mortems:**
- Detailed analysis of complex debugging sessions
- Lessons learned from critical issues
- Success stories and effective techniques
- Failure analysis and improvement opportunities
**4. Tool and Resource Directory:**
- Debugging tools and their use cases
- Setup guides and configuration templates
- Integration with development workflows
- Performance and effectiveness evaluations
**5. Training Materials:**
- Debugging skill development curricula
- Hands-on exercises and practice scenarios
- Video tutorials and walkthroughs
- Assessment and skill validation methods
**Knowledge Base Features:**
- Easy search and navigation
- Regular updates and maintenance
- Community contributions and feedback
- Integration with development tools
- Metrics and usage analytics
**Content Development Process:**
- Extract knowledge from debugging sessions
- Document patterns and recurring solutions
- Create templates and standardized approaches
- Regular review and update cycles
- Team contribution and collaboration
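A knowledge base like this can start as plain files with a thin search layer. A minimal sketch; the `kb/` directory name and function name are assumptions for illustration:

```shell
# Case-insensitive search across a plain-file knowledge base; prints matching files.
kb_search() {
  local term="$1" dir="${2:-kb}"
  grep -ril -- "$term" "$dir" 2>/dev/null
}
```

Even this crude version delivers the "search by symptoms or error messages" feature, and it can later be replaced by a real search tool without changing the content format.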
Help me design and populate an effective debugging knowledge base that grows with our team's experience."
```

**Pro Tip**: Great debugging with Claude combines systematic methodology with AI analytical power. Always gather evidence first, use Claude to generate and test hypotheses, and document learnings for future issues. The goal is not just to fix problems, but to become a better debugger and prevent similar issues.