best-practices-learner

from adaptationio/skrillz


Updated Jan 16, 2026
npx skills add https://github.com/adaptationio/skrillz --skill best-practices-learner

SKILL.md

Best Practices Learner

Overview

best-practices-learner systematically extracts learnings from skill development experience and feeds them back into the ecosystem. It transforms individual experiences into documented best practices that improve all future work.

Purpose: Capture and propagate learnings for continuous ecosystem improvement

The 5 Learning Operations:

  1. Extract Patterns - Identify successful patterns from completed skills
  2. Document Learnings - Capture what worked and what didn't
  3. Update Guidelines - Feed learnings into common-patterns.md and templates
  4. Track Evolution - Document how standards and practices evolved
  5. Share Knowledge - Propagate learnings across ecosystem

Key Principle: Every skill built teaches us something; capture those lessons

Continuous Improvement Loop:

Build Skills → Review/Analyze → Extract Learnings →
Update Guidelines → Apply to Future Skills → Build Better Skills → Repeat
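The loop above can be sketched as an ordered cycle of stages. This is an illustrative Python sketch, not part of the skill itself; the stage names are assumptions chosen to mirror the diagram.

```python
# Hypothetical sketch of the continuous improvement loop as ordered stages.
# The list order mirrors the diagram above; the final stage wraps back to
# "build_skills", since the loop repeats.
IMPROVEMENT_LOOP = [
    "build_skills",
    "review_analyze",
    "extract_learnings",
    "update_guidelines",
    "apply_to_future_skills",
]


def next_stage(stage: str) -> str:
    """Return the stage that follows `stage`, wrapping back to the start."""
    i = IMPROVEMENT_LOOP.index(stage)
    return IMPROVEMENT_LOOP[(i + 1) % len(IMPROVEMENT_LOOP)]
```

Wrapping the index makes the cyclic nature explicit: applying learnings leads straight back into building skills.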

When to Use

Use best-practices-learner when:

  1. After Completing Skills - Extract learnings from successful builds
  2. After Reviews - Capture insights from review-multi findings
  3. Pattern Discovery - Document patterns identified through analysis
  4. Process Improvement - Update development process based on experience
  5. Standards Evolution - Document how best practices emerged and changed
  6. Knowledge Capture - Before insights are forgotten, document them
  7. Ecosystem Enhancement - Continuously improve toolkit quality

Operations

Operation 1: Extract Patterns

Purpose: Identify successful patterns from completed skills that should be repeated

When to Use:

  • After completing skill successfully
  • When reviewing multiple skills
  • During retrospectives
  • When patterns emerge

Process:

  1. Review Completed Work

    • What worked well in this skill?
    • What made it easier/faster?
    • What would you repeat?
    • What was particularly effective?
  2. Identify Patterns

    • Structural patterns (file organization, naming)
    • Content patterns (section types, documentation styles)
    • Process patterns (workflow steps, techniques)
    • Quality patterns (validation approaches, examples)
  3. Validate Pattern

    • Is this truly a pattern (recurring, repeatable)?
    • Does it appear in 2+ instances?
    • Is it generalizable (not one-off)?
    • Does it have clear benefit?
  4. Document Pattern

    • Pattern name
    • Description (what it is)
    • Context (when to use)
    • Example (concrete instance)
    • Benefit (why it works)
  5. Categorize Pattern

    • Architecture pattern
    • Documentation pattern
    • Process pattern
    • Quality pattern
    • Integration pattern

Example:

Pattern Extracted: Quick Reference Section

Discovered During: Skills 4-5 (prompt-builder, skill-researcher)

Description: Add Quick Reference section as final section in SKILL.md
with high-density summary tables, commands, cheat sheets

Context: Use in all skills for rapid lookup and memory refresh

Benefit:
- Users can refresh memory without re-reading entire skill
- High-density information (tables, lists)
- Improves user experience
- Easy to find (always last section)

Evidence:
- Skills 4-8 have Quick Reference → received positive implicit feedback
- Skills 1-3 lacked it → identified as improvement in review
- Adding to skills 1-3 brought consistency to 100%

Recommendation: Make Quick Reference mandatory for all future skills

Category: Documentation Pattern

Operation 2: Document Learnings

Purpose: Capture what worked, what didn't, and insights for future improvement

When to Use:

  • End of each skill build
  • After reviews or retrospectives
  • When insights emerge
  • Before forgetting context

Process:

  1. Capture What Worked

    • Techniques that were effective
    • Approaches that saved time
    • Decisions that paid off
    • Tools/skills that helped
  2. Document What Didn't Work

    • Mistakes made
    • Approaches that failed
    • Time wasted on wrong paths
    • What to avoid next time
  3. Extract Insights

    • Why did X work well?
    • Why did Y fail?
    • What was the key factor?
    • What would you do differently?
  4. Quantify Impact (when possible)

    • Time saved/lost
    • Quality improvement
    • Efficiency gain
    • Measurable benefits
  5. Make Actionable

    • What should we do next time?
    • What should we stop doing?
    • What should we start doing?
    • Specific recommendations

Example:

Learning Documented: High-Quality Prompts Save Implementation Time

Context: Skills 4-9 development

What Worked:
- Investing 1.5-2h in prompt-builder to create detailed prompts
- Prompts with specific validation criteria, examples, constraints
- Targeting ≥4.5/5.0 prompt quality

Impact:
- Implementation 30-40% faster when following high-quality prompts
- Less rework (prompts were accurate first time)
- Clearer direction (no ambiguity)

What Didn't

...
