# Session Reflection Guide
Learn from your AI coding sessions by capturing what worked, what didn't, and patterns to reuse.
## Purpose
Session reflection is meta-analysis: the focus is not on *what* was built, but on *how* the work was done:
- **Capture effective patterns** - what approaches worked well?
- **Document lessons learned** - what would you do differently?
- **Build personal knowledge** - accumulate tips and tricks over time
## When to Reflect
- After complex multi-step tasks
- When discovering new effective patterns
- After significant debugging sessions
- When a session went particularly well (or badly)
- Before ending a long work session
## Using the Reflect Prompt

This repo includes a `/reflect` prompt for generating structured reflections:

    /reflect api-migration

Or let it auto-derive a name:

    /reflect

The prompt creates a markdown file at `.copilot/reflections/<date>-<name>.md`.
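The naming scheme is simple enough to reproduce in a script. The sketch below is illustrative only; the `reflection_path` helper and its slug rules are assumptions, not the `/reflect` prompt's actual implementation:

```python
from datetime import date
from pathlib import Path


def reflection_path(name: str, base: str = ".copilot/reflections") -> Path:
    """Build a <date>-<name>.md path following the naming scheme above."""
    # Lowercase the session name and replace spaces with hyphens for the slug.
    slug = name.strip().lower().replace(" ", "-")
    return Path(base) / f"{date.today():%Y-%m-%d}-{slug}.md"


print(reflection_path("API Migration"))
```

For example, a session named "API Migration" on 2026-02-05 maps to `.copilot/reflections/2026-02-05-api-migration.md`.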
## Reflection Structure
### What Went Well
Identify effective techniques:
- Problem-solving approaches that worked
- Tool usage that was particularly helpful
- Communication patterns that produced good results
- Planning strategies that paid off
### What Went Wrong
Document issues for future avoidance:
- Approaches that didn't work or caused delays
- Tool misuse or inefficiency
- Unnecessary iteration or backtracking
- Context management problems
### Lessons Learned
Extract actionable insights:
- Each lesson should inform future behavior
- Be specific about what to do differently
- Include the "why" behind the lesson
### Tips & Tricks
Capture useful patterns discovered:
- Specific prompting techniques
- Workflow shortcuts
- Configuration discoveries
- Tool combinations that work well
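A small scaffold generator can pre-create these sections so none is forgotten. The `reflection_scaffold` helper below is a hypothetical sketch, not part of the `/reflect` prompt:

```python
SECTIONS = ("What Went Well", "What Went Wrong", "Lessons Learned", "Tips & Tricks")


def reflection_scaffold(goal: str) -> str:
    """Return a markdown skeleton with one heading per reflection section."""
    lines = ["# Session Reflection", "", f"**Session Goal**: {goal}", ""]
    for section in SECTIONS:
        # Each section gets a heading and an empty bullet to fill in.
        lines += [f"## {section}", "", "- ", ""]
    return "\n".join(lines)


print(reflection_scaffold("Migrate REST API endpoints"))
```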
## Example Reflection
# Session Reflection: API Migration
**Date**: 2026-02-05
**Session Goal**: Migrate REST API endpoints to new versioned structure
---
## What Went Well
- Breaking migration into phases (routes → handlers → tests) kept scope manageable
- Using Agent mode for bulk file moves with pattern matching
- Keeping a checklist in _plans/ prevented missing endpoints
## What Went Wrong
- Started without checking existing test coverage - had to backtrack
- Tried to migrate everything in one conversation - context got confused
- Forgot to update API documentation until the end
## Lessons Learned
1. **Check test coverage first**: Before any refactor, know what's tested. Saves debugging time.
2. **One conversation per phase**: Don't mix migration phases in same context.
## Action Items
- [ ] Add "check test coverage" to personal migration checklist
- [ ] Create prompt template for API migrations
## Tips & Tricks
- **Tip**: Use `@workspace find all routes` to get comprehensive endpoint list
- **Tip**: Agent mode + explicit file patterns = reliable bulk operations
---
*Generated by `/reflect` prompt*
## Building a Knowledge Base
Over time, reflections accumulate into a personal knowledge base:
    .copilot/reflections/
      2026-02-01-auth-debugging.md
      2026-02-03-test-refactoring.md
      2026-02-05-api-migration.md
Periodically review reflections to:
- Identify recurring patterns (good and bad)
- Extract tips into instruction files
- Create prompts for common workflows
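Part of that review can be automated. As a sketch, the hypothetical `collect_tips` helper below gathers every tip line across reflections; the `- **Tip**: ...` line format it matches is an assumption based on the example reflection above:

```python
import re
from pathlib import Path


def collect_tips(base: str = ".copilot/reflections") -> list[str]:
    """Gather every '- **Tip**: ...' line from all reflection files."""
    tips = []
    for path in sorted(Path(base).glob("*.md")):
        for line in path.read_text().splitlines():
            # Match bullet lines of the form '- **Tip**: <text>'.
            if m := re.match(r"- \*\*Tip\*\*:\s*(.+)", line.strip()):
                tips.append(m.group(1))
    return tips
```

Running this over the directory above would surface candidates worth promoting into instruction files.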
## Generalization Opportunities
If a reflection reveals a repeatable pattern, consider creating:
| Pattern Type | Create |
|---|---|
| Repeated workflow | Custom prompt |
| Specialized task | Agent configuration |
| Multi-step process | Skill with reference files |
| Common instructions | Add to instruction files |
Only generalize patterns that genuinely recur across multiple contexts. Premature abstraction adds complexity without benefit.
## Summary
| Step | Action |
|---|---|
| Trigger | End of significant session |
| Generate | Use the `/reflect` prompt |
| Review | Read and refine the output |
| Store | Keep in `.copilot/reflections/` |
| Apply | Use insights in future sessions |
| Generalize | Create prompts/instructions from patterns |