This document collects strategies for getting the most out of Claude Code, helping developers maximize productivity and code quality when building luxury fashion tech platforms or any other complex software system.
- Core Principles
- Project Initialization
- Architecture Design
- Iterative Development
- Complex Feature Implementation
- Debugging and Optimization
- Code Review and Refactoring
- Testing Strategies
- Documentation
- Advanced Techniques
Always provide clear, specific context. Claude Code performs best with:
- Explicit requirements: State what you want, not just what you don't want
- Technical constraints: Mention frameworks, libraries, and patterns to use
- Quality standards: Specify code style, testing requirements, and documentation needs
Start simple, then build complexity:
- Begin with core functionality before edge cases
- Add features in logical layers
- Test each layer before moving to the next
Establish clear feedback mechanisms:
- Review generated code before proceeding
- Provide specific corrections when needed
- Acknowledge good solutions to reinforce patterns
Create a [PROJECT_TYPE] for [DOMAIN/INDUSTRY] with the following specifications:
**Technical Stack:**
- Language: [Python 3.11+ / TypeScript / etc.]
- Framework: [FastAPI / Next.js / etc.]
- Database: [PostgreSQL / MongoDB / etc.]
- Additional: [Redis, Celery, etc.]
**Architecture:**
- Pattern: [Clean Architecture / Hexagonal / MVC]
- API Style: [REST / GraphQL / gRPC]
- Authentication: [JWT / OAuth2 / API Keys]
**Code Quality Standards:**
- Type hints/annotations: Required
- Test coverage: Minimum 80%
- Documentation: Docstrings for all public functions
- Linting: [ruff/eslint] with strict settings
**Project Structure:**
[Describe expected directory layout]
**Initial Features:**
1. [Feature 1 - brief description]
2. [Feature 2 - brief description]
3. [Feature 3 - brief description]
Start by creating the project structure and core configuration files.
Then implement features incrementally with tests for each.
Create a luxury fashion e-commerce platform backend with:
**Technical Stack:**
- Python 3.11+ with FastAPI
- PostgreSQL with SQLAlchemy 2.0 (async)
- Redis for caching and sessions
- Celery for background tasks
**Architecture:**
- Clean Architecture with service layer
- RESTful API with OpenAPI documentation
- JWT authentication with refresh tokens
**Code Quality:**
- Full type hints with mypy strict mode
- 90% test coverage with pytest
- Structured logging with structlog
- Comprehensive error handling
**Initial Features:**
1. User authentication with MFA support
2. Product catalog with variants
3. Order processing with payment integration
Implement using domain-driven design principles.
Include proper validation, error handling, and logging throughout.
I need to design a [COMPONENT/SYSTEM] with the following characteristics:
**Requirements:**
- [Functional requirement 1]
- [Functional requirement 2]
- [Non-functional requirement 1]
**Constraints:**
- [Technical constraint]
- [Business constraint]
- [Performance requirement]
**Integration Points:**
- [System A] via [protocol]
- [System B] via [protocol]
Please propose 2-3 architectural approaches with:
1. High-level design
2. Pros and cons
3. Trade-offs
4. Recommended approach with rationale
Design the data model for [DOMAIN] with these entities:
**Entities:**
1. [Entity A]
- Key attributes: [list]
- Relationships: [describe]
2. [Entity B]
- Key attributes: [list]
- Relationships: [describe]
**Considerations:**
- Expected data volume: [estimate]
- Query patterns: [describe common queries]
- Consistency requirements: [describe]
Generate SQLAlchemy models with:
- Proper relationships and foreign keys
- Indexes for common queries
- Soft delete where appropriate
- Audit fields (created_at, updated_at)
Implement [FEATURE_NAME] following this flow:
**Step 1: Interface Design**
Create the public API/interface for this feature.
Define input/output schemas with validation.
**Step 2: Core Logic**
Implement the main business logic.
Keep it framework-agnostic where possible.
**Step 3: Integration**
Connect to database/external services.
Add proper error handling.
**Step 4: Testing**
Write unit tests for core logic.
Write integration tests for full flow.
**Step 5: Documentation**
Add docstrings and update API docs.
Include usage examples.
Start with Step 1, then wait for my feedback before proceeding.
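For Step 1, the input/output schemas with validation might be sketched with Pydantic (the "create order" feature and its field constraints are hypothetical examples, not a prescribed API):

```python
from pydantic import BaseModel, Field


class OrderItemIn(BaseModel):
    sku: str = Field(min_length=1, max_length=64)
    quantity: int = Field(gt=0, le=100)  # reject zero/negative and absurd sizes


class CreateOrderRequest(BaseModel):
    customer_id: int = Field(gt=0)
    items: list[OrderItemIn]


class CreateOrderResponse(BaseModel):
    order_id: int
    total_items: int
```

Reviewing just this interface before any business logic exists keeps the feedback loop short: disagreements about field names or constraints cost nothing to fix at this step.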
Enhance the existing [COMPONENT] to support:
**New Capability:**
[Describe the enhancement]
**Constraints:**
- Maintain backward compatibility
- No breaking changes to existing API
- Preserve existing tests
**Approach:**
1. First, analyze the current implementation
2. Propose the enhancement approach
3. Implement with minimal changes
4. Add tests for new functionality
5. Update documentation
Show me the analysis first, then we'll proceed with implementation.
Implement [COMPLEX_FEATURE], which spans:
**Components Involved:**
1. [Component A] - [responsibility]
2. [Component B] - [responsibility]
3. [Component C] - [responsibility]
**Data Flow:**
[Describe how data flows between components]
**Error Scenarios:**
- [Scenario 1]: [Expected handling]
- [Scenario 2]: [Expected handling]
**Transaction Boundaries:**
[Describe where transactions should begin/end]
**Implementation Order:**
1. Start with [component] because [reason]
2. Then [component] because [reason]
3. Finally [component] because [reason]
Implement each component with its tests before moving to the next.
Ensure proper error propagation between components.
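Proper error propagation usually means each component re-raises at its own abstraction level while preserving the original cause. A minimal sketch (the payment/order components are hypothetical):

```python
class PaymentError(Exception):
    """Raised by the payment component."""


class OrderProcessingError(Exception):
    """Raised at the order-service boundary; wraps lower-level causes."""


def charge_card(amount_cents: int) -> None:
    # Stand-in for a payment-gateway call that fails.
    raise PaymentError("card declined")


def place_order(amount_cents: int) -> None:
    try:
        charge_card(amount_cents)
    except PaymentError as exc:
        # Re-raise in this component's vocabulary, keeping the cause chained
        # so logs and debuggers still see the original PaymentError.
        raise OrderProcessingError("order could not be completed") from exc
```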
Integrate [AI_SERVICE] for [USE_CASE]:
**Service Details:**
- Provider: [Anthropic/OpenAI/etc.]
- Model: [specific model]
- Purpose: [what it does]
**Integration Requirements:**
- Async operation support
- Retry logic with exponential backoff
- Cost optimization (caching, batching)
- Fallback behavior when unavailable
**Input/Output:**
- Input format: [describe]
- Output format: [describe]
- Validation requirements: [describe]
**Testing Strategy:**
- Mock service for unit tests
- Integration tests with real service
- Performance benchmarks
Create an abstraction layer that allows swapping providers.
Include rate limiting and cost tracking.
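The retry-with-backoff and provider-abstraction requirements can be sketched in a few lines (the `CompletionProvider` protocol and the parameter choices are illustrative assumptions, not any vendor's actual SDK):

```python
import random
import time
from typing import Callable, Protocol


class CompletionProvider(Protocol):
    """Abstraction layer: concrete providers implement this and can be swapped."""

    def complete(self, prompt: str) -> str: ...


def with_retries(
    call: Callable[[], str],
    max_attempts: int = 4,
    base_delay: float = 0.5,
    sleep: Callable[[float], None] = time.sleep,
) -> str:
    """Retry a flaky call with exponential backoff plus jitter."""
    for attempt in range(max_attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: propagate to the fallback layer
            # Delay doubles each attempt (0.5s, 1s, 2s, ...) with small jitter
            # so many clients don't retry in lockstep.
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            sleep(delay)
    raise RuntimeError("unreachable")
```

Only `ConnectionError` is retried here; a real integration would enumerate the provider's transient error types and leave permanent failures (bad credentials, invalid input) to fail fast.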
Debug this issue:
**Observed Behavior:**
[Describe what's happening]
**Expected Behavior:**
[Describe what should happen]
**Context:**
- When it occurs: [conditions]
- Error messages: [paste errors]
- Relevant code: [reference files/functions]
**Investigation Steps Completed:**
1. [What you've already tried]
2. [What you've ruled out]
Please:
1. Analyze the possible root causes
2. Suggest diagnostic steps
3. Propose and implement the fix
4. Add tests to prevent regression
Optimize [COMPONENT/QUERY/FUNCTION] for better performance:
**Current State:**
- Response time: [current metric]
- Resource usage: [memory/CPU]
- Bottleneck: [identified issue]
**Target:**
- Response time: [target metric]
- Acceptable trade-offs: [what can be sacrificed]
**Constraints:**
- Must maintain [specific behavior]
- Cannot change [specific aspect]
Analyze the current implementation, propose optimizations with expected impact, then implement the most effective solution. Include before/after benchmarks.
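Before/after benchmarks need not be elaborate: a small harness that times the old and new implementations under identical inputs is often enough. A sketch (the linear-scan vs. set-lookup example is hypothetical):

```python
import time
from typing import Callable


def benchmark(fn: Callable[[], object], repeats: int = 5) -> float:
    """Return the best-of-N wall-clock time for fn, in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best


data = list(range(50_000))
needles = list(range(0, 50_000, 500))


def before() -> int:
    # "Before": membership test scans the list, O(n) per lookup.
    return sum(1 for n in needles if n in data)


data_set = set(data)


def after() -> int:
    # "After": precomputed set makes each lookup O(1).
    return sum(1 for n in needles if n in data_set)
```

Taking the best of several runs dampens scheduler noise; asserting that `before()` and `after()` return the same value guards the "must maintain specific behavior" constraint while timing the change.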
Review the following code for:
**Code Location:**
[file paths or paste code]
**Review Focus:**
- [ ] Correctness and edge cases
- [ ] Security vulnerabilities
- [ ] Performance issues
- [ ] Code style and readability
- [ ] Test coverage
- [ ] Documentation completeness
**Context:**
[Why this code was written, what it does]
Provide:
1. Critical issues that must be fixed
2. Improvements that should be made
3. Suggestions for enhancement
4. Positive observations (what's done well)
Refactor [COMPONENT] to improve [ASPECT]:
**Current Issues:**
1. [Issue 1]
2. [Issue 2]
3. [Issue 3]
**Goals:**
- [Goal 1]
- [Goal 2]
**Constraints:**
- All existing tests must pass
- Public API cannot change
- [Other constraints]
**Approach:**
1. Create a refactoring plan
2. Implement changes incrementally
3. Verify tests pass after each step
4. Update documentation
Start with the plan for my review.
Create a comprehensive test suite for [COMPONENT]:
**Test Categories:**
1. **Unit Tests**
- Test individual functions/methods
- Mock external dependencies
- Cover edge cases and error paths
2. **Integration Tests**
- Test component interactions
- Use test database
- Verify data persistence
3. **API Tests** (if applicable)
- Test endpoint behavior
- Verify response schemas
- Test authentication/authorization
**Coverage Requirements:**
- Minimum 90% line coverage
- 100% coverage for critical paths
- All error scenarios tested
**Test Data:**
- Use factories for consistent test data
- Include edge cases in fixtures
- Parameterize where appropriate
Generate tests following pytest best practices.
Use descriptive test names that explain the scenario.
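A sketch of these conventions, with a factory for consistent test data, a mocked external dependency, and descriptive names (shown as plain functions so the pattern is visible without a runner; under pytest the `test_*` functions are collected automatically; the order/courier domain is hypothetical):

```python
from unittest.mock import Mock


# Factory keeps test data consistent; each test overrides only what it cares about.
def make_order(**overrides) -> dict:
    order = {"id": 1, "sku": "SILK-001", "quantity": 1, "status": "pending"}
    order.update(overrides)
    return order


def ship_order(order: dict, courier) -> dict:
    if order["status"] != "pending":
        raise ValueError("only pending orders can ship")
    courier.dispatch(order["id"])
    return {**order, "status": "shipped"}


# Test names state the scenario and the expected outcome.
def test_ship_order_dispatches_via_courier_and_marks_shipped():
    courier = Mock()  # mock the external dependency
    shipped = ship_order(make_order(), courier)
    courier.dispatch.assert_called_once_with(1)
    assert shipped["status"] == "shipped"


def test_ship_order_rejects_non_pending_orders():
    try:
        ship_order(make_order(status="shipped"), Mock())
        assert False, "expected ValueError"
    except ValueError:
        pass
```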
Implement [FEATURE] using TDD:
**Requirements:**
[List specific requirements]
**Approach:**
1. Write failing test for first requirement
2. Implement minimum code to pass
3. Refactor if needed
4. Repeat for each requirement
**Test Structure:**
- Arrange: Setup test data
- Act: Execute the function
- Assert: Verify results
Show me each test before implementation.
Follow the Red → Green → Refactor cycle.
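One turn of the cycle might look like this (the discount-capping rule is a hypothetical requirement; in real TDD the test is written and run, and fails, before the implementation exists):

```python
# Red: written first; fails until apply_discount is implemented below.
def test_discount_is_capped_at_fifty_percent():
    # Arrange: set up test data
    price_cents = 10_000
    # Act: execute the function under test
    discounted = apply_discount(price_cents, percent=80)
    # Assert: verify results
    assert discounted == 5_000


# Green: the minimum implementation that makes the test pass.
def apply_discount(price_cents: int, percent: int) -> int:
    capped = min(percent, 50)
    return price_cents * (100 - capped) // 100
```

Refactor would follow only once the test is green, with the test rerun after each change.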
Document the [ENDPOINT/API]:
**Include:**
1. **Endpoint description**: What it does
2. **Authentication**: Required auth method
3. **Request format**:
- Method and path
- Headers
- Query parameters
- Request body schema with examples
4. **Response format**:
- Success response with schema
- Error responses with codes
5. **Usage examples**: curl/code examples
6. **Rate limits**: If applicable
7. **Changelog**: Version history
Generate OpenAPI-compatible documentation.
Document the architecture of [SYSTEM/COMPONENT]:
**Include:**
1. **Overview**: High-level purpose
2. **Components**: List and describe each
3. **Data Flow**: How data moves through system
4. **Interactions**: Component dependencies
5. **Design Decisions**: Key choices and rationale
6. **Diagrams**: ASCII or Mermaid diagrams
7. **Configuration**: Required settings
8. **Deployment**: How to deploy
9. **Monitoring**: Key metrics and alerts
Format for technical readers (developers, architects).
Solve [PROBLEM] by thinking through it step by step:
**Problem:**
[Describe the problem]
**Instructions:**
1. First, restate the problem in your own words
2. Identify the key challenges
3. Consider multiple approaches
4. Evaluate trade-offs of each approach
5. Choose the best approach and explain why
6. Implement the solution
7. Verify it solves the problem
Show your reasoning at each step.
Compare [OPTION_A] vs [OPTION_B] for [USE_CASE]:
**Evaluation Criteria:**
1. [Criterion 1]: Weight [X]%
2. [Criterion 2]: Weight [X]%
3. [Criterion 3]: Weight [X]%
**For each option, analyze:**
- How well it meets each criterion
- Strengths and weaknesses
- Implementation complexity
- Long-term maintenance
**Deliverable:**
- Comparison matrix
- Recommendation with justification
- Implementation plan for recommended option
Approach this task with multiple perspectives:
**Architect Perspective:**
- Focus on system design and patterns
- Consider scalability and maintainability
**Security Perspective:**
- Identify potential vulnerabilities
- Ensure secure implementation
**Performance Perspective:**
- Identify bottlenecks
- Suggest optimizations
**User Perspective:**
- Consider API usability
- Focus on developer experience
Synthesize all perspectives into a cohesive implementation plan.
- Be specific - Vague prompts yield vague results
- Provide context - Include relevant background information
- Set constraints - Define boundaries and requirements
- Request explanation - Ask for rationale behind decisions
- Iterate - Build on previous responses
- Verify - Always review and test generated code
- Use examples - Show the format you expect
- Break down complexity - Tackle large tasks in steps
- Don't assume - Explicitly state your assumptions
- Don't skip testing - Always request and run tests
- Don't ignore errors - Address issues immediately
- Don't rush - Quality over speed
- Don't over-engineer - Start simple, add complexity as needed
- Don't forget docs - Documentation is part of the deliverable
1. "Design the API contract for [feature]"
→ Review and approve design
2. "Implement the service layer with tests"
→ Review implementation
3. "Add API endpoints connecting to the service"
→ Test endpoints
4. "Add error handling and edge cases"
→ Verify comprehensive handling
5. "Document the feature and add to changelog"
→ Review documentation
1. "Analyze this error and potential causes: [error]"
→ Understand the issue
2. "Write a test that reproduces this bug"
→ Confirm reproduction
3. "Implement the fix for this bug"
→ Review fix
4. "Verify the test passes and no regression"
→ Confirm fix works
5. "Update documentation if behavior changed"
→ Complete fix
Effective Claude Code usage requires:
- Clear communication of requirements
- Iterative refinement of solutions
- Consistent quality standards
- Regular verification and testing
Master these prompting strategies to build robust, maintainable, and high-quality software efficiently.