Coding Guidelines
Code Length and Structure Guidelines
- Reuse code blocks whenever possible. If similar functionality exists in previously generated files within this project, reference and extend that code rather than rewriting from scratch. Build incrementally on existing code patterns.
- Do not fall back anywhere. Raise errors and terminate the program rather than silently falling back to default values. Always require explicit configuration values instead of silently using defaults.
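For illustration, a minimal sketch of this rule in Python. The helper require_config_value and the dict-based configuration are hypothetical, not an existing project API:

```python
# Hypothetical helper illustrating the "no silent fallback" rule: missing
# configuration values terminate the program with an explicit error instead
# of being replaced by defaults.
def require_config_value(config: dict, key: str):
    if key not in config:
        raise KeyError(f"Missing required configuration value: {key!r}")
    return config[key]

config = {"learning_rate": 3e-4}
lr = require_config_value(config, "learning_rate")       # OK
batch_size = require_config_value(config, "batch_size")  # Raises KeyError: fail fast
```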
Code Review and Testing Guidelines
Core Principles
- Review-Only Mode: When conducting code reviews, analyze and provide feedback without modifying the existing code
- Isolated Testing: Write standalone, focused tests for individual components rather than executing the entire codebase
- Documentation: Document all findings systematically for future reference
Review Process
Analysis Phase
- Examine code structure, logic, and patterns
- Identify potential issues, bugs, or areas for improvement
- Assess code quality, readability, and maintainability
Mathematical Verification: If the code implements mathematical derivations or formulas, carefully verify that the implementation faithfully represents the mathematical concepts (a numerical-stability sketch follows the list below), including:
- Correct formula translation
- Proper handling of edge cases and numerical stability
- Appropriate precision and rounding considerations
- Accurate implementation of mathematical operations and their order
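As a concrete illustration of the numerical-stability point above, a small sketch (assuming NumPy is available) comparing a naive, literal translation of log-sum-exp with its numerically stable equivalent:

```python
import numpy as np

def naive_logsumexp(x):
    # Literal translation of log(sum(exp(x))); overflows for large inputs.
    return np.log(np.sum(np.exp(x)))

def stable_logsumexp(x):
    # Mathematically identical, but factoring out the maximum keeps exp() bounded.
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

x = np.array([1000.0, 1000.0])
print(naive_logsumexp(x))   # inf (overflow)
print(stable_logsumexp(x))  # ~1000.693
```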
Performance Considerations
- Identify computational bottlenecks
- Review memory usage patterns
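To support the bottleneck item above, one option is Python's built-in cProfile; the profiled function run_pipeline below is a placeholder, not project code:

```python
import cProfile
import pstats

def run_pipeline():
    # Stand-in for the code under review.
    return sum(i * i for i in range(1_000_000))

profiler = cProfile.Profile()
profiler.enable()
run_pipeline()
profiler.disable()

# Show the 10 most expensive calls by cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```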
Testing Phase
Test Strategy
- Create minimal, isolated test cases for specific functions or modules
- Focus on unit tests that validate individual pieces of functionality
- Avoid running the full application unless explicitly necessary
Test Coverage
- Ensure critical paths are tested
- Include edge cases and boundary conditions
- Test error handling scenarios
- Validate that expected exceptions are raised (see the test sketch after this list)
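A minimal sketch of such an isolated test using pytest; safe_divide is a hypothetical function defined inline only to illustrate edge-case and exception checks:

```python
# tests/test_safe_divide.py (illustrative only)
import pytest

def safe_divide(a: float, b: float) -> float:
    # Hypothetical function under test; in practice it would be imported.
    if b == 0:
        raise ValueError("division by zero is not allowed")
    return a / b

def test_safe_divide_basic():
    assert safe_divide(6, 3) == 2

def test_safe_divide_zero_divisor_raises():
    # Edge case: the expected exception must be raised, not silently ignored.
    with pytest.raises(ValueError):
        safe_divide(1, 0)
```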
Test Organization
- Place all tests in the tests/ directory
- Follow the naming convention test_<module_name>.py
- Group related tests in test classes
- Use descriptive test method names
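A small illustrative layout following these conventions; the Tokenizer class and its behavior are placeholders rather than project code:

```python
# tests/test_tokenizer.py (illustrative only)
class Tokenizer:
    # Placeholder for the module under test; in practice it would be imported.
    def tokenize(self, text: str) -> list[str]:
        return text.split()

class TestTokenizer:
    """Related tests grouped in one class, with descriptive method names."""

    def test_tokenize_splits_on_whitespace(self):
        assert Tokenizer().tokenize("a b  c") == ["a", "b", "c"]

    def test_tokenize_empty_string_returns_empty_list(self):
        assert Tokenizer().tokenize("") == []
```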
Documentation Phase
- Summarize findings in a markdown file within Development/CodeReview/
- Use descriptive filenames: <component>-review-YYYY-MM-DD.md
- Important: Never overwrite existing review documents; always create new files with unique names
Planning Guideline
Sometimes you will be asked to assist with planning. In such cases, first complete the comprehensive analysis outlined below. Do not revise any existing code until you are explicitly directed to do so.
1. Requirements Analysis
Understand the Goal
- Clearly define what needs to be achieved
- Identify success criteria and metrics
- List functional and non-functional requirements
2. Design Phase
- Break down into manageable tasks
- Define clear, step-by-step milestones; time estimates are not required.
- Identify potential risks and mitigation strategies
- Important: for full-scale training, I will use a cluster without an Internet connection. Please take this into account, e.g., when using tools like wandb (an offline-mode sketch follows this list).
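A minimal sketch of one way to keep wandb usable on an air-gapped cluster, relying on its documented offline mode; the project and run names are placeholders:

```python
# Log locally with wandb's offline mode (WANDB_MODE=offline as an environment
# variable works as well); runs can later be uploaded from a connected machine
# with `wandb sync`.
import wandb

run = wandb.init(project="example-project", name="offline-run", mode="offline")
for step in range(3):
    wandb.log({"loss": 1.0 / (step + 1)}, step=step)
run.finish()
```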
3. Documentation
File Organization
- Create plan documents in Development/Plan/
- Use descriptive filenames: <feature>-plan-YYYY-MM-DD.md
- Never overwrite existing plans; create new versions
Plan Document Structure
# Implementation Plan: [Feature Name]
Date: YYYY-MM-DD
Author: [Name/Role]

## Objective
[Clear statement of what will be built]

## Background & Research
[Findings from research phase]

## Technical Approach

### Architecture Overview
[High-level design]

### Step-by-Step Implementation
1. [Step with sample code]
2. [Step with sample code]

### Sample Code

## Dependencies
- [Required libraries/modules]

## Risk Assessment
- [Potential issues and mitigations]

## Success Criteria
- [How to verify implementation]
Debugging Guideline
Occasionally, you will receive a bug report and be asked to assist with debugging. In such cases, first complete the comprehensive analysis outlined below. Do not revise any existing code until you are explicitly directed to do so.
1. Problem Analysis
Issue Documentation
- Record exact error messages and stack traces
- Document reproduction steps
Root Cause Investigation
- Form hypotheses about potential causes
- Identify affected components
2. Debugging Strategy
Isolation Approach
- Create minimal reproducible examples
- Isolate the problem to specific components
Diagnostic Tools (if necessary)
- Add strategic logging statements (see the logging sketch after this list)
- Use debugger breakpoints effectively
- Employ profiling tools for performance issues
- Utilize memory analysis for leak detection
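A minimal sketch of the strategic-logging item above, using Python's standard logging module; the logger name and suspect_function are illustrative:

```python
import logging

logging.basicConfig(
    level=logging.DEBUG,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)
logger = logging.getLogger("debug.investigation")

def suspect_function(values):
    # Log inputs and intermediate results around the suspected failure point.
    logger.debug("input length=%d, first items=%r", len(values), values[:3])
    result = sum(values) / len(values)  # fails when values is empty
    logger.debug("result=%r", result)
    return result

suspect_function([1, 2, 3])
```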
3. Testing & Validation
Test Creation
- Write tests that reproduce the bug in tests/debug/
- Create tests that verify the fix
- Ensure no new issues are introduced
- Test edge cases around the fix
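An illustrative regression test for tests/debug/; the bug (a mean over an empty list) and the mean function are hypothetical:

```python
# tests/debug/test_empty_input_crash.py (illustrative only)
import pytest

def mean(values):
    # Hypothetical fix under test: fail with a clear error instead of crashing.
    if not values:
        raise ValueError("mean() requires at least one value")
    return sum(values) / len(values)

def test_reproduces_reported_crash_on_empty_input():
    # Before the fix this path raised ZeroDivisionError.
    with pytest.raises(ValueError):
        mean([])

def test_fix_keeps_normal_behavior():
    assert mean([2, 4, 6]) == 4
```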
4. Documentation
File Organization
- Create debug documents in Development/Debug/
- Use descriptive filenames: <issue>-debug-YYYY-MM-DD.md
- Link to related test files
Debug Document Structure
# Debug Report: [Issue Description]
Date: YYYY-MM-DD

## Problem Statement
[Clear description of the issue]

## Symptoms
- [Observable behaviors]
- [Error messages]

## Reproduction Steps
1. [Step-by-step instructions]

## Investigation Process

### Hypotheses
- [Potential causes considered]

### Tests Performed
- [Diagnostic steps taken]

### Findings
- [Root cause identified]

## Solution Approach

### Proposed Fix
[Description of the solution]

### Sample Code

## Validation
- [Tests created in tests/debug/]
- [Verification results]

## Prevention Recommendations
- [How to avoid similar issues]