# Deb-Mock Performance Monitoring and Optimization

## Overview
The deb-mock performance monitoring and optimization system provides comprehensive insights into build performance, automatic optimization suggestions, and detailed performance analytics. This system enables users to identify bottlenecks, optimize build configurations, and maintain optimal performance across different build environments.
## Features
- Real-time Performance Monitoring - Track CPU, memory, disk I/O, and network usage during builds
- Build Profiling - Detailed analysis of each build phase with performance metrics
- Automatic Optimization - Rule-based suggestions for improving build performance
- Benchmarking - Multi-iteration performance testing for accurate measurements
- Performance Reporting - Comprehensive reports and visualizations
- Resource Utilization Analysis - Identify resource bottlenecks and optimization opportunities
- Cache Performance Tracking - Monitor cache hit rates and effectiveness
- Automatic Tuning - Apply optimization recommendations automatically
## Architecture

### Core Components
- `PerformanceMonitor` - Real-time monitoring and metrics collection
- `PerformanceOptimizer` - Analysis and optimization recommendations
- `PerformanceReporter` - Report generation and data export
- Build Profiles - Detailed performance tracking for individual builds
- System Monitoring - Background system resource tracking
### Data Flow

Build Operation → Performance Monitor → Metrics Collection → Build Profile → Analysis → Optimization Suggestions
## Configuration

### Performance Monitoring Settings

```yaml
# Performance monitoring configuration
enable_performance_monitoring: true
performance_metrics_dir: "./performance-metrics"
performance_retention_days: 30
performance_auto_optimization: false
performance_benchmark_iterations: 3
performance_reporting: true
```
### Configuration Options

- `enable_performance_monitoring` - Enable/disable performance monitoring (default: `true`)
- `performance_metrics_dir` - Directory for storing performance data (default: `"./performance-metrics"`)
- `performance_retention_days` - Number of days to keep performance data (default: `30`)
- `performance_auto_optimization` - Automatically apply optimization suggestions (default: `false`)
- `performance_benchmark_iterations` - Number of iterations for benchmarking (default: `3`)
- `performance_reporting` - Enable performance reporting (default: `true`)
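The same settings can be passed programmatically. A minimal sketch, assuming `Config` accepts each YAML key above as a keyword argument (the Programmatic Usage examples below confirm this for `enable_performance_monitoring` and `performance_auto_optimization`):

```python
from deb_mock.config import Config

# Assumption: every YAML option maps 1:1 to a Config keyword argument;
# only the first and fourth are shown this way elsewhere in this document.
config = Config(
    enable_performance_monitoring=True,
    performance_metrics_dir="./performance-metrics",
    performance_retention_days=30,
    performance_auto_optimization=False,
    performance_benchmark_iterations=3,
    performance_reporting=True,
)
```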
## Usage

### CLI Commands

#### Performance Summary

```bash
# Show overall performance statistics
deb-mock performance-summary
```
Output Example:

```
=== Performance Summary ===
Total Operations: 15
Total Duration: 1250.45s
Average Duration: 83.36s
Active Operations: 0

=== Operation Statistics ===
package_build:
  Count: 5
  Avg Duration: 45.23s
  Min Duration: 32.10s
  Max Duration: 67.89s

chroot_creation:
  Count: 3
  Avg Duration: 12.45s
  Min Duration: 8.90s
  Max Duration: 18.20s
```
#### Benchmarking Operations

```bash
# Benchmark a specific operation
deb-mock benchmark "chroot_creation" --function "init_chroot" --iterations 5

# Benchmark build operations
deb-mock benchmark "package_build" --function "build" --iterations 3
```
Output Example:

```
=== Benchmark Results for chroot_creation ===
Iterations: 5
Average Duration: 12.45s
Min Duration: 8.90s
Max Duration: 18.20s
Variance: 12.3456
```
#### Performance Reports

```bash
# Generate comprehensive performance report
deb-mock performance-report

# Generate report with custom output file
deb-mock performance-report --output-file "my_performance_report.txt"
```
#### Build Profile Reports

```bash
# Generate detailed report for a specific build
deb-mock build-profile-report "build_1234567890"

# Generate report with custom output file
deb-mock build-profile-report "build_1234567890" --output-file "build_analysis.txt"
```
#### Performance Analysis

```bash
# Analyze performance and generate optimization suggestions
deb-mock performance-analysis
```
Output Example:

```
=== Analysis 1: test-package ===
Performance Score: 85/100

Optimization Suggestions:
  • Consider enabling parallel builds for faster execution
  • Review chroot caching strategy for better performance

Automatic Tuning Recommendations:
  • Low CPU utilization suggests room for more parallel builds
    Current: 2
    Suggested: 3

Manual Optimization Recommendations:
  • Consider using tmpfs for /tmp to improve I/O performance
  • Review and optimize chroot package selection
```
#### Optimization

```bash
# Show optimization recommendations
deb-mock optimize

# Automatically apply optimizations
deb-mock optimize --auto-apply
```
#### Metrics Management

```bash
# Export performance metrics
deb-mock export-metrics

# Export to specific file
deb-mock export-metrics --output-file "performance_data.json"

# Clean up old metrics
deb-mock cleanup-metrics
```
### Programmatic Usage

#### Basic Performance Monitoring

```python
from deb_mock.config import Config
from deb_mock.core import DebMock

# Create configuration with performance monitoring enabled
config = Config(
    enable_performance_monitoring=True,
    performance_auto_optimization=True
)

# Initialize deb-mock
deb_mock = DebMock(config)

# Build a package (performance monitoring happens automatically)
result = deb_mock.build("source-package")

# Get performance summary
summary = deb_mock.performance_monitor.get_performance_summary()
print(f"Total operations: {summary['total_operations']}")
```
#### Custom Performance Monitoring

```python
# Monitor custom operations; op_id identifies the recorded metrics
with deb_mock.performance_monitor.monitor_operation("custom_operation") as op_id:
    # Your custom operation here
    result = perform_custom_operation()

# Add metrics to the build profile ("metrics" stands for the
# PerformanceMetrics object recorded for op_id above)
if hasattr(deb_mock, 'current_build_profile'):
    deb_mock.performance_monitor.add_phase_metrics(
        deb_mock.current_build_profile, "custom_operation", metrics
    )
```
#### Benchmarking Custom Functions

```python
def my_custom_function():
    # Your function implementation
    pass

# Benchmark the function
result = deb_mock.performance_monitor.benchmark_operation(
    "my_custom_function", my_custom_function, iterations=5
)
print(f"Average duration: {result['average_duration']:.2f}s")
```
## Performance Metrics

### Collected Metrics

#### Operation Metrics
- Duration - Time taken for the operation
- CPU Usage - CPU utilization during operation
- Memory Usage - Memory consumption and changes
- Disk I/O - Read/write operations and data transfer
- Network I/O - Network data transfer
- Chroot Size - Size of chroot environment
- Cache Hit Rate - Effectiveness of caching
- Parallel Efficiency - Efficiency of parallel operations
- Resource Utilization - Overall resource usage
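These per-operation metrics correspond to the fields of a `PerformanceMetrics` record (referenced by `BuildProfile` below). A sketch of what such a record could look like; apart from `duration` and `cpu_percent`, which appear in the JSON export later in this document, the field names are illustrative:

```python
from dataclasses import dataclass

# Illustrative sketch: the real PerformanceMetrics class may name
# these attributes differently.
@dataclass
class PerformanceMetrics:
    duration: float              # seconds spent in the operation
    cpu_percent: float           # CPU utilization during the operation
    memory_mb: float             # memory consumption
    disk_read_mb: float          # data read from disk
    disk_write_mb: float         # data written to disk
    network_mb: float            # network data transferred
    chroot_size_mb: float        # size of the chroot environment
    cache_hit_rate: float        # cache effectiveness, 0.0-1.0
    parallel_efficiency: float   # efficiency of parallel operations
    resource_utilization: float  # overall resource usage
```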
#### System Metrics
- CPU Percentage - Overall system CPU usage
- Memory Percentage - System memory utilization
- Disk Usage - Available disk space
- Active Operations - Currently running operations
### Build Profile Structure

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class BuildProfile:
    build_id: str                          # Unique build identifier
    package_name: str                      # Name of the package being built
    architecture: str                      # Target architecture
    suite: str                             # Debian suite
    total_duration: float                  # Total build time
    phases: Dict[str, PerformanceMetrics]  # Performance data for each phase
    resource_peak: Dict[str, float]        # Peak resource usage
    cache_performance: Dict[str, float]    # Cache performance metrics
    optimization_suggestions: List[str]    # Generated optimization suggestions
    timestamp: datetime                    # When the build was performed
```
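As a usage sketch, the fields above can be read straight off a profile. How a finished profile is looked up is not specified here; `current_build_profile` is borrowed from the Custom Performance Monitoring example above as a stand-in:

```python
# Sketch: summarize a BuildProfile (the accessor is a stand-in, see above)
profile = deb_mock.current_build_profile
print(f"{profile.package_name} [{profile.architecture}/{profile.suite}] "
      f"took {profile.total_duration:.2f}s")
for phase, metrics in profile.phases.items():
    print(f"  {phase}: {metrics.duration:.2f}s")
for hint in profile.optimization_suggestions:
    print(f"  suggestion: {hint}")
```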
## Optimization System

### Automatic Optimization Rules

#### Parallel Build Optimization
- Low CPU Usage (< 60%) - Increase parallel builds
- High CPU Usage (> 90%) - Decrease parallel builds
- Optimal Range - 70-85% CPU utilization
#### Cache Optimization
- Low Hit Rate (< 30%) - Increase cache size
- Medium Hit Rate (30-70%) - Review cache strategy
- High Hit Rate (> 70%) - Optimal performance
#### Resource Optimization
- Memory Usage > 2GB - Enable ccache, review dependencies
- Disk I/O High - Use tmpfs for temporary files
- Network Usage High - Review mirror configuration
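These rules reduce to simple threshold checks. A minimal sketch using the thresholds above; the function names and signatures are illustrative, not the real `PerformanceOptimizer` interface:

```python
def suggest_parallel_jobs(current_jobs: int, cpu_percent: float) -> int:
    """Apply the documented CPU thresholds (illustrative only)."""
    if cpu_percent < 60:                # low utilization: room for more jobs
        return current_jobs + 1
    if cpu_percent > 90:                # saturated: back off
        return max(1, current_jobs - 1)
    return current_jobs                 # 70-85% is treated as optimal

def cache_recommendation(hit_rate_percent: float) -> str:
    """Map the cache hit-rate bands to a recommendation."""
    if hit_rate_percent < 30:
        return "increase cache size"
    if hit_rate_percent <= 70:
        return "review cache strategy"
    return "optimal performance"
```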
### Optimization Suggestions

#### Performance-Based
- Duration > 5 minutes - Enable parallel builds, optimize chroot caching
- Duration > 10 minutes - Review entire build process, consider system upgrades
#### Resource-Based
- CPU > 80% - Limit parallel jobs, optimize build processes
- Memory > 2GB - Enable ccache, review package dependencies
- Disk I/O High - Use tmpfs, optimize chroot structure
#### Cache-Based
- Hit Rate < 50% - Increase cache size, review retention policy
- Cache Misses High - Optimize cache invalidation strategy
## Reporting and Analysis

### Performance Reports

#### Comprehensive Report
- Performance Summary - Overall statistics and trends
- Operation Breakdown - Detailed analysis of each operation type
- System Statistics - Resource utilization patterns
- Optimization History - Applied optimizations and their effects
#### Build Profile Report
- Build Information - Package details and build parameters
- Phase Breakdown - Performance analysis of each build phase
- Resource Peaks - Maximum resource usage during build
- Cache Performance - Cache effectiveness metrics
- Optimization Suggestions - Specific recommendations for improvement
### Data Export

#### JSON Export

```json
{
  "export_timestamp": "2024-08-19T12:00:00",
  "summary": {
    "total_operations": 15,
    "total_duration": 1250.45,
    "average_duration": 83.36
  },
  "build_profiles": {
    "profile_1": {
      "build_id": "build_1234567890",
      "package_name": "test-package",
      "total_duration": 45.23
    }
  },
  "operation_history": [
    {
      "operation": "package_build",
      "duration": 45.23,
      "cpu_percent": 75.5
    }
  ]
}
```
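A short sketch of post-processing such an export, using only the structure shown above:

```python
import json

# Load an export produced by `deb-mock export-metrics`
with open("performance_data.json") as f:
    data = json.load(f)

print(f"{data['summary']['total_operations']} operations "
      f"exported at {data['export_timestamp']}")

# Flag operations that took more than twice the average duration
average = data["summary"]["average_duration"]
for entry in data["operation_history"]:
    if entry["duration"] > 2 * average:
        print(f"slow: {entry['operation']} ({entry['duration']:.2f}s)")
```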
#### Text Reports
- Human-readable format - Easy to understand performance summaries
- Detailed breakdowns - Phase-by-phase performance analysis
- Optimization suggestions - Actionable recommendations
## Best Practices

### Performance Monitoring
- Enable by Default - Always enable performance monitoring in production
- Regular Analysis - Analyze performance data weekly
- Trend Tracking - Monitor performance trends over time
- Resource Planning - Use performance data for capacity planning
### Optimization
- Start Conservative - Begin with conservative optimization settings
- Monitor Effects - Track the impact of optimizations
- Gradual Changes - Apply optimizations incrementally
- Test Thoroughly - Validate optimizations in test environments
### Data Management
- Regular Cleanup - Clean up old metrics monthly
- Data Retention - Keep performance data for at least 30 days
- Export Important Data - Export critical performance data before cleanup
- Backup Metrics - Include performance metrics in system backups
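The export and cleanup steps can be scheduled with the CLI commands shown earlier, for example via cron (the schedule and backup path are illustrative):

```bash
# Export metrics before the monthly cleanup so nothing is lost
0 2 1 * *  deb-mock export-metrics --output-file "/var/backups/deb-mock/metrics-$(date +\%Y\%m).json"
30 2 1 * * deb-mock cleanup-metrics
```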
## Troubleshooting

### Common Issues

#### Performance Monitoring Not Working
Symptoms: No performance data available, commands return empty results

Solutions:
- Check that `enable_performance_monitoring` is set to `true`
- Verify that the `psutil` package is installed
- Check permissions on the metrics directory
- Restart deb-mock to reinitialize monitoring
#### High Memory Usage

Symptoms: Memory usage > 2GB, builds failing due to memory exhaustion

Solutions:
- Enable ccache to reduce compilation memory usage
- Reduce parallel build count
- Review and optimize chroot package selection
- Increase system swap space
#### Slow Build Performance

Symptoms: Build duration > 5 minutes, high resource utilization

Solutions:
- Enable parallel builds (if CPU usage allows)
- Optimize chroot caching strategy
- Use tmpfs for temporary files
- Review build dependencies for unnecessary packages
#### Cache Performance Issues

Symptoms: Low cache hit rate, frequent cache misses

Solutions:
- Increase cache size
- Review cache retention policy
- Optimize cache invalidation strategy
- Check disk space availability
### Debug Mode

Enable debug mode for detailed performance information:

```bash
export DEB_MOCK_DEBUG=1
deb-mock performance-summary
```
### Getting Help
- Performance Reports - Generate detailed reports for analysis
- Log Files - Check deb-mock logs for performance-related errors
- System Monitoring - Use system tools to verify resource usage
- Community Support - Check project issues and discussions
## Future Enhancements

### Planned Features
- Real-time Dashboard - Web-based performance monitoring interface
- Machine Learning Optimization - AI-driven optimization suggestions
- Performance Alerts - Automated alerts for performance issues
- Integration with Monitoring Systems - Prometheus, Grafana, etc.
- Performance Regression Detection - Automatic detection of performance degradation
### Extension Points
- Custom Metrics - User-defined performance metrics
- Performance Plugins - Extensible performance monitoring
- External Integrations - Third-party monitoring system support
- Performance Testing Framework - Automated performance testing
## Integration Examples

### CI/CD Integration

```yaml
# GitHub Actions example
- name: Performance Analysis
  run: |
    deb-mock performance-analysis
    deb-mock performance-report --output-file "performance_report.txt"

- name: Upload Performance Report
  uses: actions/upload-artifact@v4
  with:
    name: performance-report
    path: performance_report.txt
```
### Monitoring System Integration
```python
# Prometheus metrics export
from prometheus_client import Gauge, Histogram

# Create metrics
build_duration = Histogram('deb_mock_build_duration_seconds', 'Build duration in seconds')
cpu_usage = Gauge('deb_mock_cpu_usage_percent', 'CPU usage percentage')

# Export metrics recorded by the performance monitor
def export_prometheus_metrics(performance_monitor):
    for metrics in performance_monitor._operation_history:
        build_duration.observe(metrics.duration)
        cpu_usage.set(metrics.cpu_percent)
```
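To expose the registry for scraping, `prometheus_client`'s built-in HTTP server can be used. A minimal sketch; note that re-running `export_prometheus_metrics` over the same history would re-observe old samples, so real code should track what has already been exported:

```python
from prometheus_client import start_http_server

start_http_server(8000)  # metrics served at http://localhost:8000/metrics

# Export once per batch of builds (see caveat above)
export_prometheus_metrics(deb_mock.performance_monitor)
```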
### Performance Testing
```python
# Automated performance testing
from deb_mock.config import Config
from deb_mock.core import DebMock

def test_build_performance():
    config = Config(enable_performance_monitoring=True)
    deb_mock = DebMock(config)

    # Run benchmark
    result = deb_mock.performance_monitor.benchmark_operation(
        "test_build", deb_mock.build, iterations=5
    )

    # Assert performance requirements
    assert result['average_duration'] < 60.0, "Build too slow"
    assert result['variance'] < 10.0, "Build performance inconsistent"
    print("✅ Performance requirements met")
```
This comprehensive performance monitoring and optimization system provides the tools needed to maintain optimal build performance, identify bottlenecks, and continuously improve the deb-mock build system.