Add comprehensive testing framework, performance monitoring, and plugin system

- Add complete pytest testing framework with conftest.py and test files
- Add performance monitoring and benchmarking capabilities
- Add plugin system with ccache plugin example
- Add comprehensive documentation (API, deployment, testing, etc.)
- Add Docker API wrapper for service deployment
- Add advanced configuration examples
- Remove old wget package file
- Update core modules with enhanced functionality
robojerk 2025-08-19 20:49:32 -07:00
parent 4c0dcb2522
commit c51819c836
30 changed files with 11141 additions and 105 deletions

docs/API.md (new file, 1117 lines; diff suppressed because it is too large)

docs/DEPLOYMENT.md (new file, 764 lines)

@@ -0,0 +1,764 @@
# Deb-Mock Deployment Guide
## Overview
This guide covers the deployment of `deb-mock` in various environments, from development to production. `deb-mock` is a sophisticated build environment management tool that provides isolated, reproducible package builds with advanced features like performance monitoring, plugin systems, and comprehensive testing.
## Table of Contents
1. [System Requirements](#system-requirements)
2. [Installation Methods](#installation-methods)
3. [Configuration](#configuration)
4. [Environment Setup](#environment-setup)
5. [Service Deployment](#service-deployment)
6. [Production Deployment](#production-deployment)
7. [Monitoring and Maintenance](#monitoring-and-maintenance)
8. [Troubleshooting](#troubleshooting)
9. [Security Considerations](#security-considerations)
10. [Backup and Recovery](#backup-and-recovery)
## System Requirements
### Minimum Requirements
- **Operating System**: Debian 13+ (Trixie) or Ubuntu 22.04+
- **CPU**: 2 cores (4 recommended)
- **Memory**: 4GB RAM (8GB recommended)
- **Storage**: 20GB available space (50GB recommended)
- **Python**: 3.8+ (3.10+ recommended)
### Recommended Requirements
- **Operating System**: Debian 13+ (Trixie) or Ubuntu 22.04+
- **CPU**: 8+ cores
- **Memory**: 16GB+ RAM
- **Storage**: 100GB+ available space (SSD recommended)
- **Python**: 3.10+
### Required System Packages
```bash
# Debian/Ubuntu
sudo apt update
sudo apt install -y \
    python3 \
    python3-pip \
    python3-venv \
    python3-dev \
    build-essential \
    debootstrap \
    schroot \
    sbuild \
    ccache \
    rsync \
    curl \
    wget \
    git \
    sudo \
    procps \
    sysstat \
    iotop \
    htop

# For advanced features
sudo apt install -y \
    python3-psutil \
    python3-yaml \
    python3-click \
    python3-rich \
    python3-pytest \
    python3-pytest-cov \
    python3-pytest-mock \
    python3-pytest-xdist \
    python3-pytest-timeout \
    python3-pytest-html \
    python3-pytest-json-report \
    python3-coverage
```
## Installation Methods
### Method 1: Python Package Installation (Recommended)
```bash
# Create virtual environment
python3 -m venv deb-mock-env
source deb-mock-env/bin/activate
# Install from source
git clone https://github.com/your-org/deb-mock.git
cd deb-mock
pip install -e .
# Or install from PyPI (when available)
pip install deb-mock
```
### Method 2: System-wide Installation
```bash
# Install system-wide (requires root)
sudo pip3 install deb-mock
# Or install from source
sudo pip3 install -e .
```
### Method 3: Docker Installation
```dockerfile
FROM debian:13-slim
# Install system dependencies
RUN apt-get update && apt-get install -y \
    python3 \
    python3-pip \
    debootstrap \
    schroot \
    sbuild \
    ccache \
    && rm -rf /var/lib/apt/lists/*
# Install deb-mock
COPY . /app/deb-mock
WORKDIR /app/deb-mock
RUN pip3 install -e .
# Set up entry point
ENTRYPOINT ["deb-mock"]
```
## Configuration
### Configuration File Structure
`deb-mock` uses YAML configuration files. The main configuration file is typically located at:
- **User config**: `~/.config/deb-mock/config.yaml`
- **System config**: `/etc/deb-mock/config.yaml`
- **Project config**: `./deb-mock.yaml`
### Basic Configuration Example
```yaml
# deb-mock.yaml
chroot:
base_dir: /var/lib/deb-mock/chroots
suite: trixie
architecture: amd64
mirror: http://deb.debian.org/debian/
components: [main, contrib, non-free]
cache:
enabled: true
base_dir: /var/cache/deb-mock
ccache_size_mb: 2048
root_cache_size_mb: 5120
package_cache_size_mb: 1024
sbuild:
enabled: true
user: sbuild
group: sbuild
chroot_suffix: -sbuild
build_user: buildd
performance:
enable_performance_monitoring: true
performance_metrics_dir: /var/log/deb-mock/performance
performance_retention_days: 30
performance_auto_optimization: true
performance_benchmark_iterations: 10
performance_reporting: true
plugins:
enabled: true
plugin_dir: /usr/local/lib/deb-mock/plugins
auto_load: true
parallel:
enabled: true
max_parallel_builds: 4
max_parallel_chroots: 8
mounts:
proc: true
sys: true
dev: true
tmpfs: true
bind_mounts:
- source: /var/cache/apt/archives
target: /var/cache/apt/archives
options: [ro]
overlay_mounts:
- source: /var/cache/deb-mock/overlay
target: /var/cache/deb-mock/overlay
uid_management:
enabled: true
create_users: true
copy_host_users: true
privilege_escalation: true
```
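The lookup order above can be sketched as a small loader. This is an illustrative sketch only: the function name `load_config` and the empty-dict fallback are assumptions, not deb-mock's actual implementation; the paths and the `DEB_MOCK_CONFIG` override follow the list and environment variables documented here.

```python
import os
import yaml  # PyYAML

# Search order: project config first, then user config, then system config.
CONFIG_PATHS = [
    "./deb-mock.yaml",                                     # project config
    os.path.expanduser("~/.config/deb-mock/config.yaml"),  # user config
    "/etc/deb-mock/config.yaml",                           # system config
]

def load_config():
    """Return the first config found; DEB_MOCK_CONFIG wins over all paths."""
    candidates = [os.environ.get("DEB_MOCK_CONFIG")] + CONFIG_PATHS
    for path in candidates:
        if path and os.path.isfile(path):
            with open(path) as fh:
                return yaml.safe_load(fh) or {}
    return {}  # no config file found: fall back to built-in defaults
```

Because the environment variable is checked first, a CI job can point a build at a throwaway config without touching any of the standard locations.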
### Environment Variables
```bash
# Core configuration
export DEB_MOCK_CONFIG=/path/to/config.yaml
export DEB_MOCK_CHROOT_DIR=/var/lib/deb-mock/chroots
export DEB_MOCK_CACHE_DIR=/var/cache/deb-mock
# Performance monitoring
export DEB_MOCK_PERFORMANCE_DIR=/var/log/deb-mock/performance
export DEB_MOCK_ENABLE_PERFORMANCE_MONITORING=true
# Plugin system
export DEB_MOCK_PLUGIN_DIR=/usr/local/lib/deb-mock/plugins
export DEB_MOCK_AUTO_LOAD_PLUGINS=true
# Logging
export DEB_MOCK_LOG_LEVEL=INFO
export DEB_MOCK_LOG_FILE=/var/log/deb-mock/deb-mock.log
```
## Environment Setup
### User Setup
```bash
# Create deb-mock user
sudo useradd -m -s /bin/bash deb-mock
sudo usermod -aG sbuild deb-mock
# Set up the user environment (note: under `sudo -u`, a bare `~` expands to
# the invoking user's home, so name the target user's home explicitly)
sudo -u deb-mock mkdir -p ~deb-mock/.config/deb-mock
sudo -u deb-mock mkdir -p ~deb-mock/.cache/deb-mock
sudo -u deb-mock mkdir -p ~deb-mock/deb-mock-workspace

# Configure sbuild for the user (sbuild-adduser runs as root and takes the
# user to grant access to)
sudo sbuild-update --keygen
sudo sbuild-adduser deb-mock
```
### Directory Structure Setup
```bash
# Create necessary directories
sudo mkdir -p /var/lib/deb-mock/chroots
sudo mkdir -p /var/cache/deb-mock/{ccache,root,packages,overlay}
sudo mkdir -p /var/log/deb-mock/{performance,logs}
sudo mkdir -p /usr/local/lib/deb-mock/plugins
# Set proper permissions
sudo chown -R deb-mock:deb-mock /var/lib/deb-mock
sudo chown -R deb-mock:deb-mock /var/cache/deb-mock
sudo chown -R deb-mock:deb-mock /var/log/deb-mock
sudo chown -R deb-mock:deb-mock /usr/local/lib/deb-mock
# Set proper permissions for sbuild
sudo chown -R deb-mock:sbuild /var/lib/deb-mock/chroots
sudo chmod 775 /var/lib/deb-mock/chroots
```
### Sbuild Configuration
```bash
# Configure sbuild for the deb-mock user. Write the file with tee: in
# `sudo -u ... cat > file`, the redirection runs as the invoking user.
sudo -u deb-mock mkdir -p ~deb-mock/.config/sbuild
sudo -u deb-mock tee ~deb-mock/.config/sbuild/config.pl > /dev/null << 'EOF'
$build_arch = 'amd64';
$build_arch_all = 1;
$build_arch_indep = 1;
$build_indep = 1;
$build_source = 1;
$build_binary = 1;
$build_profiles = ['default'];
$build_environment = ['debian'];
$build_suite = 'trixie';
$build_components = ['main', 'contrib', 'non-free'];
$build_mirror = 'http://deb.debian.org/debian/';
1;
EOF
```
## Service Deployment
### Systemd Service (Recommended)
Create a systemd service file for production deployments:
```ini
# /etc/systemd/system/deb-mock.service
[Unit]
Description=Deb-Mock Build Service
After=network.target
Wants=network.target
[Service]
Type=simple
User=deb-mock
Group=deb-mock
Environment=DEB_MOCK_CONFIG=/etc/deb-mock/config.yaml
Environment=DEB_MOCK_LOG_LEVEL=INFO
Environment=DEB_MOCK_LOG_FILE=/var/log/deb-mock/deb-mock.log
WorkingDirectory=/var/lib/deb-mock
ExecStart=/usr/local/bin/deb-mock service start
ExecReload=/bin/kill -HUP $MAINPID
Restart=always
RestartSec=10
StandardOutput=journal
StandardError=journal
# Security settings
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=strict
ProtectHome=true
ReadWritePaths=/var/lib/deb-mock /var/cache/deb-mock /var/log/deb-mock
[Install]
WantedBy=multi-user.target
```
### Service Management
```bash
# Enable and start the service
sudo systemctl daemon-reload
sudo systemctl enable deb-mock.service
sudo systemctl start deb-mock.service
# Check service status
sudo systemctl status deb-mock.service
# View logs
sudo journalctl -u deb-mock.service -f
# Restart service
sudo systemctl restart deb-mock.service
```
### Docker Compose Deployment
```yaml
# docker-compose.yml
version: '3.8'

services:
  deb-mock:
    build: .
    container_name: deb-mock
    restart: unless-stopped
    environment:
      - DEB_MOCK_CONFIG=/etc/deb-mock/config.yaml
      - DEB_MOCK_LOG_LEVEL=INFO
    volumes:
      - ./config:/etc/deb-mock:ro
      - deb-mock-chroots:/var/lib/deb-mock/chroots
      - deb-mock-cache:/var/cache/deb-mock
      - deb-mock-logs:/var/log/deb-mock
    ports:
      - "8080:8080"
    networks:
      - deb-mock-network

volumes:
  deb-mock-chroots:
  deb-mock-cache:
  deb-mock-logs:

networks:
  deb-mock-network:
    driver: bridge
```
## Production Deployment
### High Availability Setup
```bash
# Load balancer configuration (nginx)
sudo apt install nginx
# Create nginx configuration
sudo tee /etc/nginx/sites-available/deb-mock << 'EOF'
upstream deb-mock_backend {
    server 127.0.0.1:8080;
    server 127.0.0.1:8081;
    server 127.0.0.1:8082;
}

server {
    listen 80;
    server_name deb-mock.yourdomain.com;

    location / {
        proxy_pass http://deb-mock_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
EOF
# Enable site
sudo ln -s /etc/nginx/sites-available/deb-mock /etc/nginx/sites-enabled/
sudo nginx -t
sudo systemctl reload nginx
```
### Monitoring Setup
```bash
# Install monitoring tools
sudo apt install -y prometheus node-exporter grafana
# Configure Prometheus
sudo tee /etc/prometheus/prometheus.yml << 'EOF'
global:
  scrape_interval: 15s

scrape_configs:
  - job_name: 'deb-mock'
    static_configs:
      - targets: ['localhost:8080']
    metrics_path: /metrics
    scrape_interval: 5s

  - job_name: 'node-exporter'
    static_configs:
      - targets: ['localhost:9100']
EOF
# Start monitoring services
sudo systemctl enable prometheus node-exporter grafana-server
sudo systemctl start prometheus node-exporter grafana-server
```
### Backup Strategy
```bash
# Create backup script
sudo tee /usr/local/bin/deb-mock-backup << 'EOF'
#!/bin/bash
BACKUP_DIR="/var/backups/deb-mock"
DATE=$(date +%Y%m%d_%H%M%S)
# Create backup directory
mkdir -p "$BACKUP_DIR"
# Backup configuration
tar -czf "$BACKUP_DIR/config_$DATE.tar.gz" -C /etc deb-mock
# Backup chroots (excluding temporary files)
tar -czf "$BACKUP_DIR/chroots_$DATE.tar.gz" \
--exclude='*/tmp/*' \
--exclude='*/var/tmp/*' \
-C /var/lib deb-mock/chroots
# Backup cache
tar -czf "$BACKUP_DIR/cache_$DATE.tar.gz" -C /var/cache deb-mock
# Backup logs
tar -czf "$BACKUP_DIR/logs_$DATE.tar.gz" -C /var/log deb-mock
# Clean up old backups (keep last 7 days)
find "$BACKUP_DIR" -name "*.tar.gz" -mtime +7 -delete
echo "Backup completed: $BACKUP_DIR"
EOF
# Make executable and set up cron
sudo chmod +x /usr/local/bin/deb-mock-backup
( sudo crontab -l 2>/dev/null; echo "0 2 * * * /usr/local/bin/deb-mock-backup" ) | sudo crontab -
```
## Monitoring and Maintenance
### Health Checks
```bash
# Create health check script
sudo tee /usr/local/bin/deb-mock-health << 'EOF'
#!/bin/bash
# Check service status
if ! systemctl is-active --quiet deb-mock.service; then
    echo "ERROR: deb-mock service is not running"
    exit 1
fi

# Check disk space
DISK_USAGE=$(df /var/lib/deb-mock | tail -1 | awk '{print $5}' | sed 's/%//')
if [ "$DISK_USAGE" -gt 90 ]; then
    echo "WARNING: Disk usage is ${DISK_USAGE}%"
fi

# Check memory usage
MEM_USAGE=$(free | grep Mem | awk '{printf("%.0f", $3/$2 * 100.0)}')
if [ "$MEM_USAGE" -gt 90 ]; then
    echo "WARNING: Memory usage is ${MEM_USAGE}%"
fi

# Check chroot health
if ! deb-mock status >/dev/null 2>&1; then
    echo "ERROR: deb-mock status check failed"
    exit 1
fi

echo "OK: All health checks passed"
EOF
sudo chmod +x /usr/local/bin/deb-mock-health
```
### Log Rotation
```bash
# Configure log rotation
sudo tee /etc/logrotate.d/deb-mock << 'EOF'
/var/log/deb-mock/*.log {
    daily
    missingok
    rotate 52
    compress
    delaycompress
    notifempty
    create 644 deb-mock deb-mock
    postrotate
        systemctl reload deb-mock.service >/dev/null 2>&1 || true
    endscript
}
EOF
```
### Performance Monitoring
```bash
# Set up performance monitoring
sudo -u deb-mock mkdir -p /var/log/deb-mock/performance
# Create performance monitoring script
sudo tee /usr/local/bin/deb-mock-performance << 'EOF'
#!/bin/bash
# Generate performance report
deb-mock performance-report --output /var/log/deb-mock/performance/report_$(date +%Y%m%d_%H%M%S).html
# Clean up old reports (keep last 30 days)
find /var/log/deb-mock/performance -name "report_*.html" -mtime +30 -delete
# Generate benchmark report if needed
if [ "$1" = "benchmark" ]; then
    deb-mock benchmark --template standard --iterations 20
fi
EOF
sudo chmod +x /usr/local/bin/deb-mock-performance
```
## Troubleshooting
### Common Issues
#### Service Won't Start
```bash
# Check service status
sudo systemctl status deb-mock.service
# Check logs
sudo journalctl -u deb-mock.service -n 50
# Check configuration
deb-mock --config /etc/deb-mock/config.yaml validate
# Check permissions
sudo ls -la /var/lib/deb-mock/
sudo ls -la /var/cache/deb-mock/
```
#### Chroot Issues
```bash
# List chroots
deb-mock list-chroots
# Check chroot status
deb-mock status
# Clean up broken chroots
deb-mock cleanup --force
# Rebuild chroot
deb-mock create-chroot --suite trixie --architecture amd64
```
#### Performance Issues
```bash
# Check performance metrics
deb-mock performance-summary
# Run performance analysis
deb-mock performance-analysis
# Generate performance report
deb-mock performance-report
# Run benchmarks
deb-mock benchmark --template comprehensive
```
### Debug Mode
```bash
# Enable debug logging
export DEB_MOCK_LOG_LEVEL=DEBUG
export DEB_MOCK_DEBUG=true
# Run with verbose output
deb-mock --verbose --debug build package-name
# Check system resources
deb-mock --debug status
```
## Security Considerations
### User Isolation
```bash
# Create dedicated user for deb-mock
sudo useradd -r -s /bin/false -d /var/lib/deb-mock deb-mock
# Set up proper file permissions
sudo chown -R deb-mock:deb-mock /var/lib/deb-mock
sudo chmod 750 /var/lib/deb-mock/chroots
sudo chmod 640 /var/log/deb-mock/*.log
```
### Network Security
```bash
# Configure firewall
sudo ufw allow from 192.168.1.0/24 to any port 8080
sudo ufw enable
# Use reverse proxy with SSL
sudo apt install certbot python3-certbot-nginx
sudo certbot --nginx -d deb-mock.yourdomain.com
```
### Access Control
```bash
# Set up API key authentication
export DEB_MOCK_API_KEY=your-secure-api-key
# Configure RBAC
sudo tee /etc/deb-mock/rbac.yaml << 'EOF'
roles:
  admin:
    permissions: ["*"]
  builder:
    permissions: ["build", "status", "logs"]
  viewer:
    permissions: ["status", "logs"]

users:
  admin@example.com:
    role: admin
  builder@example.com:
    role: builder
  viewer@example.com:
    role: viewer
EOF
```
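A sketch of how a service could evaluate the RBAC file above when authorizing a request. The `is_allowed` helper and the wildcard matching via `fnmatch` are illustrative assumptions, not deb-mock's actual enforcement code; the role and permission names mirror the example config.

```python
import fnmatch

def is_allowed(rbac, user, permission):
    """True if the user's role grants the permission ('*' acts as a wildcard)."""
    role = rbac.get("users", {}).get(user, {}).get("role")
    perms = rbac.get("roles", {}).get(role, {}).get("permissions", [])
    return any(fnmatch.fnmatch(permission, pattern) for pattern in perms)

# The example RBAC config from above, as parsed YAML:
rbac = {
    "roles": {
        "admin": {"permissions": ["*"]},
        "builder": {"permissions": ["build", "status", "logs"]},
        "viewer": {"permissions": ["status", "logs"]},
    },
    "users": {
        "admin@example.com": {"role": "admin"},
        "builder@example.com": {"role": "builder"},
        "viewer@example.com": {"role": "viewer"},
    },
}
```

Unknown users resolve to no role and therefore no permissions, which keeps the default deny-all behavior.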
## Backup and Recovery
### Automated Backups
```bash
# Create backup script
sudo tee /usr/local/bin/deb-mock-backup-full << 'EOF'
#!/bin/bash
BACKUP_DIR="/var/backups/deb-mock/full"
DATE=$(date +%Y%m%d_%H%M%S)
# Create backup directory
mkdir -p "$BACKUP_DIR"
# Stop service
systemctl stop deb-mock.service
# Create full backup
tar -czf "$BACKUP_DIR/full_backup_$DATE.tar.gz" \
-C /var lib/deb-mock \
-C /var cache/deb-mock \
-C /var log/deb-mock \
-C /etc deb-mock
# Start service
systemctl start deb-mock.service
# Clean up old backups (keep last 30 days)
find "$BACKUP_DIR" -name "full_backup_*.tar.gz" -mtime +30 -delete
echo "Full backup completed: $BACKUP_DIR/full_backup_$DATE.tar.gz"
EOF
sudo chmod +x /usr/local/bin/deb-mock-backup-full
```
### Recovery Procedures
```bash
# Restore from backup
sudo systemctl stop deb-mock.service
# Extract backup
sudo tar -xzf /var/backups/deb-mock/full/full_backup_YYYYMMDD_HHMMSS.tar.gz -C /
# Fix permissions
sudo chown -R deb-mock:deb-mock /var/lib/deb-mock
sudo chown -R deb-mock:deb-mock /var/cache/deb-mock
sudo chown -R deb-mock:deb-mock /var/log/deb-mock
# Start service
sudo systemctl start deb-mock.service
# Verify recovery
deb-mock status
```
## Conclusion
This deployment guide provides comprehensive instructions for deploying `deb-mock` in various environments. For production deployments, ensure you have proper monitoring, backup, and security measures in place.
For additional support and troubleshooting, refer to the main documentation or contact the development team.
## Additional Resources
- [Main Documentation](../README.md)
- [Configuration Guide](CONFIGURATION.md)
- [Performance Monitoring](PERFORMANCE_MONITORING.md)
- [Plugin System](PLUGIN_SYSTEM.md)
- [Testing Guide](TESTING.md)
- [API Reference](API.md)

@@ -0,0 +1,525 @@
# Deb-Mock Performance Monitoring and Optimization
## Overview
The `deb-mock` performance monitoring and optimization system provides comprehensive insights into build performance, automatic optimization suggestions, and detailed performance analytics. This system enables users to identify bottlenecks, optimize build configurations, and maintain optimal performance across different build environments.
## Features
- **Real-time Performance Monitoring** - Track CPU, memory, disk I/O, and network usage during builds
- **Build Profiling** - Detailed analysis of each build phase with performance metrics
- **Automatic Optimization** - Rule-based suggestions for improving build performance
- **Benchmarking** - Multi-iteration performance testing for accurate measurements
- **Performance Reporting** - Comprehensive reports and visualizations
- **Resource Utilization Analysis** - Identify resource bottlenecks and optimization opportunities
- **Cache Performance Tracking** - Monitor cache hit rates and effectiveness
- **Automatic Tuning** - Apply optimization recommendations automatically
## Architecture
### Core Components
1. **PerformanceMonitor** - Real-time monitoring and metrics collection
2. **PerformanceOptimizer** - Analysis and optimization recommendations
3. **PerformanceReporter** - Report generation and data export
4. **Build Profiles** - Detailed performance tracking for individual builds
5. **System Monitoring** - Background system resource tracking
### Data Flow
```
Build Operation → Performance Monitor → Metrics Collection → Build Profile → Analysis → Optimization Suggestions
```
## Configuration
### Performance Monitoring Settings
```yaml
# Performance monitoring configuration
enable_performance_monitoring: true
performance_metrics_dir: "./performance-metrics"
performance_retention_days: 30
performance_auto_optimization: false
performance_benchmark_iterations: 3
performance_reporting: true
```
### Configuration Options
- **`enable_performance_monitoring`** - Enable/disable performance monitoring (default: true)
- **`performance_metrics_dir`** - Directory for storing performance data (default: "./performance-metrics")
- **`performance_retention_days`** - How long to keep performance data (default: 30)
- **`performance_auto_optimization`** - Automatically apply optimization suggestions (default: false)
- **`performance_benchmark_iterations`** - Number of iterations for benchmarking (default: 3)
- **`performance_reporting`** - Enable performance reporting (default: true)
## Usage
### CLI Commands
#### Performance Summary
```bash
# Show overall performance statistics
deb-mock performance-summary
```
**Output Example:**
```
=== Performance Summary ===
Total Operations: 15
Total Duration: 1250.45s
Average Duration: 83.36s
Active Operations: 0

=== Operation Statistics ===
package_build:
  Count: 5
  Avg Duration: 45.23s
  Min Duration: 32.10s
  Max Duration: 67.89s

chroot_creation:
  Count: 3
  Avg Duration: 12.45s
  Min Duration: 8.90s
  Max Duration: 18.20s
```
#### Benchmarking Operations
```bash
# Benchmark a specific operation
deb-mock benchmark "chroot_creation" --function "init_chroot" --iterations 5
# Benchmark build operations
deb-mock benchmark "package_build" --function "build" --iterations 3
```
**Output Example:**
```
=== Benchmark Results for chroot_creation ===
Iterations: 5
Average Duration: 12.45s
Min Duration: 8.90s
Max Duration: 18.20s
Variance: 12.3456
```
#### Performance Reports
```bash
# Generate comprehensive performance report
deb-mock performance-report
# Generate report with custom output file
deb-mock performance-report --output-file "my_performance_report.txt"
```
#### Build Profile Reports
```bash
# Generate detailed report for a specific build
deb-mock build-profile-report "build_1234567890"
# Generate report with custom output file
deb-mock build-profile-report "build_1234567890" --output-file "build_analysis.txt"
```
#### Performance Analysis
```bash
# Analyze performance and generate optimization suggestions
deb-mock performance-analysis
```
**Output Example:**
```
=== Analysis 1: test-package ===
Performance Score: 85/100

Optimization Suggestions:
  • Consider enabling parallel builds for faster execution
  • Review chroot caching strategy for better performance

Automatic Tuning Recommendations:
  • Low CPU utilization suggests room for more parallel builds
      Current: 2
      Suggested: 3

Manual Optimization Recommendations:
  • Consider using tmpfs for /tmp to improve I/O performance
  • Review and optimize chroot package selection
```
#### Optimization
```bash
# Show optimization recommendations
deb-mock optimize
# Automatically apply optimizations
deb-mock optimize --auto-apply
```
#### Metrics Management
```bash
# Export performance metrics
deb-mock export-metrics
# Export to specific file
deb-mock export-metrics --output-file "performance_data.json"
# Clean up old metrics
deb-mock cleanup-metrics
```
### Programmatic Usage
#### Basic Performance Monitoring
```python
from deb_mock.config import Config
from deb_mock.core import DebMock
# Create configuration with performance monitoring enabled
config = Config(
    enable_performance_monitoring=True,
    performance_auto_optimization=True,
)
# Initialize deb-mock
deb_mock = DebMock(config)
# Build a package (performance monitoring happens automatically)
result = deb_mock.build("source-package")
# Get performance summary
summary = deb_mock.performance_monitor.get_performance_summary()
print(f"Total operations: {summary['total_operations']}")
```
#### Custom Performance Monitoring
```python
# Monitor custom operations
with deb_mock.performance_monitor.monitor_operation("custom_operation") as op_id:
    # Your custom operation here
    result = perform_custom_operation()

# Add the collected metrics to the current build profile
# (`metrics` is the PerformanceMetrics object recorded for the operation above)
if hasattr(deb_mock, 'current_build_profile'):
    deb_mock.performance_monitor.add_phase_metrics(
        deb_mock.current_build_profile, "custom_operation", metrics
    )
```
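Internally, a `monitor_operation`-style context manager can be as simple as timing the wrapped block and recording the result on exit. The `SimpleMonitor` class below is an illustrative sketch; the real `PerformanceMonitor` collects many more metrics (CPU, memory, I/O) around the same pattern.

```python
import time
import uuid
from contextlib import contextmanager

class SimpleMonitor:
    """Minimal sketch of a context-manager-based operation monitor."""

    def __init__(self):
        self.history = []  # list of (operation_name, duration_seconds)

    @contextmanager
    def monitor_operation(self, name):
        # Hand the caller an operation id, time the block, record on exit
        op_id = f"{name}_{uuid.uuid4().hex[:8]}"
        start = time.monotonic()
        try:
            yield op_id
        finally:
            self.history.append((name, time.monotonic() - start))
```

The `finally` clause ensures the duration is recorded even when the wrapped operation raises, which is why failed builds still show up in the operation history.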
#### Benchmarking Custom Functions
```python
def my_custom_function():
    # Your function implementation
    pass

# Benchmark the function
result = deb_mock.performance_monitor.benchmark_operation(
    "my_custom_function", my_custom_function, iterations=5
)

print(f"Average duration: {result['average_duration']:.2f}s")
```
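For reference, a benchmark helper producing the fields shown above (average/min/max duration and variance) can be sketched as follows. This is a simplified stand-in for the real `benchmark_operation`, not its actual implementation.

```python
import time
from statistics import pvariance

def benchmark_operation(name, func, iterations=3):
    """Run func repeatedly and summarize the observed durations."""
    durations = []
    for _ in range(iterations):
        start = time.monotonic()
        func()
        durations.append(time.monotonic() - start)
    return {
        "operation": name,
        "iterations": iterations,
        "average_duration": sum(durations) / iterations,
        "min_duration": min(durations),
        "max_duration": max(durations),
        "variance": pvariance(durations),  # seconds^2, as in the CLI output
    }
```

Multiple iterations smooth over cache warm-up and scheduler noise, which is why the CLI defaults to more than one run.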
## Performance Metrics
### Collected Metrics
#### Operation Metrics
- **Duration** - Time taken for the operation
- **CPU Usage** - CPU utilization during operation
- **Memory Usage** - Memory consumption and changes
- **Disk I/O** - Read/write operations and data transfer
- **Network I/O** - Network data transfer
- **Chroot Size** - Size of chroot environment
- **Cache Hit Rate** - Effectiveness of caching
- **Parallel Efficiency** - Efficiency of parallel operations
- **Resource Utilization** - Overall resource usage
#### System Metrics
- **CPU Percentage** - Overall system CPU usage
- **Memory Percentage** - System memory utilization
- **Disk Usage** - Available disk space
- **Active Operations** - Currently running operations
### Build Profile Structure
```python
@dataclass
class BuildProfile:
    build_id: str                          # Unique build identifier
    package_name: str                      # Name of the package being built
    architecture: str                      # Target architecture
    suite: str                             # Debian suite
    total_duration: float                  # Total build time
    phases: Dict[str, PerformanceMetrics]  # Performance data for each phase
    resource_peak: Dict[str, float]        # Peak resource usage
    cache_performance: Dict[str, float]    # Cache performance metrics
    optimization_suggestions: List[str]    # Generated optimization suggestions
    timestamp: datetime                    # When the build was performed
```
## Optimization System
### Automatic Optimization Rules
#### Parallel Build Optimization
- **Low CPU Usage (< 60%)** - Increase parallel builds
- **High CPU Usage (> 90%)** - Decrease parallel builds
- **Optimal Range** - 70-85% CPU utilization
#### Cache Optimization
- **Low Hit Rate (< 30%)** - Increase cache size
- **Medium Hit Rate (30-70%)** - Review cache strategy
- **High Hit Rate (> 70%)** - Optimal performance
#### Resource Optimization
- **Memory Usage > 2GB** - Enable ccache, review dependencies
- **Disk I/O High** - Use tmpfs for temporary files
- **Network Usage High** - Review mirror configuration
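The threshold rules above can be expressed as small advisory functions. This is a sketch: the function names and exact return values are illustrative, and the real optimizer weighs more signals than these single thresholds.

```python
def suggest_parallel_builds(cpu_percent, current_jobs):
    """Suggest a parallel-build count from observed CPU utilization."""
    if cpu_percent < 60:
        return current_jobs + 1          # headroom: add a job
    if cpu_percent > 90:
        return max(1, current_jobs - 1)  # saturated: back off
    return current_jobs                  # 60-90%: leave as-is

def classify_cache_hit_rate(hit_rate):
    """Map a cache hit rate (0.0-1.0) to the advice from the rules above."""
    if hit_rate < 0.30:
        return "increase cache size"
    if hit_rate <= 0.70:
        return "review cache strategy"
    return "optimal"
```

Keeping the rules as pure functions of the observed metrics makes them trivial to unit-test and to tune without touching the monitoring code.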
### Optimization Suggestions
#### Performance-Based
1. **Duration > 5 minutes** - Enable parallel builds, optimize chroot caching
2. **Duration > 10 minutes** - Review entire build process, consider system upgrades
#### Resource-Based
1. **CPU > 80%** - Limit parallel jobs, optimize build processes
2. **Memory > 2GB** - Enable ccache, review package dependencies
3. **Disk I/O High** - Use tmpfs, optimize chroot structure
#### Cache-Based
1. **Hit Rate < 50%** - Increase cache size, review retention policy
2. **Cache Misses High** - Optimize cache invalidation strategy
## Reporting and Analysis
### Performance Reports
#### Comprehensive Report
- **Performance Summary** - Overall statistics and trends
- **Operation Breakdown** - Detailed analysis of each operation type
- **System Statistics** - Resource utilization patterns
- **Optimization History** - Applied optimizations and their effects
#### Build Profile Report
- **Build Information** - Package details and build parameters
- **Phase Breakdown** - Performance analysis of each build phase
- **Resource Peaks** - Maximum resource usage during build
- **Cache Performance** - Cache effectiveness metrics
- **Optimization Suggestions** - Specific recommendations for improvement
### Data Export
#### JSON Export
```json
{
  "export_timestamp": "2024-08-19T12:00:00",
  "summary": {
    "total_operations": 15,
    "total_duration": 1250.45,
    "average_duration": 83.36
  },
  "build_profiles": {
    "profile_1": {
      "build_id": "build_1234567890",
      "package_name": "test-package",
      "total_duration": 45.23
    }
  },
  "operation_history": [
    {
      "operation": "package_build",
      "duration": 45.23,
      "cpu_percent": 75.5
    }
  ]
}
```
#### Text Reports
- **Human-readable format** - Easy to understand performance summaries
- **Detailed breakdowns** - Phase-by-phase performance analysis
- **Optimization suggestions** - Actionable recommendations
## Best Practices
### Performance Monitoring
1. **Enable by Default** - Always enable performance monitoring in production
2. **Regular Analysis** - Analyze performance data weekly
3. **Trend Tracking** - Monitor performance trends over time
4. **Resource Planning** - Use performance data for capacity planning
### Optimization
1. **Start Conservative** - Begin with conservative optimization settings
2. **Monitor Effects** - Track the impact of optimizations
3. **Gradual Changes** - Apply optimizations incrementally
4. **Test Thoroughly** - Validate optimizations in test environments
### Data Management
1. **Regular Cleanup** - Clean up old metrics monthly
2. **Data Retention** - Keep performance data for at least 30 days
3. **Export Important Data** - Export critical performance data before cleanup
4. **Backup Metrics** - Include performance metrics in system backups
## Troubleshooting
### Common Issues
#### Performance Monitoring Not Working
**Symptoms**: No performance data available, commands return empty results
**Solutions**:
1. Check if `enable_performance_monitoring` is set to `true`
2. Verify `psutil` package is installed
3. Check permissions for metrics directory
4. Restart deb-mock to reinitialize monitoring
#### High Memory Usage
**Symptoms**: Memory usage > 2GB, build failures due to memory
**Solutions**:
1. Enable ccache to reduce compilation memory usage
2. Reduce parallel build count
3. Review and optimize chroot package selection
4. Increase system swap space
#### Slow Build Performance
**Symptoms**: Build duration > 5 minutes, high resource utilization
**Solutions**:
1. Enable parallel builds (if CPU usage allows)
2. Optimize chroot caching strategy
3. Use tmpfs for temporary files
4. Review build dependencies for unnecessary packages
#### Cache Performance Issues
**Symptoms**: Low cache hit rate, frequent cache misses
**Solutions**:
1. Increase cache size
2. Review cache retention policy
3. Optimize cache invalidation strategy
4. Check disk space availability
### Debug Mode
Enable debug mode for detailed performance information:
```bash
export DEB_MOCK_DEBUG=1
deb-mock performance-summary
```
### Getting Help
- **Performance Reports** - Generate detailed reports for analysis
- **Log Files** - Check deb-mock logs for performance-related errors
- **System Monitoring** - Use system tools to verify resource usage
- **Community Support** - Check project issues and discussions
## Future Enhancements
### Planned Features
- **Real-time Dashboard** - Web-based performance monitoring interface
- **Machine Learning Optimization** - AI-driven optimization suggestions
- **Performance Alerts** - Automated alerts for performance issues
- **Integration with Monitoring Systems** - Prometheus, Grafana, etc.
- **Performance Regression Detection** - Automatic detection of performance degradation
### Extension Points
- **Custom Metrics** - User-defined performance metrics
- **Performance Plugins** - Extensible performance monitoring
- **External Integrations** - Third-party monitoring system support
- **Performance Testing Framework** - Automated performance testing
## Integration Examples
### CI/CD Integration
```yaml
# GitHub Actions example
- name: Performance Analysis
  run: |
    deb-mock performance-analysis
    deb-mock performance-report --output-file "performance_report.txt"

- name: Upload Performance Report
  uses: actions/upload-artifact@v4
  with:
    name: performance-report
    path: performance_report.txt
```
### Monitoring System Integration
```python
# Prometheus metrics export
from prometheus_client import Gauge, Histogram
# Create metrics
build_duration = Histogram('deb_mock_build_duration_seconds', 'Build duration in seconds')
cpu_usage = Gauge('deb_mock_cpu_usage_percent', 'CPU usage percentage')
# Export metrics
def export_prometheus_metrics(performance_monitor):
    summary = performance_monitor.get_performance_summary()
    for metrics in performance_monitor._operation_history:
        build_duration.observe(metrics.duration)
        cpu_usage.set(metrics.cpu_percent)
```
### Performance Testing
```python
# Automated performance testing
from deb_mock.config import Config
from deb_mock.core import DebMock

def test_build_performance():
    config = Config(enable_performance_monitoring=True)
    deb_mock = DebMock(config)

    # Run benchmark
    result = deb_mock.performance_monitor.benchmark_operation(
        "test_build", deb_mock.build, iterations=5
    )

    # Assert performance requirements
    assert result['average_duration'] < 60.0, "Build too slow"
    assert result['variance'] < 10.0, "Build performance inconsistent"
    print("✅ Performance requirements met")
```
This comprehensive performance monitoring and optimization system provides the tools needed to maintain optimal build performance, identify bottlenecks, and continuously improve the deb-mock build system.

docs/PLUGIN_SYSTEM.md (new file, 322 lines)

@@ -0,0 +1,322 @@
# Deb-Mock Plugin System
## Overview
The deb-mock plugin system provides a powerful and extensible way to customize build behavior, add new features, and integrate with external tools. It's based on Fedora Mock's proven plugin architecture, adapted specifically for Debian-based build environments.
## Features
- **Hook-based architecture** - Plugins can hook into various stages of the build process
- **Dynamic loading** - Plugins are loaded at runtime based on configuration
- **API versioning** - Ensures compatibility between deb-mock versions and plugins
- **Configuration-driven** - Rich configuration options for each plugin
- **Error handling** - Robust error handling with required vs. optional plugins
- **Base classes** - Helper classes for easier plugin development
## Architecture
### Core Components
1. **PluginManager** - Main plugin orchestration class
2. **HookStages** - Standard hook stages for plugins
3. **BasePlugin** - Base class for plugin development
4. **Plugin Configuration** - YAML-based plugin configuration
### Hook Stages
The plugin system provides hooks at various stages of the build process:
#### Chroot Lifecycle
- `prechroot_init` - Before chroot initialization
- `postchroot_init` - After chroot initialization
- `prechroot_clean` - Before chroot cleanup
- `postchroot_clean` - After chroot cleanup
#### Build Lifecycle
- `prebuild` - Before build starts
- `postbuild` - After build completes
- `build_start` - When build begins
- `build_end` - When build ends
#### Package Management
- `pre_install_deps` - Before installing dependencies
- `post_install_deps` - After installing dependencies
- `pre_install_package` - Before installing packages
- `post_install_package` - After installing packages
#### Mount Management
- `pre_mount` - Before mounting filesystems
- `post_mount` - After mounting filesystems
- `pre_unmount` - Before unmounting filesystems
- `post_unmount` - After unmounting filesystems
#### Cache Management
- `pre_cache_create` - Before creating caches
- `post_cache_create` - After creating caches
- `pre_cache_restore` - Before restoring caches
- `post_cache_restore` - After restoring caches
#### Parallel Build Hooks
- `pre_parallel_build` - Before parallel builds
- `post_parallel_build` - After parallel builds
- `parallel_build_start` - When parallel build starts
- `parallel_build_end` - When parallel build ends
#### Error Handling
- `on_error` - When errors occur
- `on_warning` - When warnings occur
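Dispatch across these stages can be pictured as a mapping from stage name to a list of callables. The sketch below is a simplified illustration of that idea, not the actual `PluginManager` implementation:

```python
from collections import defaultdict

class MiniPluginManager:
    """Simplified stage dispatcher: stage name -> list of hook callables."""

    def __init__(self):
        self._hooks = defaultdict(list)

    def add_hook(self, stage, function):
        """Register a hook for a stage (standard or custom)."""
        self._hooks[stage].append(function)

    def call_hooks(self, stage, *args, **kwargs):
        """Invoke every hook registered for the stage, in registration order."""
        for hook in list(self._hooks[stage]):
            hook(*args, **kwargs)

# Demonstration: two hooks on the build lifecycle stages
calls = []
manager = MiniPluginManager()
manager.add_hook("prebuild", lambda pkg: calls.append(f"prebuild:{pkg}"))
manager.add_hook("postbuild", lambda result, pkg: calls.append(f"postbuild:{pkg}"))
manager.call_hooks("prebuild", "hello-world")
manager.call_hooks("postbuild", {"success": True}, "hello-world")
```

Stages with no registered hooks are simply no-ops, which is why plugins can define custom stage names without any central registration.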
## Configuration
### Basic Plugin Configuration
```yaml
# Enable plugins
plugins: ["ccache_plugin", "build_monitor"]
# Plugin directory (optional)
plugin_dir: "./plugins"
# Plugin-specific configuration
plugin_conf:
# CCache plugin
ccache_enable: true
ccache_required: false
ccache_opts:
dir: "/var/cache/deb-mock/ccache"
max_cache_size: "4G"
show_stats: true
compress: true
hashdir: true
debug: false
# Build monitor plugin
build_monitor_enable: true
build_monitor_required: false
build_monitor_opts:
log_file: "/var/log/deb-mock/builds.log"
notify_on_completion: true
track_build_time: true
```
### Plugin Configuration Options
- **`{plugin}_enable`** - Enable/disable plugin (default: true)
- **`{plugin}_required`** - Make plugin required (default: false)
- **`{plugin}_opts`** - Plugin-specific configuration options
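Resolving these three keys with their documented defaults can be sketched as follows (the `plugin_settings` helper is illustrative, not part of the deb-mock API):

```python
def plugin_settings(plugin_conf, name):
    """Resolve the three per-plugin keys, applying the documented defaults."""
    return {
        "enabled": plugin_conf.get(f"{name}_enable", True),      # default: true
        "required": plugin_conf.get(f"{name}_required", False),  # default: false
        "opts": plugin_conf.get(f"{name}_opts", {}),
    }

plugin_conf = {
    "ccache_enable": True,
    "ccache_opts": {"dir": "/var/cache/deb-mock/ccache", "max_cache_size": "4G"},
}

ccache = plugin_settings(plugin_conf, "ccache")
# An unconfigured plugin falls back to enabled/optional with empty opts:
build_monitor = plugin_settings(plugin_conf, "build_monitor")
```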
## Plugin Development
### Basic Plugin Structure
```python
#!/usr/bin/env python3
"""
Example plugin for deb-mock
"""
requires_api_version = "1.0"
run_in_bootstrap = False
def init(plugin_manager, conf, deb_mock):
"""Plugin entry point"""
ExamplePlugin(plugin_manager, conf, deb_mock)
class ExamplePlugin:
"""Example plugin implementation"""
def __init__(self, plugin_manager, conf, deb_mock):
self.plugin_manager = plugin_manager
self.conf = conf
self.deb_mock = deb_mock
# Register hooks
self._register_hooks()
def _register_hooks(self):
"""Register plugin hooks"""
self.plugin_manager.add_hook("prebuild", self._prebuild_hook)
self.plugin_manager.add_hook("postbuild", self._postbuild_hook)
def _prebuild_hook(self, source_package, **kwargs):
"""Hook called before build starts"""
print(f"Example plugin: Pre-build hook for {source_package}")
def _postbuild_hook(self, build_result, source_package, **kwargs):
"""Hook called after build completes"""
print(f"Example plugin: Post-build hook for {source_package}")
```
### Using BasePlugin Class
```python
from deb_mock.plugin import BasePlugin, HookStages
class MyPlugin(BasePlugin):
"""Plugin using the base class"""
def _register_hooks(self):
"""Override to register hooks"""
self.plugin_manager.add_hook(HookStages.PREBUILD, self._my_hook)
def _my_hook(self, source_package, **kwargs):
"""My custom hook"""
self.log_info(f"Processing {source_package}")
# Plugin logic here
```
### Plugin API Requirements
Every plugin must define:
1. **`requires_api_version`** - API version compatibility
2. **`run_in_bootstrap`** - Whether to run in bootstrap chroots
3. **`init()` function** - Plugin entry point
### Available Hooks
Plugins can register hooks for any of the standard stages defined in `HookStages`, or create custom stages.
## Built-in Plugins
### CCache Plugin
The CCache plugin provides compiler caching for faster rebuilds:
```yaml
plugin_conf:
ccache_enable: true
ccache_opts:
dir: "/var/cache/deb-mock/ccache"
max_cache_size: "4G"
show_stats: true
compress: true
hashdir: true
debug: false
```
**Features:**
- Automatic ccache setup in chroots
- Configurable cache size and options
- Build statistics reporting
- Environment variable management
### Build Monitor Plugin
The Build Monitor plugin tracks build performance and provides notifications:
```yaml
plugin_conf:
build_monitor_enable: true
build_monitor_opts:
log_file: "/var/log/deb-mock/builds.log"
notify_on_completion: true
track_build_time: true
performance_metrics: true
```
**Features:**
- Build time tracking
- Performance metrics collection
- Completion notifications
- Detailed logging
## CLI Commands
### Plugin Management
```bash
# Show plugin information
deb-mock plugin-info
# List available hook stages
deb-mock list-stages
# List hooks for a specific stage
deb-mock list-hooks prebuild
```
### Plugin Configuration
Plugins are configured through the main configuration file or command-line options. The plugin system automatically loads enabled plugins and initializes them with the deb-mock instance.
## Best Practices
### Plugin Development
1. **Use descriptive names** - Choose clear, descriptive plugin names
2. **Handle errors gracefully** - Don't let plugin failures break builds
3. **Use logging** - Use the provided logging methods for debugging
4. **Validate configuration** - Check configuration values and provide defaults
5. **Document hooks** - Clearly document what each hook does
### Configuration
1. **Enable only needed plugins** - Don't enable plugins you don't use
2. **Use required sparingly** - Only mark plugins as required if builds fail without them
3. **Provide defaults** - Always provide sensible default values
4. **Test configurations** - Test plugin configurations before production use
### Performance
1. **Minimize hook overhead** - Keep hooks lightweight
2. **Use async when possible** - Consider async operations for I/O heavy tasks
3. **Cache results** - Cache expensive operations when appropriate
4. **Profile plugins** - Monitor plugin performance impact
## Troubleshooting
### Common Issues
1. **Plugin not loading** - Check plugin directory and file permissions
2. **API version mismatch** - Ensure plugin API version matches deb-mock
3. **Hook not firing** - Verify hook stage names and registration
4. **Configuration errors** - Check YAML syntax and plugin configuration
### Debugging
1. **Enable debug logging** - Use `--debug` flag for verbose output
2. **Check plugin info** - Use `plugin-info` command to verify plugin loading
3. **Verify hooks** - Use `list-hooks` to check hook registration
4. **Test individually** - Test plugins in isolation before integration
## Examples
### Complete Plugin Example
See `examples/plugins/ccache_plugin.py` for a complete working plugin.
### Configuration Example
See `examples/plugin-config.yaml` for a complete plugin-enabled configuration.
## API Reference
### PluginManager Methods
- `init_plugins(deb_mock)` - Initialize all enabled plugins
- `call_hooks(stage, *args, **kwargs)` - Call hooks for a stage
- `add_hook(stage, function)` - Register a hook function
- `remove_hook(stage, function)` - Remove a hook function
- `get_hooks(stage)` - Get hooks for a stage
- `list_stages()` - List available hook stages
- `get_plugin_info()` - Get plugin system information
### BasePlugin Methods
- `get_config(key, default)` - Get plugin configuration
- `set_config(key, value)` - Set plugin configuration
- `log_info(message)` - Log info message
- `log_warning(message)` - Log warning message
- `log_error(message)` - Log error message
- `log_debug(message)` - Log debug message

## Future Enhancements
- **Plugin repositories** - Centralized plugin distribution
- **Plugin dependencies** - Plugin-to-plugin dependencies
- **Plugin validation** - Automated plugin testing and validation
- **Plugin metrics** - Performance and usage metrics
- **Plugin hot-reload** - Runtime plugin updates

docs/SBUILD_INTEGRATION.md Normal file
# Deb-Mock Sbuild Integration
## Overview
The `deb-mock` sbuild integration provides a robust, production-ready interface to Debian's `sbuild` package building system. This integration enables `deb-mock` to build actual Debian packages using the same tooling that Debian developers use in production.
## Features
- **Automatic requirement checking** - Validates sbuild availability, user permissions, and configuration
- **Intelligent error handling** - Provides clear error messages and recovery suggestions
- **Dependency management** - Automatic checking and installation of build dependencies
- **Chroot management** - Update and query chroot information
- **Comprehensive CLI** - Full command-line interface for all sbuild operations
- **Integration with deb-orchestrator** - Seamless integration with the build orchestration system
## Architecture
### Core Components
1. **SbuildWrapper** - Main wrapper class for sbuild operations
2. **Automatic Configuration** - Self-configuring sbuild setup
3. **Error Handling** - Comprehensive error detection and reporting
4. **CLI Integration** - Full command-line interface
### Integration Points
- **deb-mock core** - Integrated into the main build system
- **Plugin system** - Hooks for customizing sbuild behavior
- **deb-orchestrator** - Task execution and result collection
## Requirements
### System Requirements
- **sbuild package** - Debian package building tool
- **schroot** - Chroot management system
- **User permissions** - User must be in the `sbuild` group
### Setup Commands
```bash
# Install sbuild
sudo apt-get install sbuild
# Add user to sbuild group
sudo sbuild-adduser $USER
# Start new shell session or use newgrp
newgrp sbuild
```
## Configuration
### Automatic Configuration
The sbuild integration automatically creates a minimal configuration file at `~/.config/sbuild/config.pl`:
```perl
#!/usr/bin/perl
# deb-mock sbuild configuration
$chroot_mode = "schroot";
$schroot = "schroot";
```
### Manual Configuration
You can override the automatic configuration by creating your own `~/.config/sbuild/config.pl` or `~/.sbuildrc` file.
## Usage
### CLI Commands
#### Chroot Management
```bash
# Show chroot information
deb-mock chroot-info debian-trixie-amd64
# Update chroot packages
deb-mock update-chroot debian-trixie-amd64
```
#### Dependency Management
```bash
# Check build dependencies
deb-mock check-deps /path/to/source-package
# Install build dependencies
deb-mock install-deps package1 package2
```
#### Package Building
```bash
# Build package with sbuild
deb-mock build-with-sbuild /path/to/source-package
# Build with custom options
deb-mock build-with-sbuild /path/to/source-package \
--chroot debian-trixie-amd64 \
--output-dir ./output \
--verbose
```
### Programmatic Usage
```python
from deb_mock.config import Config
from deb_mock.sbuild import SbuildWrapper
# Create configuration
config = Config(
chroot_name="debian-trixie-amd64",
suite="trixie",
architecture="amd64"
)
# Initialize wrapper
wrapper = SbuildWrapper(config)
# Check dependencies
deps = wrapper.check_dependencies("source-package")
if not deps["satisfied"]:
wrapper.install_build_dependencies(deps["missing"])
# Build package
result = wrapper.build_package("source-package")
```
## Error Handling
### Common Issues and Solutions
#### 1. User Not in Sbuild Group
**Error**: `User joe is not currently an effective member of group sbuild`
**Solution**:
```bash
sudo sbuild-adduser $USER
newgrp sbuild # or start new shell session
```
#### 2. Sbuild Not Found
**Error**: `sbuild not found. Please install sbuild package`
**Solution**:
```bash
sudo apt-get install sbuild
```
#### 3. Chroot Not Available
**Error**: `Chroot 'debian-trixie-amd64' not found`
**Solution**:
```bash
# Create chroot using deb-mock
deb-mock init-chroot debian-trixie-amd64
# Or use sbuild directly
sudo sbuild-createchroot --arch=amd64 trixie /var/lib/schroot/chroots/debian-trixie-amd64
```
#### 4. Build Dependencies Missing
**Error**: `Unmet build dependencies`
**Solution**:
```bash
# Check what's missing
deb-mock check-deps /path/to/source-package
# Install missing dependencies
deb-mock install-deps package1 package2
```
### Error Recovery
The integration provides automatic error recovery for common issues:
- **Missing dependencies** - Automatically attempts to install
- **Configuration issues** - Creates minimal working configuration
- **Permission problems** - Provides clear setup instructions
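A recovery loop of this shape can be sketched against the `SbuildWrapper` interface shown earlier (`check_dependencies`, `install_build_dependencies`, `build_package`); the retry policy and the `FakeWrapper` stand-in are illustrative only:

```python
def build_with_recovery(wrapper, source_package, max_attempts=2):
    """Install any missing build dependencies, then build, retrying on failure."""
    deps = wrapper.check_dependencies(source_package)
    if not deps["satisfied"]:
        wrapper.install_build_dependencies(deps["missing"])
    last_error = None
    for _ in range(max_attempts):
        result = wrapper.build_package(source_package)
        if result.get("success"):
            return result
        last_error = result.get("error")
    raise RuntimeError(f"Build failed after {max_attempts} attempts: {last_error}")

class FakeWrapper:
    """Stand-in for SbuildWrapper: first build fails, second succeeds."""
    def __init__(self):
        self.installed = []
        self.attempts = 0
    def check_dependencies(self, source_package):
        return {"satisfied": False, "missing": ["debhelper"]}
    def install_build_dependencies(self, packages):
        self.installed.extend(packages)
    def build_package(self, source_package):
        self.attempts += 1
        if self.attempts == 1:
            return {"success": False, "error": "transient failure"}
        return {"success": True, "changes_file": f"{source_package}.changes"}

wrapper = FakeWrapper()
result = build_with_recovery(wrapper, "test-package")
```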
## Integration with deb-orchestrator
### Task Execution
The sbuild integration works seamlessly with `deb-orchestrator`:
1. **Task Creation** - Build tasks are created in the orchestrator
2. **Dependency Resolution** - Build dependencies are automatically checked
3. **Package Building** - sbuild executes the actual package build
4. **Result Collection** - Build artifacts and metadata are collected
5. **Status Reporting** - Build success/failure is reported back
### Example Workflow
```python
# In deb-orchestrator task
task = {
"id": 123,
"type": "build_package",
"source_package": "test-package",
"chroot": "debian-trixie-amd64",
"architecture": "amd64"
}
# deb-mock executes the build
result = deb_mock.build_with_sbuild(
source_package=task["source_package"],
chroot=task["chroot"],
arch=task["architecture"]
)
# Result is reported back to orchestrator
orchestrator.report_build_result(task["id"], result)
```
## Performance and Optimization
### Caching
- **Root cache** - Chroot state caching for faster rebuilds
- **Package cache** - APT package caching
- **ccache** - Compiler caching for faster compilation
### Parallelization
- **Multiple chroots** - Build multiple packages simultaneously
- **Worker processes** - Parallel task execution
- **Resource management** - Efficient resource utilization
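Running several package builds concurrently can be sketched with a worker pool; `build_fn` below stands in for whatever callable performs one build (for example, a per-chroot sbuild invocation):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def build_all(build_fn, packages, max_workers=4):
    """Run one build per package concurrently and collect results per package."""
    results = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = {pool.submit(build_fn, pkg): pkg for pkg in packages}
        for future in as_completed(futures):
            results[futures[future]] = future.result()
    return results

# Demonstration with a trivial build function
results = build_all(lambda pkg: {"package": pkg, "success": True},
                    ["pkg-a", "pkg-b", "pkg-c"])
```

`max_workers` is where resource limits come in: it should reflect available CPU, memory, and disk rather than the number of pending packages.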
## Security
### Isolation
- **Chroot isolation** - Complete filesystem isolation
- **User separation** - Dedicated build users
- **Network control** - Controlled network access
### Permissions
- **Minimal privileges** - Only necessary permissions granted
- **User mapping** - Proper UID/GID handling
- **Capability dropping** - Security capability management
## Monitoring and Debugging
### Logging
- **Build logs** - Complete sbuild output capture
- **Error logs** - Detailed error information
- **Performance metrics** - Build time and resource usage
### Debugging
```bash
# Enable verbose output
deb-mock build-with-sbuild package --verbose --debug
# Check chroot status
deb-mock chroot-info chroot-name
# Verify dependencies
deb-mock check-deps package
```
## Best Practices
### Configuration
1. **Use dedicated chroots** - Separate chroots for different distributions
2. **Regular updates** - Keep chroots updated with latest packages
3. **Resource limits** - Set appropriate memory and disk limits
### Build Process
1. **Dependency checking** - Always check dependencies before building
2. **Clean builds** - Use clean chroots for reproducible builds
3. **Artifact collection** - Properly collect and store build artifacts
### Error Handling
1. **Graceful degradation** - Handle errors without breaking the build system
2. **User feedback** - Provide clear error messages and solutions
3. **Recovery mechanisms** - Automatic recovery when possible
## Troubleshooting
### Debug Mode
Enable debug mode for detailed information:
```bash
export DEB_MOCK_DEBUG=1
deb-mock build-with-sbuild package --debug
```
### Common Problems
1. **Permission denied** - Check user group membership
2. **Chroot not found** - Verify chroot exists and is accessible
3. **Build failures** - Check build logs for specific errors
4. **Dependency issues** - Verify package availability in chroot
### Getting Help
- **Error messages** - Read error messages carefully for solutions
- **Build logs** - Check build logs for detailed error information
- **Documentation** - Refer to this documentation and sbuild man pages
- **Community** - Check deb-mock project issues and discussions
## Future Enhancements
### Planned Features
- **Multi-architecture support** - Cross-compilation and multi-arch builds
- **Advanced caching** - Intelligent cache management and optimization
- **Build profiling** - Performance analysis and optimization suggestions
- **Integration testing** - Automated testing of build workflows
### Extension Points
- **Plugin system** - Custom build hooks and modifications
- **Custom backends** - Alternative build system support
- **Monitoring integration** - Integration with monitoring and alerting systems
- **CI/CD integration** - Continuous integration and deployment support

docs/TESTING.md Normal file
# Deb-Mock Testing Guide
## Overview
The `deb-mock` project includes a comprehensive test suite that covers all major functionality including core operations, performance monitoring, plugin system, and integration testing. This guide provides detailed information on running tests, understanding test coverage, and contributing to the test suite.
## Test Structure
### Test Organization
```
tests/
├── __init__.py # Test package initialization
├── conftest.py # Pytest configuration and fixtures
├── test_core.py # Core functionality tests
├── test_performance.py # Performance monitoring tests
├── test_plugin_system.py # Plugin system tests
└── requirements.txt # Test dependencies
```
### Test Categories
1. **Unit Tests** - Test individual components in isolation
2. **Integration Tests** - Test component interactions
3. **Performance Tests** - Test performance monitoring system
4. **Plugin Tests** - Test plugin system functionality
5. **System Tests** - Test end-to-end workflows
## Running Tests
### Prerequisites
1. **Python Virtual Environment**: Ensure you have activated the virtual environment
```bash
source venv/bin/activate
```
2. **Test Dependencies**: Install required testing packages
```bash
pip install -r tests/requirements.txt
```
### Basic Test Execution
#### Run All Tests
```bash
python -m pytest tests/
```
#### Run Specific Test File
```bash
python -m pytest tests/test_core.py
```
#### Run Specific Test Class
```bash
python -m pytest tests/test_performance.py::TestPerformanceMonitor
```
#### Run Specific Test Method
```bash
python -m pytest tests/test_performance.py::TestPerformanceMonitor::test_initialization
```
### Using the Test Runner Script
The project includes a comprehensive test runner script that provides additional functionality:
#### Run All Tests with Coverage
```bash
python run_tests.py --all --coverage-report
```
#### Run Specific Test Types
```bash
# Unit tests only
python run_tests.py --unit
# Integration tests only
python run_tests.py --integration
# Performance tests only
python run_tests.py --performance
# Plugin system tests only
python run_tests.py --plugin
```
#### Parallel Test Execution
```bash
python run_tests.py --all --parallel
```
#### Verbose Output
```bash
python run_tests.py --all --verbose
```
#### Additional Quality Checks
```bash
# Run linting
python run_tests.py --lint
# Run type checking
python run_tests.py --type-check
# Run security scanning
python run_tests.py --security
```
### Test Runner Options
| Option | Description |
|--------|-------------|
| `--unit` | Run unit tests only |
| `--integration` | Run integration tests only |
| `--performance` | Run performance tests only |
| `--plugin` | Run plugin system tests only |
| `--all` | Run all tests |
| `--parallel` | Run tests in parallel |
| `--no-coverage` | Disable coverage reporting |
| `--verbose`, `-v` | Verbose output |
| `--install-deps` | Install test dependencies |
| `--lint` | Run code linting |
| `--type-check` | Run type checking |
| `--security` | Run security scanning |
| `--coverage-report` | Generate coverage report |
## Test Configuration
### Pytest Configuration (`pytest.ini`)
```ini
[tool:pytest]
testpaths = tests
python_files = test_*.py
python_classes = Test*
python_functions = test_*
addopts =
-v
--tb=short
--strict-markers
--disable-warnings
--cov=deb_mock
--cov-report=term-missing
--cov-report=html:htmlcov
--cov-report=xml:coverage.xml
--cov-fail-under=80
markers =
slow: marks tests as slow
integration: marks tests as integration tests
unit: marks tests as unit tests
performance: marks tests as performance tests
plugin: marks tests as plugin system tests
```
### Coverage Configuration
- **Minimum Coverage**: 80%
- **Coverage Reports**: Terminal, HTML, XML
- **Coverage Output**: `htmlcov/` directory
## Test Fixtures
### Common Fixtures (`conftest.py`)
The test suite provides comprehensive fixtures for testing:
#### Configuration Fixtures
- `test_config` - Basic test configuration
- `performance_test_config` - Configuration with performance monitoring
- `plugin_test_config` - Configuration with plugin support
#### Mock Fixtures
- `mock_chroot_manager` - Mock chroot manager
- `mock_cache_manager` - Mock cache manager
- `mock_sbuild_wrapper` - Mock sbuild wrapper
- `mock_plugin_manager` - Mock plugin manager
- `mock_performance_monitor` - Mock performance monitor
#### Test Data Fixtures
- `sample_source_package` - Minimal Debian source package
- `test_package_data` - Package metadata for testing
- `test_build_result` - Build result data
- `test_performance_metrics` - Performance metrics data
#### Environment Fixtures
- `temp_dir` - Temporary directory for tests
- `test_environment` - Test environment variables
- `isolated_filesystem` - Isolated filesystem for testing
## Test Categories
### 1. Core Functionality Tests (`test_core.py`)
Tests the main `DebMock` class and its core operations:
- **Initialization** - Component initialization and configuration
- **Build Operations** - Package building with various scenarios
- **Chroot Management** - Chroot creation, restoration, and cleanup
- **Cache Operations** - Cache restoration and creation
- **Plugin Integration** - Hook execution and plugin lifecycle
- **Performance Monitoring** - Performance tracking integration
- **Error Handling** - Build failures and error scenarios
#### Example Test
```python
def test_build_with_existing_chroot(self, mock_deb_mock, sample_source_package,
mock_chroot_manager, mock_cache_manager,
mock_sbuild_wrapper, mock_plugin_manager,
mock_performance_monitor):
"""Test building with an existing chroot"""
# Mock the components
mock_deb_mock.chroot_manager = mock_chroot_manager
mock_deb_mock.cache_manager = mock_cache_manager
mock_deb_mock.sbuild_wrapper = mock_sbuild_wrapper
mock_deb_mock.plugin_manager = mock_plugin_manager
mock_deb_mock.performance_monitor = mock_performance_monitor
# Mock chroot exists
mock_chroot_manager.chroot_exists.return_value = True
# Run build
result = mock_deb_mock.build(sample_source_package)
# Verify result
assert result["success"] is True
```
### 2. Performance Monitoring Tests (`test_performance.py`)
Tests the performance monitoring and optimization system:
- **PerformanceMetrics** - Metrics data structure validation
- **BuildProfile** - Build performance profile management
- **PerformanceMonitor** - Real-time monitoring and metrics collection
- **PerformanceOptimizer** - AI-driven optimization suggestions
- **PerformanceReporter** - Report generation and data export
#### Example Test
```python
def test_monitor_operation_context_manager(self, test_config):
"""Test monitor_operation context manager"""
test_config.enable_performance_monitoring = True
monitor = PerformanceMonitor(test_config)
with monitor.monitor_operation("test_op") as op_id:
assert op_id.startswith("test_op_")
time.sleep(0.1) # Small delay
# Verify operation was tracked
assert len(monitor._operation_history) == 1
assert monitor._operation_history[0].operation == "test_op"
assert monitor._operation_history[0].duration > 0
```
### 3. Plugin System Tests (`test_plugin_system.py`)
Tests the extensible plugin system:
- **HookStages** - Hook stage definitions and values
- **BasePlugin** - Base plugin class functionality
- **PluginManager** - Plugin discovery, loading, and management
- **Plugin Lifecycle** - Initialization, execution, and cleanup
- **Hook System** - Hook registration and execution
- **Error Handling** - Plugin error scenarios
#### Example Test
```python
def test_plugin_lifecycle(self, test_config):
"""Test complete plugin lifecycle"""
manager = PluginManager(test_config)
# Create a test plugin
class TestPlugin(BasePlugin):
def __init__(self):
super().__init__(
name="TestPlugin",
version="1.0.0",
description="Test plugin for integration testing"
)
self.init_called = False
self.cleanup_called = False
def init(self, deb_mock):
self.init_called = True
return None
def cleanup(self):
self.cleanup_called = True
return None
# Test plugin lifecycle
plugin = TestPlugin()
manager.plugins["test_plugin"] = plugin
# Initialize
mock_deb_mock = Mock()
result = manager.init_plugins(mock_deb_mock)
assert result is True
assert plugin.init_called is True
# Cleanup
cleanup_result = manager.cleanup_plugins()
assert cleanup_result is True
assert plugin.cleanup_called is True
```
## Test Markers
### Available Markers
- **`@pytest.mark.slow`** - Marks tests as slow (can be deselected)
- **`@pytest.mark.integration`** - Marks tests as integration tests
- **`@pytest.mark.unit`** - Marks tests as unit tests
- **`@pytest.mark.performance`** - Marks tests as performance tests
- **`@pytest.mark.plugin`** - Marks tests as plugin system tests
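In test code, markers are applied as decorators and can be stacked; a minimal example:

```python
import pytest

@pytest.mark.slow
@pytest.mark.integration
def test_full_chroot_rebuild():
    """Deselected by -m "not slow"; selected by -m integration."""
    pass
```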
### Using Markers
#### Run Only Fast Tests
```bash
python -m pytest -m "not slow"
```
#### Run Only Integration Tests
```bash
python -m pytest -m integration
```
#### Run Multiple Marker Types
```bash
python -m pytest -m "unit or performance"
```
## Coverage Reporting
### Coverage Types
1. **Terminal Coverage** - Inline coverage information
2. **HTML Coverage** - Detailed HTML report in `htmlcov/` directory
3. **XML Coverage** - Machine-readable coverage data
### Coverage Thresholds
- **Minimum Coverage**: 80%
- **Coverage Failure**: Tests fail if coverage drops below threshold
### Generating Coverage Reports
```bash
# Generate all coverage reports
python run_tests.py --coverage-report
# Generate specific coverage report
python -m coverage report
python -m coverage html
```
## Test Data Management
### Temporary Files
Tests use temporary directories that are automatically cleaned up:
```python
@pytest.fixture
def temp_dir():
"""Create a temporary directory for tests"""
temp_dir = tempfile.mkdtemp(prefix="deb_mock_test_")
yield temp_dir
shutil.rmtree(temp_dir, ignore_errors=True)
```
### Mock Data
Tests use realistic mock data for comprehensive testing:
```python
@pytest.fixture
def sample_source_package(temp_dir):
"""Create a minimal Debian source package for testing"""
package_dir = os.path.join(temp_dir, "test-package")
os.makedirs(package_dir)
# Create debian/control
debian_dir = os.path.join(package_dir, "debian")
os.makedirs(debian_dir)
# Add package files...
return package_dir
```
## Debugging Tests
### Verbose Output
```bash
python -m pytest -v -s tests/
```
### Debugging Specific Tests
```bash
# Run with debugger
python -m pytest --pdb tests/test_core.py::TestDebMock::test_build
# Run with trace
python -m pytest --trace tests/test_core.py::TestDebMock::test_build
```
### Test Isolation
```bash
# Run single test in isolation
python -m pytest -x tests/test_core.py::TestDebMock::test_build
# Stop on first failure
python -m pytest -x tests/
```
## Continuous Integration
### CI/CD Integration
The test suite is designed for CI/CD environments:
```yaml
# GitHub Actions example
- name: Run Tests
run: |
source venv/bin/activate
python run_tests.py --all --coverage-report --parallel
- name: Upload Coverage
uses: codecov/codecov-action@v3
with:
file: ./coverage.xml
```
### Test Parallelization
Tests can be run in parallel for faster execution:
```bash
# Auto-detect CPU cores
python -m pytest -n auto tests/
# Specific number of workers
python -m pytest -n 4 tests/
```
## Best Practices
### Writing Tests
1. **Test Naming** - Use descriptive test names that explain the scenario
2. **Test Isolation** - Each test should be independent and not affect others
3. **Mock External Dependencies** - Use mocks for system calls and external services
4. **Test Data** - Use realistic test data that represents real scenarios
5. **Error Scenarios** - Test both success and failure cases
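Point 3 above can be illustrated with `unittest.mock`: patch the system call so the test needs no schroot installation. The `chroot_exists` helper here is hypothetical, not part of the deb-mock API:

```python
import subprocess
from unittest import mock

def chroot_exists(name):
    """Hypothetical helper that shells out to schroot to list chroots."""
    result = subprocess.run(["schroot", "-l"], capture_output=True, text=True)
    return name in result.stdout

# In a test, patch subprocess.run instead of requiring schroot:
with mock.patch("subprocess.run") as fake_run:
    fake_run.return_value = mock.Mock(stdout="chroot:debian-trixie-amd64\n")
    found = chroot_exists("debian-trixie-amd64")
    fake_run.assert_called_once()
```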
### Test Organization
1. **Group Related Tests** - Use test classes to group related functionality
2. **Use Fixtures** - Leverage pytest fixtures for common setup
3. **Test Categories** - Use markers to categorize tests
4. **Coverage** - Aim for high test coverage (80% minimum)
### Performance Testing
1. **Realistic Scenarios** - Test with realistic data sizes and complexity
2. **Benchmarking** - Use the performance monitoring system for benchmarks
3. **Resource Monitoring** - Monitor CPU, memory, and I/O during tests
4. **Regression Detection** - Detect performance regressions
## Troubleshooting
### Common Issues
#### Import Errors
```bash
# Ensure virtual environment is activated
source venv/bin/activate
# Install test dependencies
pip install -r tests/requirements.txt
```
#### Coverage Issues
```bash
# Clear coverage data
python -m coverage erase
# Run tests with coverage
python -m pytest --cov=deb_mock tests/
```
#### Test Failures
```bash
# Run with verbose output
python -m pytest -v -s tests/
# Run specific failing test
python -m pytest tests/test_core.py::TestDebMock::test_build -v -s
```
### Getting Help
1. **Check Test Output** - Review test output for error details
2. **Review Fixtures** - Ensure test fixtures are properly configured
3. **Check Dependencies** - Verify all test dependencies are installed
4. **Review Configuration** - Check pytest.ini and test configuration
## Contributing to Tests
### Adding New Tests
1. **Follow Naming Convention** - Use `test_*.py` for test files
2. **Use Existing Fixtures** - Leverage existing fixtures when possible
3. **Add Markers** - Use appropriate test markers
4. **Maintain Coverage** - Ensure new code is covered by tests
### Test Review Process
1. **Test Coverage** - Ensure new functionality has adequate test coverage
2. **Test Quality** - Tests should be clear, maintainable, and reliable
3. **Performance Impact** - Tests should not significantly impact build times
4. **Documentation** - Document complex test scenarios and edge cases
This comprehensive testing guide ensures that the `deb-mock` project maintains high quality and reliability through extensive testing coverage.