""" EU-Utility Test Suite - Summary
This comprehensive test suite provides full coverage for EU-Utility v2.0.
## Test Structure

```
tests/
├── __init__.py                  # Test package initialization
├── conftest.py                  # Shared fixtures and configuration
├── run_tests.py                 # Test runner script
├── unit/                        # Unit tests
│   ├── test_plugin_manager.py
│   ├── test_window_manager.py
│   ├── test_api_integration.py
│   └── test_core_services.py
├── integration/                 # Integration tests
│   └── test_plugin_workflows.py
├── ui/                          # UI automation tests
│   └── test_ui_automation.py
└── performance/                 # Performance benchmarks
    └── test_benchmarks.py
```
## Test Coverage

### Unit Tests (30+ tests)

- **Plugin Manager Tests**
  - Initialization and configuration
  - Plugin discovery and loading
  - Enable/disable functionality
  - Settings persistence
  - Dependency management
- **Window Manager Tests**
  - Singleton pattern
  - Window detection
  - Focus tracking
  - Multi-monitor support
  - Activity bar functionality
- **API Integration Tests**
  - Plugin API singleton
  - Service registration
  - Log reading
  - Window operations
  - OCR functionality
  - Screenshot capture
  - Nexus API
  - HTTP client
  - Audio/Notifications
  - Clipboard operations
  - Event bus
  - Data store
  - Background tasks
- **Core Services Tests**
  - Event bus (subscribe/unsubscribe/publish)
  - Data store (CRUD operations, persistence)
  - Settings (get/set, persistence)
  - Logger
  - Hotkey manager
  - Theme manager
  - Performance optimizations
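As a concrete illustration of the core-services style above, here is a minimal, self-contained sketch of an event-bus unit test. The `EventBus` below is a stand-in written for this example (the real class lives in EU-Utility's core services); only the subscribe/publish/unsubscribe surface listed above is assumed.

```python
# Stand-in event bus for illustration; the real implementation is in core services.
class EventBus:
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def unsubscribe(self, topic, handler):
        self._subscribers.get(topic, []).remove(handler)

    def publish(self, topic, payload=None):
        # Copy the handler list so handlers may (un)subscribe during delivery.
        for handler in list(self._subscribers.get(topic, [])):
            handler(payload)


def test_event_bus_publish_delivers_to_subscriber():
    """Published events reach every handler subscribed to the topic."""
    bus = EventBus()
    received = []
    bus.subscribe("plugin.loaded", received.append)
    bus.publish("plugin.loaded", {"name": "clock"})
    assert received == [{"name": "clock"}]


def test_event_bus_unsubscribe_stops_delivery():
    """After unsubscribing, the handler is no longer called."""
    bus = EventBus()
    received = []
    bus.subscribe("plugin.loaded", received.append)
    bus.unsubscribe("plugin.loaded", received.append)
    bus.publish("plugin.loaded", {"name": "clock"})
    assert received == []
```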
### Integration Tests (20+ tests)

- **Plugin Lifecycle Tests**
  - Full plugin lifecycle
  - Enable/disable workflow
  - Settings persistence across sessions
- **API Workflow Tests**
  - Log reading and parsing
  - Window detection and overlay positioning
  - OCR and notification workflow
  - Nexus search and data storage
  - Event subscription and publishing
- **UI Integration Tests**
  - Overlay show/hide workflow
  - Plugin switching
  - Dashboard widget workflow
- **Settings Workflow Tests**
  - Save/load workflow
  - Plugin settings isolation
- **Error Handling Tests**
  - Plugin load error handling
  - API service unavailable handling
  - Graceful degradation
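To show the error-handling pattern these tests target, here is a hedged sketch: `ServiceRegistry` is a stand-in, not EU-Utility's real class, and returning `None` for a missing service is one plausible graceful-degradation contract rather than a confirmed API.

```python
# Stand-in registry; the name and behavior are assumptions for this example.
class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, name, service):
        self._services[name] = service

    def get(self, name):
        # Graceful degradation: report absence with None instead of raising,
        # so a caller can disable a feature when a service is unavailable.
        return self._services.get(name)


def test_registry_get_missing_service_returns_none():
    registry = ServiceRegistry()
    assert registry.get("ocr") is None


def test_registry_get_registered_service_returns_it():
    registry = ServiceRegistry()
    sentinel = object()
    registry.register("ocr", sentinel)
    assert registry.get("ocr") is sentinel
```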
### UI Automation Tests (25+ tests)

- **Dashboard UI Tests**
  - Dashboard opens correctly
  - Widget interaction
  - Navigation tabs
- **Overlay Window Tests**
  - Window opens correctly
  - Toggle visibility
  - Plugin navigation
- **Activity Bar Tests**
  - Opens correctly
  - Search functionality
  - Auto-hide behavior
- **Settings Dialog Tests**
  - Dialog opens
  - Save functionality
- **Responsive UI Tests**
  - Window resize handling
  - Minimum size enforcement
  - Sidebar responsiveness
- **Theme UI Tests**
  - Theme toggle
  - Stylesheet application
- **Accessibility Tests**
  - Accessibility names
  - Keyboard navigation
- **Tray Icon Tests**
  - Icon exists
  - Context menu
### Performance Benchmarks (15+ tests)

- **Plugin Manager Performance**
  - Plugin discovery speed
  - Plugin load speed
- **API Performance**
  - Log reading
  - Nexus search
  - Data store operations
- **UI Performance**
  - Overlay creation
  - Dashboard render
  - Plugin switching
- **Memory Performance**
  - Plugin loading memory
  - Data storage memory
- **Startup Performance**
  - Application startup
  - Component initialization
- **Cache Performance**
  - HTTP caching
  - Data store caching
- **Concurrent Performance**
  - Event publishing
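A benchmark in this style can be as simple as timing an operation against a generous budget. The sketch below uses a plain `dict` as a stand-in for the real data store, and the 1-second threshold is an arbitrary example; real numbers vary by hardware.

```python
import time


def time_bulk_writes(n=10_000):
    """Time n writes into a dict standing in for the real data store."""
    store = {}
    start = time.perf_counter()
    for i in range(n):
        store[f"key{i}"] = i
    return time.perf_counter() - start


def test_data_store_bulk_write_under_budget():
    # Deliberately generous budget so the test stays deterministic across
    # machines; tighten it once a baseline has been measured on CI hardware.
    assert time_bulk_writes() < 1.0
```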
## Running Tests

### Run All Tests

```bash
python run_tests.py --all
```

### Run Specific Categories

```bash
# Unit tests only
python run_tests.py --unit

# Integration tests
python run_tests.py --integration

# UI tests
python run_tests.py --ui

# Performance benchmarks
python run_tests.py --performance
```

### With Coverage

```bash
python run_tests.py --all --coverage --html
```

### Using pytest Directly

```bash
# All tests
python -m pytest tests/ -v

# With coverage
python -m pytest tests/ --cov=core --cov=plugins --cov-report=html

# Specific test file
python -m pytest tests/unit/test_plugin_manager.py -v

# Specific test
python -m pytest tests/unit/test_plugin_manager.py::TestPluginManager::test_plugin_manager_initialization -v

# By marker
python -m pytest tests/ -m "not slow"    # Skip slow tests
python -m pytest tests/ -m integration
python -m pytest tests/ -m ui
```
## Test Markers

- `slow`: Tests that take longer to run
- `integration`: Integration tests
- `ui`: UI automation tests
- `windows_only`: Windows-specific tests
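For `-m` filtering to work without `PytestUnknownMarkWarning`, these markers must be registered with pytest. A sketch of the registration (the file name `pytest.ini` is an assumption here; the same `markers` key also works under `[tool.pytest.ini_options]` in pyproject.toml):

```ini
; pytest.ini (location assumed for illustration)
[pytest]
markers =
    slow: Tests that take longer to run
    integration: Integration tests
    ui: UI automation tests
    windows_only: Windows-specific tests
```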
## Fixtures

### Available Fixtures

- `temp_dir`: Temporary directory for test files
- `mock_overlay`: Mock overlay window
- `mock_plugin_manager`: Mock plugin manager
- `mock_qt_app`: Mock Qt application
- `sample_config`: Sample configuration
- `mock_nexus_response`: Sample Nexus API response
- `mock_window_info`: Mock window information
- `mock_ocr_result`: Sample OCR result
- `sample_log_lines`: Sample game log lines
- `event_bus`: Fresh event bus instance
- `data_store`: Temporary data store
- `mock_http_client`: Mock HTTP client
- `test_logger`: Test logger
## CI/CD Integration

### GitHub Actions Example

```yaml
name: Tests

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.11', '3.12']
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          pip install -r requirements.txt
          pip install -r requirements-dev.txt
      - name: Run tests
        run: python run_tests.py --unit --coverage --xml
      - name: Upload coverage
        uses: codecov/codecov-action@v3
```
## Test Maintenance

### Adding New Tests

- Create the test file in the appropriate directory
- Use descriptive test names
- Add docstrings explaining what is tested
- Use fixtures from conftest.py
- Add markers if appropriate
- Run the tests to verify

### Test Naming Convention

`test_<component>_<scenario>_<expected_result>`

Example: `test_plugin_manager_enable_plugin_success`
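Applied to that example name, a test might look like the sketch below. `FakePluginManager` is a stand-in defined here so the snippet is self-contained; it is not EU-Utility's real plugin manager.

```python
class FakePluginManager:
    """Stand-in with just enough surface for this example."""

    def __init__(self):
        self.enabled = set()

    def enable_plugin(self, name):
        self.enabled.add(name)
        return True


def test_plugin_manager_enable_plugin_success():
    """Component: plugin manager. Scenario: enable a plugin. Expected: success."""
    manager = FakePluginManager()
    assert manager.enable_plugin("clock_plugin") is True
    assert "clock_plugin" in manager.enabled
```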
## Best Practices

- **Isolation**: Each test should be independent
- **Determinism**: Tests should produce the same results every time
- **Speed**: Keep tests fast (use mocks)
- **Clarity**: Tests should be easy to understand
- **Coverage**: Aim for high code coverage
## Coverage Goals

| Component | Target Coverage |
|---|---|
| Core Services | 90%+ |
| Plugin Manager | 85%+ |
| API Layer | 80%+ |
| UI Components | 70%+ |
| Overall | 80%+ |
## Known Limitations

- UI tests require a display (use Xvfb on headless systems)
- Some tests are Windows-only (window manager)
- OCR tests require an OCR backend to be installed
- Performance benchmarks may vary by hardware
## Troubleshooting Tests

### Tests Fail to Import

```bash
# Ensure you're in the project root
cd /path/to/EU-Utility
python -m pytest tests/ -v
```

### Qt Display Issues (Linux)

```bash
# Install Xvfb
sudo apt install xvfb

# Run with a virtual display
xvfb-run python -m pytest tests/ui/ -v
```

### Permission Errors

```bash
chmod -R u+rw tests/
```
For more information, see: