Initial commit: Complete Fixture Manager daemon system

- Comprehensive Python daemon system for Linux servers
- Secure web dashboard with authentication and authorization
- RESTful API with JWT authentication
- MySQL database connectivity with connection pooling
- Advanced file upload system with real-time progress tracking
- Intelligent CSV/XLSX fixture parsing algorithms
- Two-stage upload workflow (fixture files + ZIP files)
- Full Linux daemon process management with systemd integration
- Complete security implementation with rate limiting and validation
- SHA1 checksum calculation and verification
- Automated installation and deployment scripts
- Comprehensive documentation and configuration management
# Database Configuration
MYSQL_HOST=localhost
MYSQL_PORT=3306
MYSQL_USER=fixture_user
MYSQL_PASSWORD=secure_password_here
MYSQL_DATABASE=fixture_manager
# Security Configuration
SECRET_KEY=your-secret-key-here-change-in-production
JWT_SECRET_KEY=your-jwt-secret-key-here
BCRYPT_LOG_ROUNDS=12
# File Upload Configuration
UPLOAD_FOLDER=/var/lib/fixture-daemon/uploads
MAX_CONTENT_LENGTH=524288000
CHUNK_SIZE=8192
MAX_CONCURRENT_UPLOADS=5
# Daemon Configuration
DAEMON_PID_FILE=/var/run/fixture-daemon.pid
DAEMON_LOG_FILE=/var/log/fixture-daemon.log
DAEMON_WORKING_DIR=/var/lib/fixture-daemon
# Web Server Configuration
HOST=0.0.0.0
PORT=5000
DEBUG=false
# Logging Configuration
LOG_LEVEL=INFO
# JWT Configuration
JWT_ACCESS_TOKEN_EXPIRES=3600
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
.python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# IDE
.vscode/
.idea/
*.swp
*.swo
*~
# OS
.DS_Store
.DS_Store?
._*
.Spotlight-V100
.Trashes
ehthumbs.db
Thumbs.db
# Project specific
uploads/
logs/
backups/
*.pid
*.sock
# Fixture Manager - Comprehensive Python Daemon System
A Python daemon system for internet-facing Linux servers that provides a secure web dashboard and a RESTful API with robust authentication. The system offers advanced file uploads with real-time progress tracking and a comprehensive fixture management workflow.
## Features
### Core Functionality
- **Secure Web Dashboard**: Modern web interface with authentication and authorization
- **RESTful API**: Comprehensive API with JWT authentication
- **MySQL Database Integration**: Robust database connectivity with connection pooling
- **Advanced File Upload System**: Real-time progress tracking with SHA1 checksum verification
- **Dual-Format Support**: Intelligent parsing of CSV/XLSX fixture files
- **Two-Stage Upload Workflow**: Fixture files followed by mandatory ZIP uploads
- **Daemon Process Management**: Full Linux daemon with systemd integration
### Security Features
- **Multi-layer Authentication**: Session-based and JWT token authentication
- **Rate Limiting**: Protection against brute-force attacks (a sketch follows this list)
- **File Validation**: Comprehensive security checks and malicious content detection
- **SQL Injection Protection**: Parameterized queries and ORM usage
- **CSRF Protection**: Cross-site request forgery prevention
- **Security Headers**: Comprehensive HTTP security headers
- **Input Sanitization**: All user inputs are validated and sanitized
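The rate limiting and input validation above are implemented in `app/utils/security.py`, which is not reproduced in this README. As a rough illustration of the idea only, a fixed-window, in-memory limiter with the same call signature as the project's `rate_limit_check` could look like this:

```python
# Illustrative sketch only -- the shipped implementation lives in app/utils/security.py.
import time
from collections import defaultdict

_attempts = defaultdict(list)  # (ip, action) -> timestamps of recent attempts

def rate_limit_check(ip, action, max_attempts=5, window_minutes=15):
    """Return True while the caller is still under the limit for this action."""
    now = time.time()
    window = window_minutes * 60
    key = (ip, action)
    # Keep only attempts inside the current window
    _attempts[key] = [t for t in _attempts[key] if now - t < window]
    if len(_attempts[key]) >= max_attempts:
        return False
    _attempts[key].append(now)
    return True
```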
### Database Schema
- **Normalized Design**: Optimized relational database structure
- **Primary Matches Table**: Core fixture data with system fields
- **Secondary Outcomes Table**: Dynamic result columns with foreign key relationships
- **File Upload Tracking**: Complete upload lifecycle management
- **System Logging**: Comprehensive audit trail
- **Session Management**: Secure user session handling
## Installation
### Prerequisites
- Linux server (Ubuntu 18.04+, CentOS 7+, or similar)
- Python 3.8+
- MySQL 5.7+ or MariaDB 10.3+
- Root or sudo access
### Quick Installation
```bash
# Clone the repository
git clone <repository-url>
cd fixture-manager
# Make installation script executable
chmod +x install.sh
# Run installation (as root)
sudo ./install.sh
```
### Manual Installation
1. **Install System Dependencies**:
```bash
# Ubuntu/Debian
apt-get update
apt-get install python3 python3-pip python3-venv mysql-server nginx supervisor
# CentOS/RHEL
yum install python3 python3-pip mysql-server nginx supervisor
```
2. **Create System User**:
```bash
useradd --system --home-dir /var/lib/fixture-daemon fixture
```
3. **Install Python Dependencies**:
```bash
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```
4. **Configure Database**:
```bash
mysql -u root -p < database/schema.sql
```
5. **Configure Environment**:
```bash
cp .env.example .env
# Edit .env with your configuration
```
## Configuration
### Environment Variables
The system uses environment variables for configuration. Key settings include:
```bash
# Database Configuration
MYSQL_HOST=localhost
MYSQL_PORT=3306
MYSQL_USER=fixture_user
MYSQL_PASSWORD=secure_password
MYSQL_DATABASE=fixture_manager
# Security Configuration
SECRET_KEY=your-secret-key-here
JWT_SECRET_KEY=your-jwt-secret-key
BCRYPT_LOG_ROUNDS=12
# File Upload Configuration
UPLOAD_FOLDER=/var/lib/fixture-daemon/uploads
MAX_CONTENT_LENGTH=524288000 # 500MB
MAX_CONCURRENT_UPLOADS=5
# Server Configuration
HOST=0.0.0.0
PORT=5000
DEBUG=false
```
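The `config` module imported by `app/__init__.py` maps these variables onto Flask settings; it is not reproduced here. A minimal sketch of such a class, assuming `python-dotenv` is used to load the `.env` file:

```python
# Sketch of an environment-driven config class; the project's actual config.py
# is not shown in this README, so the class below is illustrative only.
import os
from dotenv import load_dotenv  # assumption: python-dotenv is installed

load_dotenv()  # copy variables from .env into os.environ

class Config:
    SECRET_KEY = os.environ.get('SECRET_KEY', 'change-me')
    SQLALCHEMY_DATABASE_URI = (
        f"mysql+pymysql://{os.environ.get('MYSQL_USER')}:"
        f"{os.environ.get('MYSQL_PASSWORD')}@"
        f"{os.environ.get('MYSQL_HOST', 'localhost')}:"
        f"{os.environ.get('MYSQL_PORT', '3306')}/"
        f"{os.environ.get('MYSQL_DATABASE')}"
    )
    MAX_CONTENT_LENGTH = int(os.environ.get('MAX_CONTENT_LENGTH', 524288000))  # 500 MB
    JWT_ACCESS_TOKEN_EXPIRES = int(os.environ.get('JWT_ACCESS_TOKEN_EXPIRES', 3600))
```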
### Database Schema
The system automatically creates the following tables:
- `users` - User authentication and management
- `matches` - Core fixture data with system fields
- `match_outcomes` - Dynamic outcome results
- `file_uploads` - Upload tracking and progress
- `system_logs` - Comprehensive logging
- `user_sessions` - Session management
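A simplified sketch of how the `matches`/`match_outcomes` relationship fits together as SQLAlchemy models. The `matches` column names are taken from the API routes; the outcome column names (`outcome_name`, `outcome_value`) are illustrative, and the full definitions live in `app/models.py`:

```python
# Simplified sketch; see app/models.py for the complete models.
from app import db

class Match(db.Model):
    __tablename__ = 'matches'
    id = db.Column(db.Integer, primary_key=True)
    match_number = db.Column(db.Integer, unique=True, nullable=False)
    fighter1_township = db.Column(db.String(255), nullable=False)
    fighter2_township = db.Column(db.String(255), nullable=False)
    venue_kampala_township = db.Column(db.String(255), nullable=False)
    active_status = db.Column(db.Boolean, default=False)
    zip_upload_status = db.Column(db.String(20), default='pending')
    created_by = db.Column(db.Integer, db.ForeignKey('users.id'))
    outcomes = db.relationship('MatchOutcome', backref='match',
                               cascade='all, delete-orphan')

class MatchOutcome(db.Model):
    __tablename__ = 'match_outcomes'
    id = db.Column(db.Integer, primary_key=True)
    match_id = db.Column(db.Integer, db.ForeignKey('matches.id'), nullable=False)
    outcome_name = db.Column(db.String(100), nullable=False)   # e.g. "Score1"
    outcome_value = db.Column(db.Numeric(10, 2))                # 2-decimal precision
```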
## Usage
### Daemon Management
```bash
# Start the daemon
sudo systemctl start fixture-daemon
# Stop the daemon
sudo systemctl stop fixture-daemon
# Restart the daemon
sudo systemctl restart fixture-daemon
# Check status
sudo systemctl status fixture-daemon
# View logs
journalctl -u fixture-daemon -f
```
### Direct Daemon Control
```bash
# Start in foreground (for debugging)
python daemon.py start --foreground
# Start as daemon
python daemon.py start
# Stop daemon
python daemon.py stop
# Restart daemon
python daemon.py restart
# Check status
python daemon.py status
# Reload configuration
python daemon.py reload
```
### Web Interface
Access the web dashboard at `http://your-server-ip/`
**Default Credentials**:
- Username: `admin`
- Password: `admin123`
**⚠️ Important**: Change the default password immediately after installation!
### API Usage
#### Authentication
```bash
# Login and get JWT token
curl -X POST http://your-server/auth/api/login \
-H "Content-Type: application/json" \
-d '{"username": "admin", "password": "admin123"}'
```
#### Upload Fixture File
```bash
# Upload CSV/XLSX fixture file
curl -X POST http://your-server/upload/api/fixture \
-H "Authorization: Bearer YOUR_JWT_TOKEN" \
-F "file=@fixtures.csv"
```
#### Upload ZIP File
```bash
# Upload ZIP file for specific match
curl -X POST http://your-server/upload/api/zip/123 \
-H "Authorization: Bearer YOUR_JWT_TOKEN" \
-F "file=@match_data.zip"
```
#### Get Matches
```bash
# Get all matches with pagination
curl -X GET "http://your-server/api/matches?page=1&per_page=20" \
-H "Authorization: Bearer YOUR_JWT_TOKEN"
```
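The same workflow from Python with the `requests` library (`your-server`, the file names, and match id `123` are placeholders):

```python
# End-to-end API sketch: login, upload a fixture file, upload its ZIP, list matches.
import requests

BASE = 'http://your-server'

# 1. Authenticate and obtain a JWT
resp = requests.post(f'{BASE}/auth/api/login',
                     json={'username': 'admin', 'password': 'admin123'})
token = resp.json()['access_token']
headers = {'Authorization': f'Bearer {token}'}

# 2. Upload the fixture file (CSV or XLSX)
with open('fixtures.csv', 'rb') as f:
    print(requests.post(f'{BASE}/upload/api/fixture',
                        headers=headers, files={'file': f}).json())

# 3. Upload the ZIP for a specific match
with open('match_data.zip', 'rb') as f:
    requests.post(f'{BASE}/upload/api/zip/123', headers=headers, files={'file': f})

# 4. List matches with pagination
matches = requests.get(f'{BASE}/api/matches', headers=headers,
                       params={'page': 1, 'per_page': 20}).json()
print(matches['pagination']['total'], 'matches')
```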
## File Format Requirements
### Fixture Files (CSV/XLSX)
**Required Columns**:
- `Match #` (integer) - Unique match identifier
- `Fighter1 (Township)` (VARCHAR(255)) - First fighter details
- `Fighter2 (Township)` (VARCHAR(255)) - Second fighter details
- `Venue (Kampala Township)` (VARCHAR(255)) - Match venue
**Optional Columns**:
- Any additional numeric columns are automatically detected as outcome results (see the parsing sketch after the example below)
- Outcome values must be numeric (floats stored with 2-decimal precision)
**Example CSV**:
```csv
Match #,Fighter1 (Township),Fighter2 (Township),Venue (Kampala Township),Score1,Score2,Duration
1,John Doe (Central),Jane Smith (North),Stadium A (Kampala),85.5,92.3,12.5
2,Mike Johnson (East),Sarah Wilson (West),Arena B (Kampala),78.2,81.7,15.2
```
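The shipped parser lives in `app/upload/fixture_parser.py` and is not shown in this README; as a sketch of the numeric-column auto-detection described above, a pandas-based version could look like:

```python
# Illustrative only -- the real parser is more robust (validation, error reporting).
import pandas as pd

REQUIRED = ['Match #', 'Fighter1 (Township)', 'Fighter2 (Township)',
            'Venue (Kampala Township)']

def parse_fixture(path):
    # pandas handles both CSV and XLSX, covering the dual-format requirement
    df = pd.read_csv(path) if path.endswith('.csv') else pd.read_excel(path)
    missing = [col for col in REQUIRED if col not in df.columns]
    if missing:
        raise ValueError(f'Missing required columns: {missing}')
    # Every non-required numeric column becomes an outcome result
    outcome_cols = [c for c in df.columns
                    if c not in REQUIRED and pd.api.types.is_numeric_dtype(df[c])]
    for _, row in df.iterrows():
        yield {
            'match_number': int(row['Match #']),
            'fighter1': row['Fighter1 (Township)'],
            'fighter2': row['Fighter2 (Township)'],
            'venue': row['Venue (Kampala Township)'],
            'outcomes': {c: round(float(row[c]), 2) for c in outcome_cols},
        }
```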
### ZIP Files
- Must be uploaded after fixture file processing
- Associated with specific match records
- Triggers match activation upon successful upload
- SHA1 checksum verification for integrity
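The SHA1 check can be computed incrementally so large ZIP files never need to be read into memory at once; a minimal sketch using the configured chunk size (8192 bytes by default):

```python
# Chunked SHA1 computation and verification (standard library only).
import hashlib

def sha1_checksum(path, chunk_size=8192):
    sha1 = hashlib.sha1()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            sha1.update(chunk)
    return sha1.hexdigest()

def verify_zip(path, expected_sha1, chunk_size=8192):
    """Compare a freshly computed checksum with the one recorded at upload time."""
    return sha1_checksum(path, chunk_size) == expected_sha1
```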
## Architecture
### System Components
1. **Flask Web Application**: Core web framework with blueprints
2. **SQLAlchemy ORM**: Database abstraction and management
3. **JWT Authentication**: Stateless API authentication
4. **File Upload Handler**: Chunked uploads with progress tracking
5. **Fixture Parser**: Intelligent CSV/XLSX parsing
6. **Security Layer**: Multi-layer security implementation
7. **Logging System**: Comprehensive audit and monitoring
8. **Daemon Manager**: Linux daemon process management
### Security Architecture
- **Authentication**: Dual-mode, with both session-based and JWT support
- **Authorization**: Role-based access control (RBAC)
- **Input Validation**: Comprehensive sanitization and validation
- **File Security**: Malicious content detection and quarantine
- **Network Security**: Rate limiting and DDoS protection
- **Data Protection**: Encryption at rest and in transit
### Database Design
- **Normalized Schema**: Third normal form compliance
- **Foreign Key Constraints**: Referential integrity
- **Indexing Strategy**: Optimized query performance
- **Transaction Management**: ACID compliance
- **Connection Pooling**: Efficient resource utilization
## Monitoring and Maintenance
### Log Files
- **Application Logs**: `/var/log/fixture-daemon.log`
- **System Logs**: `journalctl -u fixture-daemon`
- **Database Logs**: MySQL error logs
- **Web Server Logs**: Nginx access/error logs
### Health Monitoring
```bash
# Check system health
curl http://your-server/health
# Get system statistics
curl -H "Authorization: Bearer TOKEN" http://your-server/api/statistics
```
### Backup and Recovery
```bash
# Manual backup
/opt/fixture-manager/backup.sh
# Restore from backup
mysql -u fixture_user -p fixture_manager < backup.sql
```
### Maintenance Tasks
The daemon automatically performs:
- **Session Cleanup**: Expired sessions removed hourly
- **Log Rotation**: Old logs archived daily
- **File Cleanup**: Failed uploads cleaned every 6 hours
- **Database Optimization**: Statistics updated nightly
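The scheduler inside `daemon.py` is not reproduced here; as an illustrative sketch, the documented intervals could be driven with plain daemon threads, assuming the application and database manager have already been initialised:

```python
# Illustrative maintenance loop; intervals mirror the schedule listed above.
import threading
import time

from app.database import get_db_manager
from app.upload.file_handler import get_file_upload_handler

def run_periodically(interval_seconds, task):
    """Run task() every interval_seconds on a background daemon thread."""
    def loop():
        while True:
            time.sleep(interval_seconds)
            try:
                task()
            except Exception as exc:  # keep the loop alive if one run fails
                print(f'maintenance task failed: {exc}')
    threading.Thread(target=loop, daemon=True).start()

run_periodically(60 * 60, lambda: get_db_manager().cleanup_expired_sessions())             # hourly
run_periodically(6 * 60 * 60, lambda: get_file_upload_handler().cleanup_failed_uploads())  # every 6 h
run_periodically(24 * 60 * 60, lambda: get_db_manager().cleanup_old_logs(days=30))         # nightly
```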
## Troubleshooting
### Common Issues
1. **Database Connection Failed**
```bash
# Check MySQL service
systemctl status mysql
# Verify credentials
mysql -u fixture_user -p
```
2. **File Upload Errors**
```bash
# Check permissions
ls -la /var/lib/fixture-daemon/uploads
# Check disk space
df -h
```
3. **Daemon Won't Start**
```bash
# Check logs
journalctl -u fixture-daemon -n 50
# Test configuration
python daemon.py start --foreground
```
4. **Permission Denied**
```bash
# Fix ownership
chown -R fixture:fixture /var/lib/fixture-daemon
# Fix permissions
chmod 755 /opt/fixture-manager
```
### Debug Mode
```bash
# Run in debug mode
export DEBUG=true
python daemon.py start --foreground --config development
```
## API Documentation
### Authentication Endpoints
- `POST /auth/api/login` - User login
- `POST /auth/api/logout` - User logout
- `POST /auth/api/refresh` - Refresh JWT token
- `GET /auth/api/profile` - Get user profile
### Upload Endpoints
- `POST /upload/api/fixture` - Upload fixture file
- `POST /upload/api/zip/{match_id}` - Upload ZIP file
- `GET /upload/api/progress/{upload_id}` - Get upload progress
- `GET /upload/api/uploads` - List user uploads
### Match Management
- `GET /api/matches` - List matches with pagination
- `GET /api/matches/{id}` - Get match details
- `PUT /api/matches/{id}` - Update match
- `DELETE /api/matches/{id}` - Delete match (admin)
### Administration
- `GET /api/admin/users` - List users (admin)
- `PUT /api/admin/users/{id}` - Update user (admin)
- `GET /api/admin/logs` - System logs (admin)
- `GET /api/admin/system-info` - System information (admin)
## Performance Optimization
### Database Optimization
- Connection pooling with 10 connections
- Query optimization with proper indexing
- Prepared statements for security
- Transaction batching for bulk operations
### File Upload Optimization
- Chunked uploads for large files (see the sketch after this list)
- Concurrent upload support (configurable)
- Progress tracking with minimal overhead
- Automatic cleanup of failed uploads
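A sketch of the chunked save-with-progress pattern, in the spirit of the handler in `app/upload/file_handler.py` (not reproduced here); the real handler also records progress in the `file_uploads` table:

```python
# Illustrative chunked save with progress reporting.
def save_with_progress(stream, dest_path, total_size, chunk_size=8192,
                       on_progress=lambda pct: None):
    """Copy an incoming file stream to dest_path in chunks, reporting percent done."""
    written = 0
    with open(dest_path, 'wb') as out:
        while True:
            chunk = stream.read(chunk_size)
            if not chunk:
                break
            out.write(chunk)
            written += len(chunk)
            if total_size:
                on_progress(round(written * 100 / total_size, 1))
    return written

# Inside a Flask view (sketch):
#   size = request.content_length or 0
#   save_with_progress(request.stream, '/var/lib/fixture-daemon/uploads/f.zip', size,
#                      on_progress=lambda pct: print(f'{pct}% uploaded'))
```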
### Caching Strategy
- Session caching with Redis (optional)
- Static file caching with Nginx
- Database query result caching
- API response caching for read-heavy endpoints
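None of this caching is required for a working installation. As a sketch only, Redis-backed response caching could be wired in with Flask-Caching (an optional dependency, not part of this repository):

```python
# Optional sketch: API response caching with Flask-Caching and Redis.
from flask_caching import Cache  # assumption: Flask-Caching is installed

cache = Cache(config={
    'CACHE_TYPE': 'RedisCache',
    'CACHE_REDIS_URL': 'redis://localhost:6379/0',
    'CACHE_DEFAULT_TIMEOUT': 60,
})

# In create_app():  cache.init_app(app)
#
# On a read-heavy endpoint:
#   @bp.route('/matches')
#   @cache.cached(timeout=60, query_string=True)  # vary cache key on query params
#   def api_get_matches():
#       ...
```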
## Security Considerations
### Production Deployment
1. **Change Default Credentials**: Update admin password immediately
2. **SSL/TLS Configuration**: Enable HTTPS with valid certificates
3. **Firewall Configuration**: Restrict access to necessary ports only
4. **Regular Updates**: Keep system and dependencies updated
5. **Backup Strategy**: Implement regular automated backups
6. **Monitoring**: Set up comprehensive monitoring and alerting
### Security Best Practices
- Regular security audits
- Penetration testing
- Vulnerability scanning
- Access log monitoring
- Incident response procedures
## Contributing
1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests for new functionality
5. Submit a pull request
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support
For support and questions:
- Check the troubleshooting section
- Review system logs
- Contact system administrator
---
**Version**: 1.0.0
**Last Updated**: 2025-08-18
**Minimum Requirements**: Python 3.8+, MySQL 5.7+, Linux Kernel 3.10+
import os
import logging
import logging.handlers
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_login import LoginManager
from flask_jwt_extended import JWTManager
from config import config
import colorlog
# Initialize extensions
db = SQLAlchemy()
login_manager = LoginManager()
jwt = JWTManager()
def create_app(config_name=None):
"""Application factory pattern"""
if config_name is None:
config_name = os.environ.get('FLASK_ENV', 'default')
app = Flask(__name__)
app.config.from_object(config[config_name])
config[config_name].init_app(app)
# Initialize extensions
db.init_app(app)
login_manager.init_app(app)
jwt.init_app(app)
# Configure login manager
login_manager.login_view = 'auth.login'
login_manager.login_message = 'Please log in to access this page.'
login_manager.login_message_category = 'info'
# Configure logging
setup_logging(app)
# Register blueprints
from app.auth import bp as auth_bp
app.register_blueprint(auth_bp, url_prefix='/auth')
from app.api import bp as api_bp
app.register_blueprint(api_bp, url_prefix='/api')
from app.main import bp as main_bp
app.register_blueprint(main_bp)
from app.upload import bp as upload_bp
app.register_blueprint(upload_bp, url_prefix='/upload')
# Create database tables
with app.app_context():
db.create_all()
return app
def setup_logging(app):
"""Setup application logging with colors for development"""
if not app.debug and not app.testing:
# Production logging
if not os.path.exists('logs'):
os.mkdir('logs')
file_handler = logging.handlers.RotatingFileHandler(
'logs/fixture-daemon.log', maxBytes=10240, backupCount=10
)
file_handler.setFormatter(logging.Formatter(
'%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'
))
file_handler.setLevel(logging.INFO)
app.logger.addHandler(file_handler)
app.logger.setLevel(logging.INFO)
app.logger.info('Fixture Daemon startup')
else:
# Development logging with colors
handler = colorlog.StreamHandler()
handler.setFormatter(colorlog.ColoredFormatter(
'%(log_color)s%(asctime)s - %(name)s - %(levelname)s - %(message)s',
datefmt='%Y-%m-%d %H:%M:%S',
log_colors={
'DEBUG': 'cyan',
'INFO': 'green',
'WARNING': 'yellow',
'ERROR': 'red',
'CRITICAL': 'red,bg_white',
}
))
app.logger.addHandler(handler)
app.logger.setLevel(logging.DEBUG)
@login_manager.user_loader
def load_user(user_id):
"""Load user for Flask-Login"""
from app.models import User
return User.query.get(int(user_id))
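# --- Illustrative entry point (hypothetical run.py, not part of this package) ---
# The application factory above would typically be driven from a small runner
# script along these lines:
#
#   from app import create_app
#
#   app = create_app()          # falls back to FLASK_ENV or 'default'
#   if __name__ == '__main__':
#       app.run(host='0.0.0.0', port=5000)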
from flask import Blueprint
bp = Blueprint('api', __name__)
from app.api import routes
import logging
from datetime import datetime
from flask import request, jsonify, current_app
from flask_jwt_extended import jwt_required, get_jwt_identity
from sqlalchemy import func, desc
from app.api import bp
from app import db
from app.models import Match, FileUpload, User, SystemLog, MatchOutcome, UserSession
from app.utils.security import require_admin, require_active_user
from app.utils.logging import log_api_request
from app.database import get_db_manager
from app.upload.file_handler import get_file_upload_handler
from app.upload.fixture_parser import get_fixture_parser
logger = logging.getLogger(__name__)
@bp.route('/matches', methods=['GET'])
@jwt_required()
def api_get_matches():
"""Get matches with pagination and filtering"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
# Pagination parameters
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 20, type=int), 100)
# Filtering parameters
status_filter = request.args.get('status')
search_query = request.args.get('search', '').strip()
active_only = request.args.get('active_only', 'false').lower() == 'true'
# Base query
if user.is_admin:
query = Match.query
else:
query = Match.query.filter_by(created_by=user_id)
# Apply filters
if active_only:
query = query.filter_by(active_status=True)
if status_filter == 'active':
query = query.filter_by(active_status=True)
elif status_filter == 'pending':
query = query.filter_by(active_status=False)
elif status_filter == 'zip_pending':
query = query.filter_by(zip_upload_status='pending')
elif status_filter == 'zip_completed':
query = query.filter_by(zip_upload_status='completed')
elif status_filter == 'zip_failed':
query = query.filter_by(zip_upload_status='failed')
# Search functionality
if search_query:
search_pattern = f"%{search_query}%"
query = query.filter(
db.or_(
Match.fighter1_township.ilike(search_pattern),
Match.fighter2_township.ilike(search_pattern),
Match.venue_kampala_township.ilike(search_pattern),
Match.match_number.like(search_pattern)
)
)
# Sorting
sort_by = request.args.get('sort_by', 'created_at')
sort_order = request.args.get('sort_order', 'desc')
if hasattr(Match, sort_by):
sort_column = getattr(Match, sort_by)
if sort_order == 'asc':
query = query.order_by(sort_column.asc())
else:
query = query.order_by(sort_column.desc())
else:
query = query.order_by(Match.created_at.desc())
# Execute pagination
matches_pagination = query.paginate(
page=page, per_page=per_page, error_out=False
)
# Include outcomes if requested
include_outcomes = request.args.get('include_outcomes', 'false').lower() == 'true'
matches_data = []
for match in matches_pagination.items:
match_dict = match.to_dict(include_outcomes=include_outcomes)
matches_data.append(match_dict)
return jsonify({
'matches': matches_data,
'pagination': {
'page': page,
'pages': matches_pagination.pages,
'per_page': per_page,
'total': matches_pagination.total,
'has_next': matches_pagination.has_next,
'has_prev': matches_pagination.has_prev
},
'filters': {
'status': status_filter,
'search': search_query,
'active_only': active_only,
'sort_by': sort_by,
'sort_order': sort_order
}
}), 200
except Exception as e:
logger.error(f"API get matches error: {str(e)}")
return jsonify({'error': 'Failed to retrieve matches'}), 500
@bp.route('/matches/<int:match_id>', methods=['GET'])
@jwt_required()
def api_get_match(match_id):
"""Get specific match details"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
# Get match
if user.is_admin:
match = Match.query.get(match_id)
else:
match = Match.query.filter_by(id=match_id, created_by=user_id).first()
if not match:
return jsonify({'error': 'Match not found'}), 404
# Get associated uploads
uploads = FileUpload.query.filter_by(match_id=match_id).all()
return jsonify({
'match': match.to_dict(include_outcomes=True),
'uploads': [upload.to_dict() for upload in uploads]
}), 200
except Exception as e:
logger.error(f"API get match error: {str(e)}")
return jsonify({'error': 'Failed to retrieve match'}), 500
@bp.route('/matches/<int:match_id>', methods=['PUT'])
@jwt_required()
def api_update_match(match_id):
"""Update match details"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
# Get match
if user.is_admin:
match = Match.query.get(match_id)
else:
match = Match.query.filter_by(id=match_id, created_by=user_id).first()
if not match:
return jsonify({'error': 'Match not found'}), 404
data = request.get_json()
if not data:
return jsonify({'error': 'No data provided'}), 400
# Update allowed fields
updatable_fields = ['start_time', 'end_time', 'result']
updated_fields = []
for field in updatable_fields:
if field in data:
if field in ['start_time', 'end_time'] and data[field]:
try:
setattr(match, field, datetime.fromisoformat(data[field]))
updated_fields.append(field)
except ValueError:
return jsonify({'error': f'Invalid datetime format for {field}'}), 400
else:
setattr(match, field, data[field])
updated_fields.append(field)
if updated_fields:
match.updated_at = datetime.utcnow()
db.session.commit()
return jsonify({
'message': 'Match updated successfully',
'updated_fields': updated_fields,
'match': match.to_dict()
}), 200
else:
return jsonify({'message': 'No fields to update'}), 200
except Exception as e:
logger.error(f"API update match error: {str(e)}")
db.session.rollback()
return jsonify({'error': 'Failed to update match'}), 500
@bp.route('/matches/<int:match_id>', methods=['DELETE'])
@jwt_required()
def api_delete_match(match_id):
"""Delete match (admin only)"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_admin:
return jsonify({'error': 'Admin privileges required'}), 403
match = Match.query.get(match_id)
if not match:
return jsonify({'error': 'Match not found'}), 404
# Delete associated files
uploads = FileUpload.query.filter_by(match_id=match_id).all()
for upload in uploads:
try:
import os
if os.path.exists(upload.file_path):
os.remove(upload.file_path)
except Exception as e:
logger.warning(f"Failed to delete file {upload.file_path}: {str(e)}")
# Delete match (cascades to outcomes and uploads)
db.session.delete(match)
db.session.commit()
return jsonify({'message': 'Match deleted successfully'}), 200
except Exception as e:
logger.error(f"API delete match error: {str(e)}")
db.session.rollback()
return jsonify({'error': 'Failed to delete match'}), 500
@bp.route('/statistics', methods=['GET'])
@jwt_required()
def api_get_statistics():
"""Get comprehensive statistics"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
# User statistics
user_stats = {
'total_matches': Match.query.filter_by(created_by=user_id).count(),
'active_matches': Match.query.filter_by(
created_by=user_id, active_status=True
).count(),
'total_uploads': FileUpload.query.filter_by(uploaded_by=user_id).count(),
'completed_uploads': FileUpload.query.filter_by(
uploaded_by=user_id, upload_status='completed'
).count(),
'failed_uploads': FileUpload.query.filter_by(
uploaded_by=user_id, upload_status='failed'
).count(),
'pending_zip_uploads': Match.query.filter_by(
created_by=user_id, zip_upload_status='pending'
).count()
}
# Global statistics (if admin)
global_stats = {}
if user.is_admin:
db_manager = get_db_manager()
global_stats = db_manager.get_database_stats()
file_handler = get_file_upload_handler()
fixture_parser = get_fixture_parser()
upload_stats = file_handler.get_upload_statistics()
parsing_stats = fixture_parser.get_parsing_statistics()
global_stats.update({
'upload_stats': upload_stats,
'parsing_stats': parsing_stats
})
# Recent activity
recent_matches = Match.query.filter_by(created_by=user_id)\
.order_by(Match.created_at.desc()).limit(5).all()
recent_uploads = FileUpload.query.filter_by(uploaded_by=user_id)\
.order_by(FileUpload.created_at.desc()).limit(5).all()
return jsonify({
'user_stats': user_stats,
'global_stats': global_stats,
'recent_activity': {
'matches': [match.to_dict(include_outcomes=False) for match in recent_matches],
'uploads': [upload.to_dict() for upload in recent_uploads]
}
}), 200
except Exception as e:
logger.error(f"API statistics error: {str(e)}")
return jsonify({'error': 'Failed to retrieve statistics'}), 500
@bp.route('/admin/users', methods=['GET'])
@jwt_required()
@require_admin
def api_admin_get_users():
"""Get users list (admin only)"""
try:
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 20, type=int), 100)
search_query = request.args.get('search', '').strip()
status_filter = request.args.get('status')
# Base query
query = User.query
# Apply filters
if status_filter == 'active':
query = query.filter_by(is_active=True)
elif status_filter == 'inactive':
query = query.filter_by(is_active=False)
elif status_filter == 'admin':
query = query.filter_by(is_admin=True)
# Search functionality
if search_query:
search_pattern = f"%{search_query}%"
query = query.filter(
db.or_(
User.username.ilike(search_pattern),
User.email.ilike(search_pattern)
)
)
# Pagination
users_pagination = query.order_by(User.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
return jsonify({
'users': [user.to_dict() for user in users_pagination.items],
'pagination': {
'page': page,
'pages': users_pagination.pages,
'per_page': per_page,
'total': users_pagination.total,
'has_next': users_pagination.has_next,
'has_prev': users_pagination.has_prev
}
}), 200
except Exception as e:
logger.error(f"API admin users error: {str(e)}")
return jsonify({'error': 'Failed to retrieve users'}), 500
@bp.route('/admin/users/<int:user_id>', methods=['PUT'])
@jwt_required()
@require_admin
def api_admin_update_user(user_id):
"""Update user (admin only)"""
try:
user = User.query.get(user_id)
if not user:
return jsonify({'error': 'User not found'}), 404
data = request.get_json()
if not data:
return jsonify({'error': 'No data provided'}), 400
# Update allowed fields
updatable_fields = ['is_active', 'is_admin']
updated_fields = []
for field in updatable_fields:
if field in data:
setattr(user, field, bool(data[field]))
updated_fields.append(field)
if updated_fields:
user.updated_at = datetime.utcnow()
db.session.commit()
return jsonify({
'message': 'User updated successfully',
'updated_fields': updated_fields,
'user': user.to_dict()
}), 200
else:
return jsonify({'message': 'No fields to update'}), 200
except Exception as e:
logger.error(f"API admin update user error: {str(e)}")
db.session.rollback()
return jsonify({'error': 'Failed to update user'}), 500
@bp.route('/admin/logs', methods=['GET'])
@jwt_required()
@require_admin
def api_admin_get_logs():
"""Get system logs (admin only)"""
try:
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 50, type=int), 200)
level_filter = request.args.get('level')
module_filter = request.args.get('module')
# Base query
query = SystemLog.query
# Apply filters
if level_filter:
query = query.filter_by(level=level_filter)
if module_filter:
query = query.filter_by(module=module_filter)
# Pagination
logs_pagination = query.order_by(SystemLog.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
return jsonify({
'logs': [log.to_dict() for log in logs_pagination.items],
'pagination': {
'page': page,
'pages': logs_pagination.pages,
'per_page': per_page,
'total': logs_pagination.total,
'has_next': logs_pagination.has_next,
'has_prev': logs_pagination.has_prev
}
}), 200
except Exception as e:
logger.error(f"API admin logs error: {str(e)}")
return jsonify({'error': 'Failed to retrieve logs'}), 500
@bp.route('/admin/system-info', methods=['GET'])
@jwt_required()
@require_admin
def api_admin_system_info():
"""Get system information (admin only)"""
try:
db_manager = get_db_manager()
# Database statistics
db_stats = db_manager.get_database_stats()
# System health
db_healthy = db_manager.test_connection()
# Upload statistics
file_handler = get_file_upload_handler()
upload_stats = file_handler.get_upload_statistics()
# Parsing statistics
fixture_parser = get_fixture_parser()
parsing_stats = fixture_parser.get_parsing_statistics()
# Active sessions
active_sessions = UserSession.query.filter_by(is_active=True).count()
system_info = {
'database': {
'healthy': db_healthy,
'statistics': db_stats
},
'uploads': upload_stats,
'parsing': parsing_stats,
'sessions': {
'active_sessions': active_sessions
},
'timestamp': datetime.utcnow().isoformat()
}
return jsonify(system_info), 200
except Exception as e:
logger.error(f"API system info error: {str(e)}")
return jsonify({'error': 'Failed to retrieve system information'}), 500
@bp.route('/admin/cleanup', methods=['POST'])
@jwt_required()
@require_admin
def api_admin_cleanup():
"""Perform system cleanup (admin only)"""
try:
data = request.get_json(silent=True) or {}
cleanup_type = data.get('type', 'all')
results = {}
if cleanup_type in ['all', 'sessions']:
# Clean up expired sessions
db_manager = get_db_manager()
expired_sessions = db_manager.cleanup_expired_sessions()
results['expired_sessions_cleaned'] = expired_sessions
if cleanup_type in ['all', 'logs']:
# Clean up old logs (older than 30 days)
days = data.get('log_retention_days', 30)
db_manager = get_db_manager()
old_logs = db_manager.cleanup_old_logs(days)
results['old_logs_cleaned'] = old_logs
if cleanup_type in ['all', 'uploads']:
# Clean up failed uploads
file_handler = get_file_upload_handler()
file_handler.cleanup_failed_uploads()
results['failed_uploads_cleaned'] = True
return jsonify({
'message': 'Cleanup completed successfully',
'results': results
}), 200
except Exception as e:
logger.error(f"API cleanup error: {str(e)}")
return jsonify({'error': 'Cleanup failed'}), 500
# Error handlers for API
@bp.errorhandler(404)
def api_not_found(error):
return jsonify({'error': 'Endpoint not found'}), 404
@bp.errorhandler(405)
def api_method_not_allowed(error):
return jsonify({'error': 'Method not allowed'}), 405
@bp.errorhandler(500)
def api_internal_error(error):
return jsonify({'error': 'Internal server error'}), 500
from flask import Blueprint
bp = Blueprint('auth', __name__)
from app.auth import routes
from flask_wtf import FlaskForm
from wtforms import StringField, PasswordField, BooleanField, SubmitField
from wtforms.validators import DataRequired, Email, Length, EqualTo, ValidationError
from app.models import User
import re
class LoginForm(FlaskForm):
"""Login form"""
username = StringField('Username', validators=[
DataRequired(message='Username is required'),
Length(min=3, max=80, message='Username must be between 3 and 80 characters')
])
password = PasswordField('Password', validators=[
DataRequired(message='Password is required')
])
remember_me = BooleanField('Remember Me')
submit = SubmitField('Sign In')
class RegistrationForm(FlaskForm):
"""Registration form"""
username = StringField('Username', validators=[
DataRequired(message='Username is required'),
Length(min=3, max=80, message='Username must be between 3 and 80 characters')
])
email = StringField('Email', validators=[
DataRequired(message='Email is required'),
Email(message='Invalid email address'),
Length(max=120, message='Email must be less than 120 characters')
])
password = PasswordField('Password', validators=[
DataRequired(message='Password is required'),
Length(min=8, message='Password must be at least 8 characters long')
])
password2 = PasswordField('Repeat Password', validators=[
DataRequired(message='Please confirm your password'),
EqualTo('password', message='Passwords must match')
])
submit = SubmitField('Register')
def validate_username(self, username):
"""Validate username uniqueness and format"""
# Check for valid characters (alphanumeric and underscore only)
if not re.match(r'^[a-zA-Z0-9_]+$', username.data):
raise ValidationError('Username can only contain letters, numbers, and underscores')
# Check if username already exists
user = User.query.filter_by(username=username.data).first()
if user is not None:
raise ValidationError('Username already exists. Please choose a different one.')
def validate_email(self, email):
"""Validate email uniqueness"""
user = User.query.filter_by(email=email.data).first()
if user is not None:
raise ValidationError('Email already registered. Please use a different email address.')
def validate_password(self, password):
"""Validate password strength"""
password_value = password.data
# Check minimum length
if len(password_value) < 8:
raise ValidationError('Password must be at least 8 characters long')
# Check for at least one uppercase letter
if not re.search(r'[A-Z]', password_value):
raise ValidationError('Password must contain at least one uppercase letter')
# Check for at least one lowercase letter
if not re.search(r'[a-z]', password_value):
raise ValidationError('Password must contain at least one lowercase letter')
# Check for at least one digit
if not re.search(r'\d', password_value):
raise ValidationError('Password must contain at least one number')
# Check for at least one special character
if not re.search(r'[!@#$%^&*(),.?":{}|<>]', password_value):
raise ValidationError('Password must contain at least one special character')
# Check for common weak passwords
weak_passwords = [
'password', '12345678', 'qwerty123', 'admin123',
'password123', '123456789', 'welcome123'
]
if password_value.lower() in weak_passwords:
raise ValidationError('Password is too common. Please choose a stronger password.')
class ChangePasswordForm(FlaskForm):
"""Change password form"""
current_password = PasswordField('Current Password', validators=[
DataRequired(message='Current password is required')
])
new_password = PasswordField('New Password', validators=[
DataRequired(message='New password is required'),
Length(min=8, message='Password must be at least 8 characters long')
])
new_password2 = PasswordField('Repeat New Password', validators=[
DataRequired(message='Please confirm your new password'),
EqualTo('new_password', message='Passwords must match')
])
submit = SubmitField('Change Password')
def validate_new_password(self, new_password):
"""Validate new password strength"""
password_value = new_password.data
# Check minimum length
if len(password_value) < 8:
raise ValidationError('Password must be at least 8 characters long')
# Check for at least one uppercase letter
if not re.search(r'[A-Z]', password_value):
raise ValidationError('Password must contain at least one uppercase letter')
# Check for at least one lowercase letter
if not re.search(r'[a-z]', password_value):
raise ValidationError('Password must contain at least one lowercase letter')
# Check for at least one digit
if not re.search(r'\d', password_value):
raise ValidationError('Password must contain at least one number')
# Check for at least one special character
if not re.search(r'[!@#$%^&*(),.?":{}|<>]', password_value):
raise ValidationError('Password must contain at least one special character')
# Check for common weak passwords
weak_passwords = [
'password', '12345678', 'qwerty123', 'admin123',
'password123', '123456789', 'welcome123'
]
if password_value.lower() in weak_passwords:
raise ValidationError('Password is too common. Please choose a stronger password.')
class ForgotPasswordForm(FlaskForm):
"""Forgot password form"""
email = StringField('Email', validators=[
DataRequired(message='Email is required'),
Email(message='Invalid email address')
])
submit = SubmitField('Reset Password')
def validate_email(self, email):
"""Validate email exists"""
user = User.query.filter_by(email=email.data).first()
if user is None:
raise ValidationError('No account found with that email address.')
class ResetPasswordForm(FlaskForm):
"""Reset password form"""
password = PasswordField('New Password', validators=[
DataRequired(message='Password is required'),
Length(min=8, message='Password must be at least 8 characters long')
])
password2 = PasswordField('Repeat Password', validators=[
DataRequired(message='Please confirm your password'),
EqualTo('password', message='Passwords must match')
])
submit = SubmitField('Reset Password')
def validate_password(self, password):
"""Validate password strength"""
password_value = password.data
# Check minimum length
if len(password_value) < 8:
raise ValidationError('Password must be at least 8 characters long')
# Check for at least one uppercase letter
if not re.search(r'[A-Z]', password_value):
raise ValidationError('Password must contain at least one uppercase letter')
# Check for at least one lowercase letter
if not re.search(r'[a-z]', password_value):
raise ValidationError('Password must contain at least one lowercase letter')
# Check for at least one digit
if not re.search(r'\d', password_value):
raise ValidationError('Password must contain at least one number')
# Check for at least one special character
if not re.search(r'[!@#$%^&*(),.?":{}|<>]', password_value):
raise ValidationError('Password must contain at least one special character')
# Check for common weak passwords
weak_passwords = [
'password', '12345678', 'qwerty123', 'admin123',
'password123', '123456789', 'welcome123'
]
if password_value.lower() in weak_passwords:
raise ValidationError('Password is too common. Please choose a stronger password.')
import logging
from datetime import datetime, timedelta
from flask import request, jsonify, render_template, redirect, url_for, flash, session, current_app
from flask_login import login_user, logout_user, login_required, current_user
from flask_jwt_extended import create_access_token, jwt_required, get_jwt_identity, get_jwt
from werkzeug.security import check_password_hash
from app.auth import bp
from app import db
from app.models import User, UserSession, SystemLog
from app.auth.forms import LoginForm, RegistrationForm
from app.utils.security import validate_password_strength, generate_secure_token, rate_limit_check
from app.utils.logging import log_security_event
logger = logging.getLogger(__name__)
@bp.route('/login', methods=['GET', 'POST'])
def login():
"""User login endpoint"""
if current_user.is_authenticated:
return redirect(url_for('main.dashboard'))
form = LoginForm()
if form.validate_on_submit():
# Rate limiting check
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
if not rate_limit_check(client_ip, 'login', max_attempts=5, window_minutes=15):
log_security_event('LOGIN_RATE_LIMIT', client_ip, username=form.username.data)
flash('Too many login attempts. Please try again later.', 'error')
return render_template('auth/login.html', form=form)
user = User.query.filter_by(username=form.username.data).first()
if user and user.check_password(form.password.data):
if not user.is_active:
log_security_event('LOGIN_INACTIVE_USER', client_ip, user_id=user.id)
flash('Your account has been deactivated.', 'error')
return render_template('auth/login.html', form=form)
# Update last login
user.update_last_login()
# Create user session
user_session = UserSession(
user_id=user.id,
ip_address=client_ip,
user_agent=request.headers.get('User-Agent', '')
)
db.session.add(user_session)
db.session.commit()
# Login user
login_user(user, remember=form.remember_me.data)
session['session_id'] = user_session.session_id
log_security_event('LOGIN_SUCCESS', client_ip, user_id=user.id)
# Redirect to next page or dashboard
next_page = request.args.get('next')
if not next_page or not next_page.startswith('/'):
next_page = url_for('main.dashboard')
return redirect(next_page)
else:
log_security_event('LOGIN_FAILED', client_ip, username=form.username.data)
flash('Invalid username or password', 'error')
return render_template('auth/login.html', form=form)
@bp.route('/logout')
@login_required
def logout():
"""User logout endpoint"""
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
# Deactivate user session
if 'session_id' in session:
user_session = UserSession.query.filter_by(session_id=session['session_id']).first()
if user_session:
user_session.deactivate()
log_security_event('LOGOUT', client_ip, user_id=current_user.id)
logout_user()
session.clear()
flash('You have been logged out successfully.', 'info')
return redirect(url_for('auth.login'))
@bp.route('/register', methods=['GET', 'POST'])
def register():
"""User registration endpoint (admin only in production)"""
if current_app.config.get('REGISTRATION_DISABLED', False):
flash('Registration is disabled.', 'error')
return redirect(url_for('auth.login'))
if current_user.is_authenticated:
return redirect(url_for('main.dashboard'))
form = RegistrationForm()
if form.validate_on_submit():
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
# Rate limiting check
if not rate_limit_check(client_ip, 'register', max_attempts=3, window_minutes=60):
log_security_event('REGISTER_RATE_LIMIT', client_ip, username=form.username.data)
flash('Too many registration attempts. Please try again later.', 'error')
return render_template('auth/register.html', form=form)
# Check if user already exists
if User.query.filter_by(username=form.username.data).first():
flash('Username already exists.', 'error')
return render_template('auth/register.html', form=form)
if User.query.filter_by(email=form.email.data).first():
flash('Email already registered.', 'error')
return render_template('auth/register.html', form=form)
# Validate password strength
if not validate_password_strength(form.password.data):
flash('Password does not meet security requirements.', 'error')
return render_template('auth/register.html', form=form)
# Create new user
user = User(
username=form.username.data,
email=form.email.data,
is_active=True
)
user.set_password(form.password.data)
db.session.add(user)
db.session.commit()
log_security_event('USER_REGISTERED', client_ip, user_id=user.id)
flash('Registration successful! You can now log in.', 'success')
return redirect(url_for('auth.login'))
return render_template('auth/register.html', form=form)
@bp.route('/api/login', methods=['POST'])
def api_login():
"""API login endpoint for JWT authentication"""
try:
data = request.get_json()
if not data or not data.get('username') or not data.get('password'):
return jsonify({'error': 'Username and password required'}), 400
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
# Rate limiting check
if not rate_limit_check(client_ip, 'api_login', max_attempts=10, window_minutes=15):
log_security_event('API_LOGIN_RATE_LIMIT', client_ip, username=data.get('username'))
return jsonify({'error': 'Too many login attempts'}), 429
user = User.query.filter_by(username=data['username']).first()
if user and user.check_password(data['password']):
if not user.is_active:
log_security_event('API_LOGIN_INACTIVE_USER', client_ip, user_id=user.id)
return jsonify({'error': 'Account deactivated'}), 403
# Update last login
user.update_last_login()
# Create access token
access_token = create_access_token(
identity=user.id,
expires_delta=timedelta(seconds=current_app.config['JWT_ACCESS_TOKEN_EXPIRES'])  # config value is in seconds
)
# Create user session
user_session = UserSession(
user_id=user.id,
ip_address=client_ip,
user_agent=request.headers.get('User-Agent', '')
)
db.session.add(user_session)
db.session.commit()
log_security_event('API_LOGIN_SUCCESS', client_ip, user_id=user.id)
return jsonify({
'access_token': access_token,
'user': user.to_dict(),
'session_id': user_session.session_id
}), 200
else:
log_security_event('API_LOGIN_FAILED', client_ip, username=data.get('username'))
return jsonify({'error': 'Invalid credentials'}), 401
except Exception as e:
logger.error(f"API login error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
@bp.route('/api/logout', methods=['POST'])
@jwt_required()
def api_logout():
"""API logout endpoint"""
try:
user_id = get_jwt_identity()
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
# Get session ID from request
data = request.get_json() or {}
session_id = data.get('session_id')
if session_id:
user_session = UserSession.query.filter_by(
session_id=session_id,
user_id=user_id
).first()
if user_session:
user_session.deactivate()
log_security_event('API_LOGOUT', client_ip, user_id=user_id)
return jsonify({'message': 'Logged out successfully'}), 200
except Exception as e:
logger.error(f"API logout error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
@bp.route('/api/refresh', methods=['POST'])
@jwt_required()
def api_refresh():
"""Refresh JWT token"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
# Create new access token
access_token = create_access_token(
identity=user.id,
expires_delta=timedelta(seconds=current_app.config['JWT_ACCESS_TOKEN_EXPIRES'])  # config value is in seconds
)
return jsonify({
'access_token': access_token,
'user': user.to_dict()
}), 200
except Exception as e:
logger.error(f"Token refresh error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
@bp.route('/api/profile', methods=['GET'])
@jwt_required()
def api_profile():
"""Get user profile"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user:
return jsonify({'error': 'User not found'}), 404
return jsonify({'user': user.to_dict()}), 200
except Exception as e:
logger.error(f"Profile fetch error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
@bp.route('/api/change-password', methods=['POST'])
@jwt_required()
def api_change_password():
"""Change user password"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user:
return jsonify({'error': 'User not found'}), 404
data = request.get_json()
if not data or not data.get('current_password') or not data.get('new_password'):
return jsonify({'error': 'Current and new password required'}), 400
# Verify current password
if not user.check_password(data['current_password']):
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
log_security_event('PASSWORD_CHANGE_FAILED', client_ip, user_id=user.id)
return jsonify({'error': 'Current password incorrect'}), 400
# Validate new password strength
if not validate_password_strength(data['new_password']):
return jsonify({'error': 'New password does not meet security requirements'}), 400
# Update password
user.set_password(data['new_password'])
user.updated_at = datetime.utcnow()
db.session.commit()
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
log_security_event('PASSWORD_CHANGED', client_ip, user_id=user.id)
return jsonify({'message': 'Password changed successfully'}), 200
except Exception as e:
logger.error(f"Password change error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
@bp.route('/api/sessions', methods=['GET'])
@jwt_required()
def api_user_sessions():
"""Get user's active sessions"""
try:
user_id = get_jwt_identity()
sessions = UserSession.query.filter_by(
user_id=user_id,
is_active=True
).order_by(UserSession.last_activity.desc()).all()
return jsonify({
'sessions': [session.to_dict() for session in sessions]
}), 200
except Exception as e:
logger.error(f"Sessions fetch error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
@bp.route('/api/sessions/<session_id>', methods=['DELETE'])
@jwt_required()
def api_terminate_session(session_id):
"""Terminate a specific session"""
try:
user_id = get_jwt_identity()
user_session = UserSession.query.filter_by(
session_id=session_id,
user_id=user_id
).first()
if not user_session:
return jsonify({'error': 'Session not found'}), 404
user_session.deactivate()
client_ip = request.environ.get('HTTP_X_REAL_IP', request.remote_addr)
log_security_event('SESSION_TERMINATED', client_ip, user_id=user_id,
extra_data={'terminated_session_id': session_id})
return jsonify({'message': 'Session terminated successfully'}), 200
except Exception as e:
logger.error(f"Session termination error: {str(e)}")
return jsonify({'error': 'Internal server error'}), 500
import os
import logging
from contextlib import contextmanager
from sqlalchemy import create_engine, text
from sqlalchemy.exc import SQLAlchemyError, OperationalError
from sqlalchemy.pool import QueuePool
from flask import current_app
from app import db
from app.models import User, Match, MatchOutcome, FileUpload, SystemLog, UserSession
logger = logging.getLogger(__name__)
class DatabaseManager:
"""Database connection and management utilities"""
def __init__(self, app=None):
self.app = app
self.engine = None
if app is not None:
self.init_app(app)
def init_app(self, app):
"""Initialize database manager with Flask app"""
self.app = app
self.engine = create_engine(
app.config['SQLALCHEMY_DATABASE_URI'],
poolclass=QueuePool,
pool_size=10,
max_overflow=20,
pool_pre_ping=True,
pool_recycle=3600,
echo=app.config.get('SQLALCHEMY_ECHO', False)
)
@contextmanager
def get_connection(self):
"""Get database connection with automatic cleanup"""
connection = None
try:
connection = self.engine.connect()
yield connection
except Exception as e:
if connection:
connection.rollback()
logger.error(f"Database connection error: {str(e)}")
raise
finally:
if connection:
connection.close()
def test_connection(self):
"""Test database connectivity"""
try:
with self.get_connection() as conn:
result = conn.execute(text("SELECT 1"))
return result.fetchone()[0] == 1
except Exception as e:
logger.error(f"Database connection test failed: {str(e)}")
return False
def create_database_if_not_exists(self):
"""Create database if it doesn't exist"""
try:
# Extract database name from URI
db_uri = self.app.config['SQLALCHEMY_DATABASE_URI']
db_name = db_uri.split('/')[-1].split('?')[0]
# Create connection without database name
base_uri = db_uri.rsplit('/', 1)[0]
temp_engine = create_engine(base_uri)
with temp_engine.connect() as conn:
# Check if database exists
result = conn.execute(text(f"SHOW DATABASES LIKE '{db_name}'"))
if not result.fetchone():
conn.execute(text(f"CREATE DATABASE {db_name} CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci"))
logger.info(f"Created database: {db_name}")
else:
logger.info(f"Database {db_name} already exists")
temp_engine.dispose()
return True
except Exception as e:
logger.error(f"Failed to create database: {str(e)}")
return False
def execute_schema_file(self, schema_file_path):
"""Execute SQL schema file"""
try:
if not os.path.exists(schema_file_path):
logger.error(f"Schema file not found: {schema_file_path}")
return False
with open(schema_file_path, 'r') as f:
schema_sql = f.read()
# Split by semicolon and execute each statement
statements = [stmt.strip() for stmt in schema_sql.split(';') if stmt.strip()]
with self.get_connection() as conn:
for statement in statements:
if statement and not statement.startswith('--'):
try:
conn.execute(text(statement))
except Exception as e:
# Log but continue with other statements
logger.warning(f"Statement execution warning: {str(e)}")
conn.commit()
logger.info(f"Schema file executed successfully: {schema_file_path}")
return True
except Exception as e:
logger.error(f"Failed to execute schema file: {str(e)}")
return False
def initialize_database(self, schema_file_path=None):
"""Initialize database with schema and default data"""
try:
# Create database if it doesn't exist
if not self.create_database_if_not_exists():
return False
# Test connection
if not self.test_connection():
logger.error("Database connection test failed")
return False
# Execute schema file if provided
if schema_file_path and os.path.exists(schema_file_path):
if not self.execute_schema_file(schema_file_path):
logger.warning("Schema file execution failed, trying SQLAlchemy create_all")
# Create tables using SQLAlchemy
with self.app.app_context():
db.create_all()
logger.info("Database tables created successfully")
# Create default admin user if it doesn't exist
self.create_default_admin()
return True
except Exception as e:
logger.error(f"Database initialization failed: {str(e)}")
return False
def create_default_admin(self):
"""Create default admin user if it doesn't exist"""
try:
with self.app.app_context():
admin_user = User.query.filter_by(username='admin').first()
if not admin_user:
admin_user = User(
username='admin',
email='admin@fixture-daemon.local',
is_admin=True,
is_active=True
)
admin_user.set_password('admin123') # Change in production!
db.session.add(admin_user)
db.session.commit()
logger.info("Default admin user created (username: admin, password: admin123)")
else:
logger.info("Default admin user already exists")
except Exception as e:
logger.error(f"Failed to create default admin user: {str(e)}")
def get_database_stats(self):
"""Get database statistics"""
try:
with self.app.app_context():
stats = {
'users': User.query.count(),
'matches': Match.query.count(),
'active_matches': Match.query.filter_by(active_status=True).count(),
'match_outcomes': MatchOutcome.query.count(),
'file_uploads': FileUpload.query.count(),
'system_logs': SystemLog.query.count(),
'user_sessions': UserSession.query.filter_by(is_active=True).count()
}
return stats
except Exception as e:
logger.error(f"Failed to get database stats: {str(e)}")
return {}
def cleanup_expired_sessions(self):
"""Clean up expired user sessions"""
try:
with self.app.app_context():
from datetime import datetime
expired_sessions = UserSession.query.filter(
UserSession.expires_at < datetime.utcnow()
).all()
count = len(expired_sessions)
for session in expired_sessions:
db.session.delete(session)
db.session.commit()
logger.info(f"Cleaned up {count} expired sessions")
return count
except Exception as e:
logger.error(f"Failed to cleanup expired sessions: {str(e)}")
return 0
def cleanup_old_logs(self, days=30):
"""Clean up old system logs"""
try:
with self.app.app_context():
from datetime import datetime, timedelta
cutoff_date = datetime.utcnow() - timedelta(days=days)
old_logs = SystemLog.query.filter(
SystemLog.created_at < cutoff_date
).all()
count = len(old_logs)
for log in old_logs:
db.session.delete(log)
db.session.commit()
logger.info(f"Cleaned up {count} old log entries")
return count
except Exception as e:
logger.error(f"Failed to cleanup old logs: {str(e)}")
return 0
def backup_database(self, backup_path):
"""Create database backup using mysqldump"""
try:
import subprocess
# Extract connection details
db_uri = self.app.config['SQLALCHEMY_DATABASE_URI']
# Parse mysql+pymysql://user:pass@host:port/database
uri_parts = db_uri.replace('mysql+pymysql://', '').split('/')
db_name = uri_parts[-1]
auth_host = uri_parts[0]
if '@' in auth_host:
auth, host_port = auth_host.split('@')
if ':' in auth:
user, password = auth.split(':', 1)
else:
user, password = auth, ''
else:
user, password = 'root', ''
host_port = auth_host
if ':' in host_port:
host, port = host_port.split(':')
else:
host, port = host_port, '3306'
# Build mysqldump command
cmd = [
'mysqldump',
f'--host={host}',
f'--port={port}',
f'--user={user}',
'--single-transaction',
'--routines',
'--triggers',
db_name
]
if password:
cmd.append(f'--password={password}')
# Execute backup
with open(backup_path, 'w') as backup_file:
result = subprocess.run(cmd, stdout=backup_file, stderr=subprocess.PIPE, text=True)
if result.returncode == 0:
logger.info(f"Database backup created: {backup_path}")
return True
else:
logger.error(f"Database backup failed: {result.stderr}")
return False
except Exception as e:
logger.error(f"Database backup failed: {str(e)}")
return False
# Global database manager instance
db_manager = DatabaseManager()
def init_database_manager(app):
"""Initialize database manager with Flask app"""
db_manager.init_app(app)
return db_manager
def get_db_manager():
"""Get database manager instance"""
return db_manager
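# Example (hedged sketch, not part of this module): typical wiring inside an
# application factory plus a maintenance task. ``create_app`` and the backup
# path are illustrative names that do not appear in this module.
#
#     from app import create_app
#     app = create_app()
#     manager = init_database_manager(app)
#     print(manager.get_database_stats())
#     manager.cleanup_expired_sessions()
#     manager.backup_database('/var/lib/fixture-daemon/backups/fixtures.sql')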
\ No newline at end of file
from flask import Blueprint
bp = Blueprint('main', __name__)
from app.main import routes
\ No newline at end of file
import logging
from datetime import datetime
from flask import render_template, request, jsonify, redirect, url_for, flash, current_app
from flask_login import login_required, current_user
from app.main import bp
from app import db
from app.models import Match, FileUpload, User, SystemLog, MatchOutcome
from app.upload.file_handler import get_file_upload_handler
from app.upload.fixture_parser import get_fixture_parser
from app.utils.security import require_admin, require_active_user
from app.database import get_db_manager
logger = logging.getLogger(__name__)
@bp.route('/')
def index():
"""Home page"""
if current_user.is_authenticated:
return redirect(url_for('main.dashboard'))
return render_template('main/index.html')
@bp.route('/dashboard')
@login_required
@require_active_user
def dashboard():
"""Main dashboard"""
try:
# Get user statistics
user_matches = Match.query.filter_by(created_by=current_user.id).count()
user_uploads = FileUpload.query.filter_by(uploaded_by=current_user.id).count()
# Get recent matches
recent_matches = Match.query.filter_by(created_by=current_user.id)\
.order_by(Match.created_at.desc()).limit(5).all()
# Get recent uploads
recent_uploads = FileUpload.query.filter_by(uploaded_by=current_user.id)\
.order_by(FileUpload.created_at.desc()).limit(5).all()
# Get system statistics (for admins)
system_stats = {}
if current_user.is_admin:
db_manager = get_db_manager()
system_stats = db_manager.get_database_stats()
file_handler = get_file_upload_handler()
upload_stats = file_handler.get_upload_statistics()
system_stats.update(upload_stats)
return render_template('main/dashboard.html',
user_matches=user_matches,
user_uploads=user_uploads,
recent_matches=recent_matches,
recent_uploads=recent_uploads,
system_stats=system_stats)
except Exception as e:
logger.error(f"Dashboard error: {str(e)}")
flash('Error loading dashboard', 'error')
return render_template('main/dashboard.html')
@bp.route('/matches')
@login_required
@require_active_user
def matches():
"""List matches with pagination and filtering"""
try:
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 20, type=int), 100)
# Filtering options
status_filter = request.args.get('status')
search_query = request.args.get('search', '').strip()
# Base query
if current_user.is_admin:
query = Match.query
else:
query = Match.query.filter_by(created_by=current_user.id)
# Apply filters
if status_filter == 'active':
query = query.filter_by(active_status=True)
elif status_filter == 'pending':
query = query.filter_by(active_status=False)
elif status_filter == 'zip_pending':
query = query.filter_by(zip_upload_status='pending')
elif status_filter == 'zip_completed':
query = query.filter_by(zip_upload_status='completed')
# Search functionality
if search_query:
search_pattern = f"%{search_query}%"
query = query.filter(
db.or_(
Match.fighter1_township.ilike(search_pattern),
Match.fighter2_township.ilike(search_pattern),
Match.venue_kampala_township.ilike(search_pattern),
Match.match_number.like(search_pattern)
)
)
# Pagination
matches_pagination = query.order_by(Match.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
return render_template('main/matches.html',
matches=matches_pagination.items,
pagination=matches_pagination,
status_filter=status_filter,
search_query=search_query)
except Exception as e:
logger.error(f"Matches list error: {str(e)}")
flash('Error loading matches', 'error')
return render_template('main/matches.html', matches=[], pagination=None)
@bp.route('/match/<int:id>')
@login_required
@require_active_user
def match_detail(id):
"""Match detail page"""
try:
if current_user.is_admin:
match = Match.query.get_or_404(id)
else:
match = Match.query.filter_by(id=id, created_by=current_user.id).first_or_404()
# Get match outcomes
outcomes = match.outcomes.all()
# Get associated uploads
uploads = FileUpload.query.filter_by(match_id=id).all()
return render_template('main/match_detail.html',
match=match,
outcomes=outcomes,
uploads=uploads)
except Exception as e:
logger.error(f"Match detail error: {str(e)}")
flash('Error loading match details', 'error')
return redirect(url_for('main.matches'))
@bp.route('/uploads')
@login_required
@require_active_user
def uploads():
"""List uploads with pagination and filtering"""
try:
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 20, type=int), 100)
# Filtering options
file_type_filter = request.args.get('file_type')
status_filter = request.args.get('status')
# Base query
if current_user.is_admin:
query = FileUpload.query
else:
query = FileUpload.query.filter_by(uploaded_by=current_user.id)
# Apply filters
if file_type_filter:
query = query.filter_by(file_type=file_type_filter)
if status_filter:
query = query.filter_by(upload_status=status_filter)
# Pagination
uploads_pagination = query.order_by(FileUpload.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
return render_template('main/uploads.html',
uploads=uploads_pagination.items,
pagination=uploads_pagination,
file_type_filter=file_type_filter,
status_filter=status_filter)
except Exception as e:
logger.error(f"Uploads list error: {str(e)}")
flash('Error loading uploads', 'error')
return render_template('main/uploads.html', uploads=[], pagination=None)
@bp.route('/statistics')
@login_required
@require_active_user
def statistics():
"""Statistics page"""
try:
# Get file handler and parser statistics
file_handler = get_file_upload_handler()
fixture_parser = get_fixture_parser()
upload_stats = file_handler.get_upload_statistics()
parsing_stats = fixture_parser.get_parsing_statistics()
# User-specific statistics
user_stats = {
'total_uploads': FileUpload.query.filter_by(uploaded_by=current_user.id).count(),
'total_matches': Match.query.filter_by(created_by=current_user.id).count(),
'active_matches': Match.query.filter_by(
created_by=current_user.id, active_status=True
).count(),
'pending_zip_uploads': Match.query.filter_by(
created_by=current_user.id, zip_upload_status='pending'
).count()
}
# System statistics (admin only)
system_stats = {}
if current_user.is_admin:
db_manager = get_db_manager()
system_stats = db_manager.get_database_stats()
return render_template('main/statistics.html',
upload_stats=upload_stats,
parsing_stats=parsing_stats,
user_stats=user_stats,
system_stats=system_stats)
except Exception as e:
logger.error(f"Statistics error: {str(e)}")
flash('Error loading statistics', 'error')
return render_template('main/statistics.html')
@bp.route('/admin')
@login_required
@require_admin
def admin_panel():
"""Admin panel"""
try:
# Get system overview
db_manager = get_db_manager()
system_stats = db_manager.get_database_stats()
# Get recent system logs
recent_logs = SystemLog.query.order_by(SystemLog.created_at.desc()).limit(20).all()
# Get user statistics
total_users = User.query.count()
active_users = User.query.filter_by(is_active=True).count()
admin_users = User.query.filter_by(is_admin=True).count()
user_stats = {
'total_users': total_users,
'active_users': active_users,
'admin_users': admin_users,
'inactive_users': total_users - active_users
}
# Get upload statistics
file_handler = get_file_upload_handler()
upload_stats = file_handler.get_upload_statistics()
return render_template('main/admin.html',
system_stats=system_stats,
user_stats=user_stats,
upload_stats=upload_stats,
recent_logs=recent_logs)
except Exception as e:
logger.error(f"Admin panel error: {str(e)}")
flash('Error loading admin panel', 'error')
return render_template('main/admin.html')
@bp.route('/admin/users')
@login_required
@require_admin
def admin_users():
"""Admin user management"""
try:
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 20, type=int), 100)
search_query = request.args.get('search', '').strip()
status_filter = request.args.get('status')
# Base query
query = User.query
# Apply filters
if status_filter == 'active':
query = query.filter_by(is_active=True)
elif status_filter == 'inactive':
query = query.filter_by(is_active=False)
elif status_filter == 'admin':
query = query.filter_by(is_admin=True)
# Search functionality
if search_query:
search_pattern = f"%{search_query}%"
query = query.filter(
db.or_(
User.username.ilike(search_pattern),
User.email.ilike(search_pattern)
)
)
# Pagination
users_pagination = query.order_by(User.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
return render_template('main/admin_users.html',
users=users_pagination.items,
pagination=users_pagination,
search_query=search_query,
status_filter=status_filter)
except Exception as e:
logger.error(f"Admin users error: {str(e)}")
flash('Error loading users', 'error')
return render_template('main/admin_users.html', users=[], pagination=None)
@bp.route('/admin/logs')
@login_required
@require_admin
def admin_logs():
"""Admin system logs"""
try:
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 50, type=int), 200)
level_filter = request.args.get('level')
module_filter = request.args.get('module')
# Base query
query = SystemLog.query
# Apply filters
if level_filter:
query = query.filter_by(level=level_filter)
if module_filter:
query = query.filter_by(module=module_filter)
# Pagination
logs_pagination = query.order_by(SystemLog.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
# Get available modules for filter
modules = db.session.query(SystemLog.module).distinct().all()
modules = [m[0] for m in modules if m[0]]
return render_template('main/admin_logs.html',
logs=logs_pagination.items,
pagination=logs_pagination,
level_filter=level_filter,
module_filter=module_filter,
modules=modules)
except Exception as e:
logger.error(f"Admin logs error: {str(e)}")
flash('Error loading logs', 'error')
return render_template('main/admin_logs.html', logs=[], pagination=None)
@bp.route('/health')
def health_check():
"""Health check endpoint"""
try:
# Test database connection
db_manager = get_db_manager()
db_healthy = db_manager.test_connection()
# Get basic statistics
stats = db_manager.get_database_stats()
health_status = {
'status': 'healthy' if db_healthy else 'unhealthy',
'database': 'connected' if db_healthy else 'disconnected',
'timestamp': datetime.utcnow().isoformat(),
'statistics': stats
}
status_code = 200 if db_healthy else 503
return jsonify(health_status), status_code
except Exception as e:
logger.error(f"Health check error: {str(e)}")
return jsonify({
'status': 'unhealthy',
'error': str(e),
'timestamp': datetime.utcnow().isoformat()
}), 503
@bp.route('/api/dashboard-data')
@login_required
@require_active_user
def api_dashboard_data():
"""API endpoint for dashboard data"""
try:
# User statistics
user_stats = {
'total_matches': Match.query.filter_by(created_by=current_user.id).count(),
'active_matches': Match.query.filter_by(
created_by=current_user.id, active_status=True
).count(),
'total_uploads': FileUpload.query.filter_by(uploaded_by=current_user.id).count(),
'pending_zip_uploads': Match.query.filter_by(
created_by=current_user.id, zip_upload_status='pending'
).count()
}
# Recent activity
recent_matches = Match.query.filter_by(created_by=current_user.id)\
.order_by(Match.created_at.desc()).limit(5).all()
recent_uploads = FileUpload.query.filter_by(uploaded_by=current_user.id)\
.order_by(FileUpload.created_at.desc()).limit(5).all()
return jsonify({
'user_stats': user_stats,
'recent_matches': [match.to_dict() for match in recent_matches],
'recent_uploads': [upload.to_dict() for upload in recent_uploads]
}), 200
except Exception as e:
logger.error(f"Dashboard API error: {str(e)}")
return jsonify({'error': 'Failed to load dashboard data'}), 500
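# Example (hedged sketch): how an external monitor might poll the /health
# endpoint defined above. Assumes the ``requests`` package and a server
# listening on http://localhost:5000; neither is implied by this module.
#
#     import requests
#     resp = requests.get('http://localhost:5000/health', timeout=5)
#     healthy = resp.status_code == 200 and resp.json().get('status') == 'healthy'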
\ No newline at end of file
from datetime import datetime, timedelta
from flask_login import UserMixin
from werkzeug.security import generate_password_hash, check_password_hash
from app import db
import uuid
import hashlib
class User(UserMixin, db.Model):
"""User model for authentication"""
__tablename__ = 'users'
id = db.Column(db.Integer, primary_key=True)
username = db.Column(db.String(80), unique=True, nullable=False, index=True)
email = db.Column(db.String(120), unique=True, nullable=False, index=True)
password_hash = db.Column(db.String(255), nullable=False)
is_active = db.Column(db.Boolean, default=True, index=True)
is_admin = db.Column(db.Boolean, default=False)
created_at = db.Column(db.DateTime, default=datetime.utcnow)
updated_at = db.Column(db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
last_login = db.Column(db.DateTime)
# Relationships
matches = db.relationship('Match', backref='creator', lazy='dynamic', foreign_keys='Match.created_by')
uploads = db.relationship('FileUpload', backref='uploader', lazy='dynamic', foreign_keys='FileUpload.uploaded_by')
sessions = db.relationship('UserSession', backref='user', lazy='dynamic', cascade='all, delete-orphan')
logs = db.relationship('SystemLog', backref='user', lazy='dynamic', foreign_keys='SystemLog.user_id')
def set_password(self, password):
"""Set password hash"""
self.password_hash = generate_password_hash(password)
def check_password(self, password):
"""Check password against hash"""
return check_password_hash(self.password_hash, password)
def update_last_login(self):
"""Update last login timestamp"""
self.last_login = datetime.utcnow()
db.session.commit()
def to_dict(self):
"""Convert to dictionary for JSON serialization"""
return {
'id': self.id,
'username': self.username,
'email': self.email,
'is_active': self.is_active,
'is_admin': self.is_admin,
'created_at': self.created_at.isoformat() if self.created_at else None,
'last_login': self.last_login.isoformat() if self.last_login else None
}
def __repr__(self):
return f'<User {self.username}>'
class Match(db.Model):
"""Primary matches table storing core fixture data"""
__tablename__ = 'matches'
id = db.Column(db.Integer, primary_key=True)
match_number = db.Column(db.Integer, unique=True, nullable=False, index=True)
fighter1_township = db.Column(db.String(255), nullable=False)
fighter2_township = db.Column(db.String(255), nullable=False)
venue_kampala_township = db.Column(db.String(255), nullable=False)
# System fields
start_time = db.Column(db.DateTime)
end_time = db.Column(db.DateTime)
result = db.Column(db.String(255))
filename = db.Column(db.String(1024), nullable=False)
file_sha1sum = db.Column(db.String(255), nullable=False, index=True)
fixture_id = db.Column(db.String(255), unique=True, nullable=False, index=True)
active_status = db.Column(db.Boolean, default=False, index=True)
# ZIP file related fields
zip_filename = db.Column(db.String(1024))
zip_sha1sum = db.Column(db.String(255), index=True)
zip_upload_status = db.Column(db.Enum('pending', 'uploading', 'completed', 'failed', name='zip_upload_status'),
default='pending', index=True)
zip_upload_progress = db.Column(db.Numeric(5, 2), default=0.00)
# Metadata
created_by = db.Column(db.Integer, db.ForeignKey('users.id'), index=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow)
updated_at = db.Column(db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
# Relationships
outcomes = db.relationship('MatchOutcome', backref='match', lazy='dynamic', cascade='all, delete-orphan')
uploads = db.relationship('FileUpload', backref='match', lazy='dynamic', foreign_keys='FileUpload.match_id')
logs = db.relationship('SystemLog', backref='match', lazy='dynamic', foreign_keys='SystemLog.match_id')
def __init__(self, **kwargs):
super(Match, self).__init__(**kwargs)
if not self.fixture_id:
self.fixture_id = str(uuid.uuid4())
def calculate_file_sha1(self, file_path):
"""Calculate SHA1 checksum of a file"""
sha1_hash = hashlib.sha1()
with open(file_path, 'rb') as f:
for chunk in iter(lambda: f.read(4096), b""):
sha1_hash.update(chunk)
return sha1_hash.hexdigest()
def set_active(self):
"""Set match as active (both fixture and ZIP uploaded successfully)"""
if self.zip_upload_status == 'completed' and self.zip_sha1sum:
self.active_status = True
db.session.commit()
return True
return False
def add_outcome(self, column_name, float_value):
"""Add an outcome to this match"""
outcome = MatchOutcome(
match_id=self.id,
column_name=column_name,
float_value=float_value
)
db.session.add(outcome)
return outcome
def get_outcomes_dict(self):
"""Get outcomes as dictionary"""
return {outcome.column_name: float(outcome.float_value) for outcome in self.outcomes}
def to_dict(self, include_outcomes=True):
"""Convert to dictionary for JSON serialization"""
data = {
'id': self.id,
'match_number': self.match_number,
'fighter1_township': self.fighter1_township,
'fighter2_township': self.fighter2_township,
'venue_kampala_township': self.venue_kampala_township,
'start_time': self.start_time.isoformat() if self.start_time else None,
'end_time': self.end_time.isoformat() if self.end_time else None,
'result': self.result,
'filename': self.filename,
'file_sha1sum': self.file_sha1sum,
'fixture_id': self.fixture_id,
'active_status': self.active_status,
'zip_filename': self.zip_filename,
'zip_sha1sum': self.zip_sha1sum,
'zip_upload_status': self.zip_upload_status,
'zip_upload_progress': float(self.zip_upload_progress) if self.zip_upload_progress else 0.0,
'created_at': self.created_at.isoformat() if self.created_at else None,
'updated_at': self.updated_at.isoformat() if self.updated_at else None
}
if include_outcomes:
data['outcomes'] = self.get_outcomes_dict()
return data
def __repr__(self):
return f'<Match {self.match_number}: {self.fighter1_township} vs {self.fighter2_township}>'
class MatchOutcome(db.Model):
"""Secondary outcomes table with foreign key relationships"""
__tablename__ = 'match_outcomes'
id = db.Column(db.Integer, primary_key=True)
match_id = db.Column(db.Integer, db.ForeignKey('matches.id'), nullable=False, index=True)
column_name = db.Column(db.String(255), nullable=False, index=True)
float_value = db.Column(db.Numeric(10, 2), nullable=False, index=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow)
updated_at = db.Column(db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
__table_args__ = (
db.UniqueConstraint('match_id', 'column_name', name='unique_match_column'),
)
def to_dict(self):
"""Convert to dictionary for JSON serialization"""
return {
'id': self.id,
'match_id': self.match_id,
'column_name': self.column_name,
'float_value': float(self.float_value),
'created_at': self.created_at.isoformat() if self.created_at else None
}
def __repr__(self):
return f'<MatchOutcome {self.column_name}: {self.float_value}>'
class FileUpload(db.Model):
"""File uploads tracking table"""
__tablename__ = 'file_uploads'
id = db.Column(db.Integer, primary_key=True)
filename = db.Column(db.String(1024), nullable=False, index=True)
original_filename = db.Column(db.String(1024), nullable=False)
file_path = db.Column(db.String(2048), nullable=False)
file_size = db.Column(db.BigInteger, nullable=False)
file_type = db.Column(db.Enum('fixture', 'zip', name='file_type'), nullable=False, index=True)
mime_type = db.Column(db.String(255), nullable=False)
sha1sum = db.Column(db.String(255), nullable=False, index=True)
upload_status = db.Column(db.Enum('uploading', 'completed', 'failed', 'processing', name='upload_status'),
default='uploading', index=True)
upload_progress = db.Column(db.Numeric(5, 2), default=0.00)
error_message = db.Column(db.Text)
# Associated match (for ZIP files)
match_id = db.Column(db.Integer, db.ForeignKey('matches.id'), index=True)
# User tracking
uploaded_by = db.Column(db.Integer, db.ForeignKey('users.id'), index=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow)
updated_at = db.Column(db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
# Relationships
logs = db.relationship('SystemLog', backref='upload', lazy='dynamic', foreign_keys='SystemLog.upload_id')
def update_progress(self, progress, status=None):
"""Update upload progress"""
self.upload_progress = progress
if status:
self.upload_status = status
self.updated_at = datetime.utcnow()
db.session.commit()
def mark_completed(self):
"""Mark upload as completed"""
self.upload_status = 'completed'
self.upload_progress = 100.00
self.updated_at = datetime.utcnow()
db.session.commit()
def mark_failed(self, error_message):
"""Mark upload as failed"""
self.upload_status = 'failed'
self.error_message = error_message
self.updated_at = datetime.utcnow()
db.session.commit()
def to_dict(self):
"""Convert to dictionary for JSON serialization"""
return {
'id': self.id,
'filename': self.filename,
'original_filename': self.original_filename,
'file_size': self.file_size,
'file_type': self.file_type,
'mime_type': self.mime_type,
'sha1sum': self.sha1sum,
'upload_status': self.upload_status,
'upload_progress': float(self.upload_progress) if self.upload_progress else 0.0,
'error_message': self.error_message,
'match_id': self.match_id,
'created_at': self.created_at.isoformat() if self.created_at else None,
'updated_at': self.updated_at.isoformat() if self.updated_at else None
}
def __repr__(self):
return f'<FileUpload {self.filename} ({self.file_type})>'
class SystemLog(db.Model):
"""System logs table for comprehensive logging"""
__tablename__ = 'system_logs'
id = db.Column(db.Integer, primary_key=True)
level = db.Column(db.Enum('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL', name='log_level'),
nullable=False, index=True)
message = db.Column(db.Text, nullable=False)
module = db.Column(db.String(255), index=True)
function_name = db.Column(db.String(255))
line_number = db.Column(db.Integer)
# Context information
user_id = db.Column(db.Integer, db.ForeignKey('users.id'), index=True)
match_id = db.Column(db.Integer, db.ForeignKey('matches.id'), index=True)
upload_id = db.Column(db.Integer, db.ForeignKey('file_uploads.id'), index=True)
session_id = db.Column(db.String(255))
ip_address = db.Column(db.String(45))
user_agent = db.Column(db.Text)
# Additional metadata
extra_data = db.Column(db.JSON)
created_at = db.Column(db.DateTime, default=datetime.utcnow, index=True)
@classmethod
def log(cls, level, message, **kwargs):
"""Create a log entry"""
log_entry = cls(
level=level,
message=message,
**kwargs
)
db.session.add(log_entry)
db.session.commit()
return log_entry
def to_dict(self):
"""Convert to dictionary for JSON serialization"""
return {
'id': self.id,
'level': self.level,
'message': self.message,
'module': self.module,
'function_name': self.function_name,
'line_number': self.line_number,
'user_id': self.user_id,
'match_id': self.match_id,
'upload_id': self.upload_id,
'session_id': self.session_id,
'ip_address': self.ip_address,
'extra_data': self.extra_data,
'created_at': self.created_at.isoformat() if self.created_at else None
}
def __repr__(self):
return f'<SystemLog {self.level}: {self.message[:50]}...>'
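# Example (hedged sketch): recording an audit entry with the ``log`` helper.
# Must run inside an application context; the field values are illustrative.
#
#     SystemLog.log('INFO', 'Fixture parsed',
#                   module='fixture_parser', user_id=1,
#                   extra_data={'matches_parsed': 42})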
class UserSession(db.Model):
"""Session management table"""
__tablename__ = 'user_sessions'
id = db.Column(db.Integer, primary_key=True)
session_id = db.Column(db.String(255), unique=True, nullable=False, index=True)
user_id = db.Column(db.Integer, db.ForeignKey('users.id'), nullable=False, index=True)
ip_address = db.Column(db.String(45), nullable=False)
user_agent = db.Column(db.Text)
is_active = db.Column(db.Boolean, default=True, index=True)
expires_at = db.Column(db.DateTime, nullable=False, index=True)
created_at = db.Column(db.DateTime, default=datetime.utcnow)
last_activity = db.Column(db.DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
def __init__(self, **kwargs):
super(UserSession, self).__init__(**kwargs)
if not self.session_id:
self.session_id = str(uuid.uuid4())
if not self.expires_at:
self.expires_at = datetime.utcnow() + timedelta(hours=24)
def is_expired(self):
"""Check if session is expired"""
return datetime.utcnow() > self.expires_at
def extend_session(self, hours=24):
"""Extend session expiration"""
self.expires_at = datetime.utcnow() + timedelta(hours=hours)
self.last_activity = datetime.utcnow()
db.session.commit()
def deactivate(self):
"""Deactivate session"""
self.is_active = False
db.session.commit()
def to_dict(self):
"""Convert to dictionary for JSON serialization"""
return {
'id': self.id,
'session_id': self.session_id,
'user_id': self.user_id,
'ip_address': self.ip_address,
'is_active': self.is_active,
'expires_at': self.expires_at.isoformat() if self.expires_at else None,
'created_at': self.created_at.isoformat() if self.created_at else None,
'last_activity': self.last_activity.isoformat() if self.last_activity else None
}
def __repr__(self):
return f'<UserSession {self.session_id} for User {self.user_id}>'
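# Example (hedged sketch): how the two-stage upload workflow maps onto these
# models. A Match is created from a parsed fixture row, outcomes are attached,
# and the match only becomes active once the ZIP stage completes. All field
# values below are illustrative.
#
#     match = Match(match_number=1,
#                   fighter1_township='Kawempe', fighter2_township='Nakawa',
#                   venue_kampala_township='Lubaga',
#                   filename='fixtures.csv',
#                   file_sha1sum='<sha1 of fixtures.csv>',
#                   created_by=1)
#     db.session.add(match)
#     db.session.flush()                     # assign match.id before adding outcomes
#     match.add_outcome('odds_fighter1', 1.85)
#     db.session.commit()
#     # later, after the ZIP upload succeeds:
#     match.zip_sha1sum = '<sha1 of zip>'
#     match.zip_upload_status = 'completed'
#     match.set_active()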
\ No newline at end of file
from flask import Blueprint
bp = Blueprint('upload', __name__)
from app.upload import routes
\ No newline at end of file
import os
import hashlib
import threading
import time
from datetime import datetime
from concurrent.futures import ThreadPoolExecutor
from flask import current_app
from app import db
from app.models import FileUpload, Match
from app.utils.security import sanitize_filename, validate_file_type, validate_file_size, detect_malicious_content
from app.utils.logging import log_file_operation, log_upload_progress
import logging
logger = logging.getLogger(__name__)
class FileUploadHandler:
"""Handle file uploads with progress tracking and security validation"""
def __init__(self):
self.upload_folder = current_app.config['UPLOAD_FOLDER']
self.chunk_size = current_app.config.get('CHUNK_SIZE', 8192)
self.max_concurrent_uploads = current_app.config.get('MAX_CONCURRENT_UPLOADS', 5)
self.executor = ThreadPoolExecutor(max_workers=self.max_concurrent_uploads)
self.active_uploads = {}
self.upload_lock = threading.Lock()
def validate_upload(self, file, file_type):
"""
Validate file upload
Args:
file: Werkzeug FileStorage object
file_type: Expected file type ('fixture' or 'zip')
Returns:
tuple: (is_valid, error_message)
"""
try:
# Check if file is present
if not file or not file.filename:
return False, "No file provided"
# Validate file type
if file_type == 'fixture':
allowed_extensions = current_app.config['ALLOWED_FIXTURE_EXTENSIONS']
elif file_type == 'zip':
allowed_extensions = current_app.config['ALLOWED_ZIP_EXTENSIONS']
else:
return False, "Invalid file type specified"
if not validate_file_type(file.filename, allowed_extensions):
return False, f"File type not allowed. Allowed types: {', '.join(allowed_extensions)}"
# Validate file size
file.seek(0, os.SEEK_END)
file_size = file.tell()
file.seek(0)
if not validate_file_size(file_size):
max_size_mb = current_app.config.get('MAX_CONTENT_LENGTH', 500 * 1024 * 1024) // (1024 * 1024)
return False, f"File too large. Maximum size: {max_size_mb}MB"
return True, None
except Exception as e:
logger.error(f"File validation error: {str(e)}")
return False, "File validation failed"
def calculate_sha1(self, file_path):
"""
Calculate SHA1 checksum of file
Args:
file_path: Path to file
Returns:
str: SHA1 checksum in hexadecimal
"""
sha1_hash = hashlib.sha1()
try:
with open(file_path, 'rb') as f:
for chunk in iter(lambda: f.read(self.chunk_size), b""):
sha1_hash.update(chunk)
return sha1_hash.hexdigest()
except Exception as e:
logger.error(f"SHA1 calculation error: {str(e)}")
return None
def save_file_chunked(self, file, file_path, upload_record):
"""
Save file in chunks with progress tracking
Args:
file: Werkzeug FileStorage object
file_path: Destination file path
upload_record: FileUpload database record
Returns:
bool: True if successful
"""
try:
file.seek(0, os.SEEK_END)
total_size = file.tell()
file.seek(0)
bytes_written = 0
with open(file_path, 'wb') as f:
while True:
chunk = file.read(self.chunk_size)
if not chunk:
break
f.write(chunk)
bytes_written += len(chunk)
# Update progress
progress = (bytes_written / total_size) * 100 if total_size > 0 else 100
upload_record.update_progress(progress)
# Log progress for large files (every 10%)
if total_size > 10 * 1024 * 1024 and progress % 10 < 1:
log_upload_progress(
upload_record.id,
round(progress, 2),
'uploading',
user_id=upload_record.uploaded_by,
match_id=upload_record.match_id
)
return True
except Exception as e:
logger.error(f"File save error: {str(e)}")
upload_record.mark_failed(f"File save failed: {str(e)}")
return False
def process_upload(self, file, file_type, user_id, match_id=None):
"""
Process file upload with validation and progress tracking
Args:
file: Werkzeug FileStorage object
file_type: Type of file ('fixture' or 'zip')
user_id: ID of user uploading file
match_id: Associated match ID (for ZIP files)
Returns:
tuple: (upload_record, error_message)
"""
try:
# Validate upload
is_valid, error_message = self.validate_upload(file, file_type)
if not is_valid:
return None, error_message
# Generate secure filename
original_filename = file.filename
sanitized_name = sanitize_filename(original_filename)
timestamp = datetime.utcnow().strftime('%Y%m%d_%H%M%S')
filename = f"{timestamp}_{sanitized_name}"
# Create upload directory if it doesn't exist
os.makedirs(self.upload_folder, exist_ok=True)
file_path = os.path.join(self.upload_folder, filename)
# Get file size and MIME type
file.seek(0, os.SEEK_END)
file_size = file.tell()
file.seek(0)
mime_type = file.content_type or 'application/octet-stream'
# Create upload record
upload_record = FileUpload(
filename=filename,
original_filename=original_filename,
file_path=file_path,
file_size=file_size,
file_type=file_type,
mime_type=mime_type,
sha1sum='', # Will be calculated after upload
upload_status='uploading',
match_id=match_id,
uploaded_by=user_id
)
db.session.add(upload_record)
db.session.commit()
# Save file with progress tracking
if not self.save_file_chunked(file, file_path, upload_record):
return upload_record, "File save failed"
# Calculate SHA1 checksum
sha1_checksum = self.calculate_sha1(file_path)
if not sha1_checksum:
upload_record.mark_failed("Checksum calculation failed")
return upload_record, "Checksum calculation failed"
# Check for malicious content
if detect_malicious_content(file_path):
os.remove(file_path)
upload_record.mark_failed("Potentially malicious content detected")
log_file_operation(
'MALICIOUS_CONTENT_DETECTED',
filename,
user_id=user_id,
upload_id=upload_record.id,
status='BLOCKED'
)
return upload_record, "File blocked due to security concerns"
# Update upload record with checksum
upload_record.sha1sum = sha1_checksum
upload_record.mark_completed()
log_file_operation(
'UPLOAD_COMPLETED',
filename,
user_id=user_id,
match_id=match_id,
upload_id=upload_record.id,
extra_data={
'file_size': file_size,
'file_type': file_type,
'sha1sum': sha1_checksum
}
)
return upload_record, None
except Exception as e:
logger.error(f"Upload processing error: {str(e)}")
if 'upload_record' in locals():
upload_record.mark_failed(f"Upload processing failed: {str(e)}")
return None, f"Upload processing failed: {str(e)}"
def upload_file_async(self, file, file_type, user_id, match_id=None):
"""
Upload file asynchronously
Args:
file: Werkzeug FileStorage object
file_type: Type of file ('fixture' or 'zip')
user_id: ID of user uploading file
match_id: Associated match ID (for ZIP files)
Returns:
tuple: (upload_id, error_message) - upload ID for tracking, or None plus an error
"""
upload_id = f"{user_id}_{int(time.time())}"
with self.upload_lock:
if len(self.active_uploads) >= self.max_concurrent_uploads:
return None, "Maximum concurrent uploads reached"
future = self.executor.submit(
self.process_upload, file, file_type, user_id, match_id
)
self.active_uploads[upload_id] = future
return upload_id, None
def get_upload_status(self, upload_id):
"""
Get status of async upload
Args:
upload_id: Upload ID
Returns:
dict: Upload status information
"""
with self.upload_lock:
if upload_id not in self.active_uploads:
return {'status': 'not_found'}
future = self.active_uploads[upload_id]
if future.done():
try:
upload_record, error_message = future.result()
del self.active_uploads[upload_id]
if error_message:
return {
'status': 'failed',
'error': error_message,
'upload_record': upload_record.to_dict() if upload_record else None
}
else:
return {
'status': 'completed',
'upload_record': upload_record.to_dict()
}
except Exception as e:
del self.active_uploads[upload_id]
return {
'status': 'failed',
'error': str(e)
}
else:
return {'status': 'uploading'}
def cancel_upload(self, upload_id):
"""
Cancel async upload
Args:
upload_id: Upload ID
Returns:
bool: True if cancelled successfully
"""
with self.upload_lock:
if upload_id in self.active_uploads:
future = self.active_uploads[upload_id]
cancelled = future.cancel()
if cancelled:
del self.active_uploads[upload_id]
return cancelled
return False
def cleanup_failed_uploads(self):
"""Clean up failed upload files"""
try:
failed_uploads = FileUpload.query.filter_by(upload_status='failed').all()
for upload in failed_uploads:
if os.path.exists(upload.file_path):
try:
os.remove(upload.file_path)
logger.info(f"Cleaned up failed upload file: {upload.filename}")
except Exception as e:
logger.error(f"Failed to clean up file {upload.filename}: {str(e)}")
except Exception as e:
logger.error(f"Upload cleanup error: {str(e)}")
def get_upload_statistics(self):
"""Get upload statistics"""
try:
stats = {
'total_uploads': FileUpload.query.count(),
'completed_uploads': FileUpload.query.filter_by(upload_status='completed').count(),
'failed_uploads': FileUpload.query.filter_by(upload_status='failed').count(),
'active_uploads': len(self.active_uploads),
'fixture_uploads': FileUpload.query.filter_by(file_type='fixture').count(),
'zip_uploads': FileUpload.query.filter_by(file_type='zip').count()
}
# Calculate total storage used
completed_uploads = FileUpload.query.filter_by(upload_status='completed').all()
total_storage = sum(upload.file_size for upload in completed_uploads)
stats['total_storage_bytes'] = total_storage
stats['total_storage_mb'] = round(total_storage / (1024 * 1024), 2)
return stats
except Exception as e:
logger.error(f"Statistics calculation error: {str(e)}")
return {}
# Global file upload handler instance (created lazily so that an application
# context is available when the constructor reads current_app.config)
file_upload_handler = None
def get_file_upload_handler():
    """Get (or lazily create) the file upload handler instance"""
    global file_upload_handler
    if file_upload_handler is None:
        file_upload_handler = FileUploadHandler()
    return file_upload_handler
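# Example (hedged sketch): synchronous use of the handler inside a request,
# mirroring what the upload routes do. ``request``, ``jsonify`` and
# ``current_user`` come from Flask / Flask-Login; the route wiring is
# illustrative only.
#
#     handler = get_file_upload_handler()
#     upload_record, error = handler.process_upload(
#         request.files['file'], 'fixture', current_user.id)
#     if error:
#         return jsonify({'error': error}), 400
#     return jsonify(upload_record.to_dict()), 200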
\ No newline at end of file
import pandas as pd
import logging
import re
from datetime import datetime
from typing import Dict, List, Tuple, Optional
from app import db
from app.models import Match, MatchOutcome
from app.utils.logging import log_file_operation, log_database_operation
logger = logging.getLogger(__name__)
class FixtureParser:
"""Parse CSV/XLSX fixture files with intelligent column detection"""
# Required columns mapping
REQUIRED_COLUMNS = {
'match_number': ['match #', 'match_number', 'match no', 'match_no', 'match'],
'fighter1': ['fighter1 (township)', 'fighter1', 'fighter 1', 'fighter_1'],
'fighter2': ['fighter2 (township)', 'fighter2', 'fighter 2', 'fighter_2'],
'venue': ['venue (kampala township)', 'venue', 'location', 'kampala township']
}
def __init__(self):
self.supported_formats = ['.csv', '.xlsx', '.xls']
self.encoding_options = ['utf-8', 'latin-1', 'cp1252', 'iso-8859-1']
def detect_file_format(self, file_path: str) -> str:
"""
Detect file format from extension
Args:
file_path: Path to the file
Returns:
str: File format ('csv', 'xlsx', 'xls')
"""
extension = file_path.lower().split('.')[-1]
if extension in ['xlsx', 'xls']:
return extension
return 'csv'
def read_file(self, file_path: str) -> Optional[pd.DataFrame]:
"""
Read file into pandas DataFrame with format detection
Args:
file_path: Path to the file
Returns:
pd.DataFrame or None: Parsed data or None if failed
"""
try:
file_format = self.detect_file_format(file_path)
if file_format == 'csv':
# Try different encodings for CSV
for encoding in self.encoding_options:
try:
df = pd.read_csv(file_path, encoding=encoding)
logger.info(f"Successfully read CSV with encoding: {encoding}")
return df
except UnicodeDecodeError:
continue
except Exception as e:
logger.error(f"CSV read error with {encoding}: {str(e)}")
continue
# If all encodings fail, try with error handling
try:
df = pd.read_csv(file_path, encoding='utf-8', encoding_errors='replace')  # requires pandas >= 1.3
logger.warning("Read CSV with character replacement due to encoding issues")
return df
except Exception as e:
logger.error(f"Final CSV read attempt failed: {str(e)}")
return None
elif file_format in ['xlsx', 'xls']:
try:
# Try to read Excel file
df = pd.read_excel(file_path, engine='openpyxl' if file_format == 'xlsx' else 'xlrd')
logger.info(f"Successfully read {file_format.upper()} file")
return df
except Exception as e:
logger.error(f"Excel read error: {str(e)}")
return None
else:
logger.error(f"Unsupported file format: {file_format}")
return None
except Exception as e:
logger.error(f"File read error: {str(e)}")
return None
def normalize_column_name(self, column_name: str) -> str:
"""
Normalize column name for comparison
Args:
column_name: Original column name
Returns:
str: Normalized column name
"""
if pd.isna(column_name):
return ''
# Convert to string and lowercase
normalized = str(column_name).lower().strip()
# Remove extra whitespace and special characters
normalized = re.sub(r'\s+', ' ', normalized)
normalized = re.sub(r'[^\w\s()]', '', normalized)
return normalized
def detect_required_columns(self, df: pd.DataFrame) -> Dict[str, str]:
"""
Detect required columns in the DataFrame
Args:
df: Input DataFrame
Returns:
dict: Mapping of required field to actual column name
"""
column_mapping = {}
normalized_columns = {self.normalize_column_name(col): col for col in df.columns}
for field, possible_names in self.REQUIRED_COLUMNS.items():
found = False
for possible_name in possible_names:
normalized_possible = self.normalize_column_name(possible_name)
if normalized_possible in normalized_columns:
column_mapping[field] = normalized_columns[normalized_possible]
found = True
break
if not found:
logger.warning(f"Required column not found for field: {field}")
# Try partial matching
for col_name in normalized_columns:
for possible_name in possible_names:
if possible_name.split()[0] in col_name:
column_mapping[field] = normalized_columns[col_name]
logger.info(f"Found partial match for {field}: {col_name}")
found = True
break
if found:
break
return column_mapping
def detect_outcome_columns(self, df: pd.DataFrame, required_columns: Dict[str, str]) -> List[str]:
"""
Detect optional outcome columns (numeric columns not in required set)
Args:
df: Input DataFrame
required_columns: Already detected required columns
Returns:
list: List of outcome column names
"""
outcome_columns = []
required_column_names = set(required_columns.values())
for column in df.columns:
if column not in required_column_names:
# Check if column contains numeric data
try:
# Try to convert to numeric, ignoring errors
numeric_series = pd.to_numeric(df[column], errors='coerce')
# If at least 50% of the non-null values are numeric, treat it as an outcome column
non_null_count = numeric_series.count()
total_non_null = df[column].count()
if total_non_null > 0 and (non_null_count / total_non_null) >= 0.5:
outcome_columns.append(column)
logger.info(f"Detected outcome column: {column}")
except Exception:
continue
return outcome_columns
def validate_required_data(self, df: pd.DataFrame, column_mapping: Dict[str, str]) -> Tuple[bool, List[str]]:
"""
Validate that all required columns are present and have data
Args:
df: Input DataFrame
column_mapping: Mapping of required fields to column names
Returns:
tuple: (is_valid, list_of_errors)
"""
errors = []
# Check if all required fields are mapped
for field in self.REQUIRED_COLUMNS.keys():
if field not in column_mapping:
errors.append(f"Required field '{field}' not found in file")
if errors:
return False, errors
# Check for data in required columns
for field, column_name in column_mapping.items():
if column_name not in df.columns:
errors.append(f"Column '{column_name}' not found in DataFrame")
continue
# Check for empty values
null_count = df[column_name].isnull().sum()
if null_count > 0:
errors.append(f"Column '{column_name}' has {null_count} empty values")
# Special validation for match_number (should be integers)
if field == 'match_number':
try:
# Try to convert to numeric
numeric_values = pd.to_numeric(df[column_name], errors='coerce')
invalid_count = numeric_values.isnull().sum()
if invalid_count > 0:
errors.append(f"Match number column has {invalid_count} non-numeric values")
except Exception as e:
errors.append(f"Match number validation failed: {str(e)}")
return len(errors) == 0, errors
def parse_fixture_file(self, file_path: str, filename: str, user_id: int) -> Tuple[bool, str, List[Dict]]:
"""
Parse fixture file and extract match data
Args:
file_path: Path to the fixture file
filename: Original filename
user_id: ID of user uploading the file
Returns:
tuple: (success, error_message, parsed_matches)
"""
try:
log_file_operation('FIXTURE_PARSE_START', filename, user_id=user_id)
# Read file
df = self.read_file(file_path)
if df is None:
error_msg = "Failed to read fixture file"
log_file_operation('FIXTURE_PARSE_FAILED', filename, user_id=user_id,
status='FAILED', error_message=error_msg)
return False, error_msg, []
logger.info(f"Read {len(df)} rows from fixture file")
# Remove completely empty rows
df = df.dropna(how='all')
if len(df) == 0:
error_msg = "No data found in fixture file"
log_file_operation('FIXTURE_PARSE_FAILED', filename, user_id=user_id,
status='FAILED', error_message=error_msg)
return False, error_msg, []
# Detect required columns
column_mapping = self.detect_required_columns(df)
# Validate required data
is_valid, validation_errors = self.validate_required_data(df, column_mapping)
if not is_valid:
error_msg = f"Validation failed: {'; '.join(validation_errors)}"
log_file_operation('FIXTURE_PARSE_FAILED', filename, user_id=user_id,
status='FAILED', error_message=error_msg)
return False, error_msg, []
# Detect outcome columns
outcome_columns = self.detect_outcome_columns(df, column_mapping)
# Parse matches
parsed_matches = []
for index, row in df.iterrows():
try:
# Extract required fields
match_data = {
'match_number': int(pd.to_numeric(row[column_mapping['match_number']])),
'fighter1_township': str(row[column_mapping['fighter1']]).strip(),
'fighter2_township': str(row[column_mapping['fighter2']]).strip(),
'venue_kampala_township': str(row[column_mapping['venue']]).strip(),
'filename': filename,
'created_by': user_id,
'outcomes': {}
}
# Extract outcome data
for outcome_col in outcome_columns:
try:
value = pd.to_numeric(row[outcome_col], errors='coerce')
if not pd.isna(value):
match_data['outcomes'][outcome_col] = float(value)
except Exception as e:
logger.warning(f"Failed to parse outcome {outcome_col} for match {match_data['match_number']}: {str(e)}")
parsed_matches.append(match_data)
except Exception as e:
logger.error(f"Failed to parse row {index}: {str(e)}")
continue
if not parsed_matches:
error_msg = "No valid matches found in fixture file"
log_file_operation('FIXTURE_PARSE_FAILED', filename, user_id=user_id,
status='FAILED', error_message=error_msg)
return False, error_msg, []
log_file_operation('FIXTURE_PARSE_SUCCESS', filename, user_id=user_id,
extra_data={
'matches_parsed': len(parsed_matches),
'outcome_columns': outcome_columns
})
logger.info(f"Successfully parsed {len(parsed_matches)} matches from {filename}")
return True, None, parsed_matches
except Exception as e:
error_msg = f"Fixture parsing failed: {str(e)}"
logger.error(error_msg)
log_file_operation('FIXTURE_PARSE_ERROR', filename, user_id=user_id,
status='ERROR', error_message=error_msg)
return False, error_msg, []
def save_matches_to_database(self, parsed_matches: List[Dict], file_sha1sum: str) -> Tuple[bool, str, List[int]]:
"""
Save parsed matches to database
Args:
parsed_matches: List of parsed match data
file_sha1sum: SHA1 checksum of the fixture file
Returns:
tuple: (success, error_message, list_of_match_ids)
"""
try:
saved_match_ids = []
for match_data in parsed_matches:
try:
# Check if match number already exists
existing_match = Match.query.filter_by(match_number=match_data['match_number']).first()
if existing_match:
logger.warning(f"Match number {match_data['match_number']} already exists, skipping")
continue
# Create match record
match = Match(
match_number=match_data['match_number'],
fighter1_township=match_data['fighter1_township'],
fighter2_township=match_data['fighter2_township'],
venue_kampala_township=match_data['venue_kampala_township'],
filename=match_data['filename'],
file_sha1sum=file_sha1sum,
created_by=match_data['created_by']
)
db.session.add(match)
db.session.flush() # Get the ID without committing
# Add outcome records
for outcome_name, outcome_value in match_data['outcomes'].items():
outcome = match.add_outcome(outcome_name, outcome_value)
db.session.add(outcome)
saved_match_ids.append(match.id)
log_database_operation('CREATE', 'matches', match.id,
user_id=match_data['created_by'])
except Exception as e:
logger.error(f"Failed to save match {match_data.get('match_number', 'unknown')}: {str(e)}")
db.session.rollback()
continue
# Commit all changes
db.session.commit()
logger.info(f"Successfully saved {len(saved_match_ids)} matches to database")
return True, None, saved_match_ids
except Exception as e:
db.session.rollback()
error_msg = f"Database save failed: {str(e)}"
logger.error(error_msg)
return False, error_msg, []
def get_parsing_statistics(self) -> Dict:
"""Get fixture parsing statistics"""
try:
stats = {
'total_matches': Match.query.count(),
'active_matches': Match.query.filter_by(active_status=True).count(),
'pending_zip_uploads': Match.query.filter_by(zip_upload_status='pending').count(),
'completed_uploads': Match.query.filter_by(zip_upload_status='completed').count(),
'failed_uploads': Match.query.filter_by(zip_upload_status='failed').count(),
'total_outcomes': MatchOutcome.query.count()
}
# Get unique filenames
unique_files = db.session.query(Match.filename).distinct().count()
stats['unique_fixture_files'] = unique_files
return stats
except Exception as e:
logger.error(f"Statistics calculation error: {str(e)}")
return {}
# Global fixture parser instance
fixture_parser = FixtureParser()
def get_fixture_parser():
"""Get fixture parser instance"""
return fixture_parser
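# Example (hedged sketch): a minimal fixture file that the column detection
# above would accept, followed by the parse-and-save sequence used by the
# upload routes. Header spellings only need to match one alias per required
# field; extra numeric columns (e.g. ``Odds Fighter1``) become outcome rows.
#
#     Match #,Fighter1 (Township),Fighter2 (Township),Venue (Kampala Township),Odds Fighter1
#     1,Kawempe,Nakawa,Lubaga,1.85
#
#     parser = get_fixture_parser()
#     ok, err, matches = parser.parse_fixture_file('/tmp/fixtures.csv',
#                                                  'fixtures.csv', user_id=1)
#     if ok:
#         parser.save_matches_to_database(matches, file_sha1sum='<sha1>')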
\ No newline at end of file
from flask_wtf import FlaskForm
from flask_wtf.file import FileField, FileRequired, FileAllowed
from wtforms import SubmitField, TextAreaField
from wtforms.validators import Optional
class FixtureUploadForm(FlaskForm):
"""Form for uploading fixture files (CSV/XLSX)"""
fixture_file = FileField('Fixture File', validators=[
FileRequired(message='Please select a fixture file'),
FileAllowed(['csv', 'xlsx', 'xls'],
message='Only CSV and Excel files are allowed')
])
description = TextAreaField('Description (Optional)', validators=[Optional()])
submit = SubmitField('Upload Fixture')
class ZipUploadForm(FlaskForm):
"""Form for uploading ZIP files for matches"""
zip_file = FileField('ZIP File', validators=[
FileRequired(message='Please select a ZIP file'),
FileAllowed(['zip'], message='Only ZIP files are allowed')
])
description = TextAreaField('Description (Optional)', validators=[Optional()])
submit = SubmitField('Upload ZIP')
\ No newline at end of file
import os
import logging
from flask import request, jsonify, render_template, redirect, url_for, flash, current_app
from flask_login import login_required, current_user
from flask_jwt_extended import jwt_required, get_jwt_identity
from werkzeug.utils import secure_filename
from app.upload import bp
from app import db
from app.models import Match, FileUpload, User
from app.upload.file_handler import get_file_upload_handler
from app.upload.fixture_parser import get_fixture_parser
from app.utils.security import require_active_user, validate_file_type, hash_file_content
from app.utils.logging import log_file_operation, log_upload_progress
from app.upload.forms import FixtureUploadForm, ZipUploadForm
logger = logging.getLogger(__name__)
@bp.route('/fixture', methods=['GET', 'POST'])
@login_required
@require_active_user
def upload_fixture():
"""Upload fixture file (CSV/XLSX) - Web interface"""
form = FixtureUploadForm()
if form.validate_on_submit():
try:
file = form.fixture_file.data
if not file or not file.filename:
flash('No file selected', 'error')
return render_template('upload/fixture.html', form=form)
# Process upload
file_handler = get_file_upload_handler()
upload_record, error_message = file_handler.process_upload(
file, 'fixture', current_user.id
)
if error_message:
flash(f'Upload failed: {error_message}', 'error')
return render_template('upload/fixture.html', form=form)
# Parse fixture file
fixture_parser = get_fixture_parser()
success, parse_error, parsed_matches = fixture_parser.parse_fixture_file(
upload_record.file_path, upload_record.original_filename, current_user.id
)
if not success:
flash(f'Fixture parsing failed: {parse_error}', 'error')
return render_template('upload/fixture.html', form=form)
# Save matches to database
success, save_error, match_ids = fixture_parser.save_matches_to_database(
parsed_matches, upload_record.sha1sum
)
if not success:
flash(f'Database save failed: {save_error}', 'error')
return render_template('upload/fixture.html', form=form)
flash(f'Successfully uploaded and parsed {len(match_ids)} matches!', 'success')
return redirect(url_for('main.matches'))
except Exception as e:
logger.error(f"Fixture upload error: {str(e)}")
flash('Upload processing failed', 'error')
return render_template('upload/fixture.html', form=form)
@bp.route('/zip/<int:match_id>', methods=['GET', 'POST'])
@login_required
@require_active_user
def upload_zip(match_id):
"""Upload ZIP file for specific match - Web interface"""
match = Match.query.get_or_404(match_id)
# Check if ZIP already uploaded
if match.zip_upload_status == 'completed':
flash('ZIP file already uploaded for this match', 'info')
return redirect(url_for('main.match_detail', id=match_id))
form = ZipUploadForm()
if form.validate_on_submit():
try:
file = form.zip_file.data
if not file or not file.filename:
flash('No file selected', 'error')
return render_template('upload/zip.html', form=form, match=match)
# Update match status to uploading
match.zip_upload_status = 'uploading'
db.session.commit()
# Process upload
file_handler = get_file_upload_handler()
upload_record, error_message = file_handler.process_upload(
file, 'zip', current_user.id, match_id
)
if error_message:
match.zip_upload_status = 'failed'
db.session.commit()
flash(f'Upload failed: {error_message}', 'error')
return render_template('upload/zip.html', form=form, match=match)
# Update match with ZIP file information
match.zip_filename = upload_record.filename
match.zip_sha1sum = upload_record.sha1sum
match.zip_upload_status = 'completed'
match.zip_upload_progress = 100.00
# Set match as active (both fixture and ZIP uploaded)
match.set_active()
db.session.commit()
flash('ZIP file uploaded successfully! Match is now active.', 'success')
return redirect(url_for('main.match_detail', id=match_id))
except Exception as e:
logger.error(f"ZIP upload error: {str(e)}")
match.zip_upload_status = 'failed'
db.session.commit()
flash('Upload processing failed', 'error')
return render_template('upload/zip.html', form=form, match=match)
@bp.route('/api/fixture', methods=['POST'])
@jwt_required()
def api_upload_fixture():
"""Upload fixture file (CSV/XLSX) - API endpoint"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
if 'file' not in request.files:
return jsonify({'error': 'No file provided'}), 400
file = request.files['file']
if not file or not file.filename:
return jsonify({'error': 'No file selected'}), 400
# Process upload
file_handler = get_file_upload_handler()
upload_record, error_message = file_handler.process_upload(
file, 'fixture', user_id
)
if error_message:
return jsonify({'error': error_message}), 400
# Parse fixture file
fixture_parser = get_fixture_parser()
success, parse_error, parsed_matches = fixture_parser.parse_fixture_file(
upload_record.file_path, upload_record.original_filename, user_id
)
if not success:
return jsonify({'error': f'Fixture parsing failed: {parse_error}'}), 400
# Save matches to database
success, save_error, match_ids = fixture_parser.save_matches_to_database(
parsed_matches, upload_record.sha1sum
)
if not success:
return jsonify({'error': f'Database save failed: {save_error}'}), 500
return jsonify({
'message': 'Fixture uploaded and parsed successfully',
'upload_record': upload_record.to_dict(),
'matches_created': len(match_ids),
'match_ids': match_ids
}), 200
except Exception as e:
logger.error(f"API fixture upload error: {str(e)}")
return jsonify({'error': 'Upload processing failed'}), 500
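# Example (hedged sketch): calling the fixture API above from a client.
# Assumes the ``requests`` package, a JWT obtained from the auth endpoints,
# a server on http://localhost:5000, and the blueprint mounted at /upload --
# none of which this module defines.
#
#     import requests
#     resp = requests.post(
#         'http://localhost:5000/upload/api/fixture',
#         headers={'Authorization': f'Bearer {token}'},
#         files={'file': open('fixtures.csv', 'rb')})
#     match_ids = resp.json().get('match_ids', [])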
@bp.route('/api/zip/<int:match_id>', methods=['POST'])
@jwt_required()
def api_upload_zip(match_id):
"""Upload ZIP file for specific match - API endpoint"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
match = Match.query.get(match_id)
if not match:
return jsonify({'error': 'Match not found'}), 404
# Check if ZIP already uploaded
if match.zip_upload_status == 'completed':
return jsonify({'error': 'ZIP file already uploaded for this match'}), 400
if 'file' not in request.files:
return jsonify({'error': 'No file provided'}), 400
file = request.files['file']
if not file or not file.filename:
return jsonify({'error': 'No file selected'}), 400
# Update match status to uploading
match.zip_upload_status = 'uploading'
db.session.commit()
# Process upload
file_handler = get_file_upload_handler()
upload_record, error_message = file_handler.process_upload(
file, 'zip', user_id, match_id
)
if error_message:
match.zip_upload_status = 'failed'
db.session.commit()
return jsonify({'error': error_message}), 400
# Update match with ZIP file information
match.zip_filename = upload_record.filename
match.zip_sha1sum = upload_record.sha1sum
match.zip_upload_status = 'completed'
match.zip_upload_progress = 100.00
# Set match as active (both fixture and ZIP uploaded)
match.set_active()
db.session.commit()
return jsonify({
'message': 'ZIP file uploaded successfully',
'upload_record': upload_record.to_dict(),
'match': match.to_dict()
}), 200
except Exception as e:
logger.error(f"API ZIP upload error: {str(e)}")
if 'match' in locals():
match.zip_upload_status = 'failed'
db.session.commit()
return jsonify({'error': 'Upload processing failed'}), 500
@bp.route('/api/upload-async', methods=['POST'])
@jwt_required()
def api_upload_async():
"""Start asynchronous file upload"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_active:
return jsonify({'error': 'User not found or inactive'}), 404
if 'file' not in request.files:
return jsonify({'error': 'No file provided'}), 400
file = request.files['file']
file_type = request.form.get('file_type', 'fixture')
match_id = request.form.get('match_id')
if not file or not file.filename:
return jsonify({'error': 'No file selected'}), 400
if file_type not in ['fixture', 'zip']:
return jsonify({'error': 'Invalid file type'}), 400
if file_type == 'zip' and not match_id:
return jsonify({'error': 'Match ID required for ZIP uploads'}), 400
# Start async upload
file_handler = get_file_upload_handler()
upload_id, error_message = file_handler.upload_file_async(
file, file_type, user_id, int(match_id) if match_id else None
)
if error_message:
return jsonify({'error': error_message}), 400
return jsonify({
'upload_id': upload_id,
'message': 'Upload started successfully'
}), 202
except Exception as e:
logger.error(f"Async upload start error: {str(e)}")
return jsonify({'error': 'Upload start failed'}), 500
@bp.route('/api/upload-status/<upload_id>', methods=['GET'])
@jwt_required()
def api_upload_status(upload_id):
"""Get status of asynchronous upload"""
try:
file_handler = get_file_upload_handler()
status = file_handler.get_upload_status(upload_id)
return jsonify(status), 200
except Exception as e:
logger.error(f"Upload status error: {str(e)}")
return jsonify({'error': 'Status check failed'}), 500
@bp.route('/api/upload-cancel/<upload_id>', methods=['POST'])
@jwt_required()
def api_upload_cancel(upload_id):
"""Cancel asynchronous upload"""
try:
file_handler = get_file_upload_handler()
cancelled = file_handler.cancel_upload(upload_id)
if cancelled:
return jsonify({'message': 'Upload cancelled successfully'}), 200
else:
return jsonify({'error': 'Upload could not be cancelled'}), 400
except Exception as e:
logger.error(f"Upload cancel error: {str(e)}")
return jsonify({'error': 'Cancel operation failed'}), 500
@bp.route('/api/progress/<int:upload_id>', methods=['GET'])
@jwt_required()
def api_upload_progress(upload_id):
"""Get upload progress for specific upload record"""
try:
user_id = get_jwt_identity()
upload_record = FileUpload.query.filter_by(
id=upload_id,
uploaded_by=user_id
).first()
if not upload_record:
return jsonify({'error': 'Upload not found'}), 404
return jsonify({
'upload_id': upload_record.id,
'progress': float(upload_record.upload_progress),
'status': upload_record.upload_status,
'filename': upload_record.original_filename,
'file_size': upload_record.file_size,
'error_message': upload_record.error_message
}), 200
except Exception as e:
logger.error(f"Progress check error: {str(e)}")
return jsonify({'error': 'Progress check failed'}), 500
@bp.route('/api/uploads', methods=['GET'])
@jwt_required()
def api_list_uploads():
"""List user's uploads with pagination"""
try:
user_id = get_jwt_identity()
page = request.args.get('page', 1, type=int)
per_page = min(request.args.get('per_page', 20, type=int), 100)
file_type = request.args.get('file_type')
status = request.args.get('status')
query = FileUpload.query.filter_by(uploaded_by=user_id)
if file_type:
query = query.filter_by(file_type=file_type)
if status:
query = query.filter_by(upload_status=status)
uploads = query.order_by(FileUpload.created_at.desc()).paginate(
page=page, per_page=per_page, error_out=False
)
return jsonify({
'uploads': [upload.to_dict() for upload in uploads.items],
'pagination': {
'page': page,
'pages': uploads.pages,
'per_page': per_page,
'total': uploads.total,
'has_next': uploads.has_next,
'has_prev': uploads.has_prev
}
}), 200
except Exception as e:
logger.error(f"Upload list error: {str(e)}")
return jsonify({'error': 'Upload list failed'}), 500
@bp.route('/api/statistics', methods=['GET'])
@jwt_required()
def api_upload_statistics():
"""Get upload statistics"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user:
return jsonify({'error': 'User not found'}), 404
file_handler = get_file_upload_handler()
fixture_parser = get_fixture_parser()
upload_stats = file_handler.get_upload_statistics()
parsing_stats = fixture_parser.get_parsing_statistics()
# User-specific statistics
user_uploads = FileUpload.query.filter_by(uploaded_by=user_id).count()
user_matches = Match.query.filter_by(created_by=user_id).count()
stats = {
'global': {
'upload_stats': upload_stats,
'parsing_stats': parsing_stats
},
'user': {
'total_uploads': user_uploads,
'total_matches_created': user_matches
}
}
return jsonify(stats), 200
except Exception as e:
logger.error(f"Statistics error: {str(e)}")
return jsonify({'error': 'Statistics calculation failed'}), 500
@bp.route('/api/cleanup', methods=['POST'])
@jwt_required()
def api_cleanup_uploads():
"""Clean up failed uploads (admin only)"""
try:
user_id = get_jwt_identity()
user = User.query.get(user_id)
if not user or not user.is_admin:
return jsonify({'error': 'Admin privileges required'}), 403
file_handler = get_file_upload_handler()
file_handler.cleanup_failed_uploads()
return jsonify({'message': 'Cleanup completed successfully'}), 200
except Exception as e:
logger.error(f"Cleanup error: {str(e)}")
return jsonify({'error': 'Cleanup failed'}), 500
\ No newline at end of file
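The endpoints above form the programmatic side of the two-stage workflow: a fixture upload creates match records, then a ZIP upload per match (synchronous or asynchronous) activates each record. The following is a minimal client-side sketch of the asynchronous path; BASE_URL, the /upload blueprint prefix, and TOKEN are illustrative assumptions, and the requests library is a client-side dependency rather than part of requirements.txt.

import time
import requests

BASE_URL = "http://localhost:5000"   # where the daemon listens (HOST:PORT) - placeholder
TOKEN = "<jwt-access-token>"         # obtained beforehand from the auth endpoints
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def upload_zip_async(match_id, zip_path):
    """Start an async ZIP upload for a match and poll until it finishes."""
    with open(zip_path, "rb") as fh:
        resp = requests.post(
            f"{BASE_URL}/upload/api/upload-async",   # blueprint prefix assumed
            headers=HEADERS,
            files={"file": fh},
            data={"file_type": "zip", "match_id": str(match_id)},
        )
    resp.raise_for_status()
    upload_id = resp.json()["upload_id"]
    while True:
        status = requests.get(
            f"{BASE_URL}/upload/api/upload-status/{upload_id}", headers=HEADERS
        ).json()
        if status.get("status") in ("completed", "failed"):
            return status
        time.sleep(2)   # poll every couple of seconds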
# Utilities package
\ No newline at end of file
import logging
import json
from datetime import datetime
from flask import request, current_app
from app import db
from app.models import SystemLog
logger = logging.getLogger(__name__)
def log_security_event(event_type, ip_address, user_id=None, username=None, extra_data=None):
"""
Log security-related events
Args:
event_type: Type of security event
ip_address: Client IP address
user_id: User ID (if applicable)
username: Username (if applicable)
extra_data: Additional data to log
"""
try:
message = f"Security Event: {event_type}"
if username:
message += f" - Username: {username}"
extra_info = {
'event_type': event_type,
'ip_address': ip_address,
'user_agent': request.headers.get('User-Agent', '') if request else '',
'timestamp': datetime.utcnow().isoformat()
}
if username:
extra_info['username'] = username
if extra_data:
extra_info.update(extra_data)
# Log to database
SystemLog.log(
level='INFO' if event_type.endswith('_SUCCESS') else 'WARNING',
message=message,
module='security',
user_id=user_id,
ip_address=ip_address,
user_agent=request.headers.get('User-Agent', '') if request else None,
extra_data=extra_info
)
# Also log to application logger
logger.info(f"{message} - IP: {ip_address}")
except Exception as e:
logger.error(f"Failed to log security event: {str(e)}")
def log_file_operation(operation_type, filename, user_id=None, match_id=None, upload_id=None,
status='SUCCESS', error_message=None, extra_data=None):
"""
Log file operations
Args:
operation_type: Type of file operation
filename: Name of the file
user_id: User ID performing operation
match_id: Associated match ID
upload_id: Associated upload ID
status: Operation status
error_message: Error message if failed
extra_data: Additional data to log
"""
try:
message = f"File Operation: {operation_type} - {filename}"
if status != 'SUCCESS':
message += f" - Status: {status}"
extra_info = {
'operation_type': operation_type,
'filename': filename,
'status': status,
'timestamp': datetime.utcnow().isoformat()
}
if error_message:
extra_info['error_message'] = error_message
if extra_data:
extra_info.update(extra_data)
# Determine log level based on status
log_level = 'INFO' if status == 'SUCCESS' else 'ERROR'
# Log to database
SystemLog.log(
level=log_level,
message=message,
module='file_operations',
user_id=user_id,
match_id=match_id,
upload_id=upload_id,
ip_address=request.environ.get('HTTP_X_REAL_IP', request.remote_addr) if request else None,
extra_data=extra_info
)
# Also log to application logger
if status == 'SUCCESS':
logger.info(message)
else:
logger.error(f"{message} - Error: {error_message}")
except Exception as e:
logger.error(f"Failed to log file operation: {str(e)}")
def log_database_operation(operation_type, table_name, record_id=None, user_id=None,
status='SUCCESS', error_message=None, extra_data=None):
"""
Log database operations
Args:
operation_type: Type of database operation (CREATE, UPDATE, DELETE)
table_name: Name of the table
record_id: ID of the affected record
user_id: User ID performing operation
status: Operation status
error_message: Error message if failed
extra_data: Additional data to log
"""
try:
message = f"Database Operation: {operation_type} on {table_name}"
if record_id:
message += f" (ID: {record_id})"
extra_info = {
'operation_type': operation_type,
'table_name': table_name,
'record_id': record_id,
'status': status,
'timestamp': datetime.utcnow().isoformat()
}
if error_message:
extra_info['error_message'] = error_message
if extra_data:
extra_info.update(extra_data)
# Determine log level based on status
log_level = 'INFO' if status == 'SUCCESS' else 'ERROR'
# Log to database (avoid recursion for SystemLog operations)
if table_name != 'system_logs':
SystemLog.log(
level=log_level,
message=message,
module='database',
user_id=user_id,
ip_address=request.environ.get('HTTP_X_REAL_IP', request.remote_addr) if request else None,
extra_data=extra_info
)
# Also log to application logger
if status == 'SUCCESS':
logger.info(message)
else:
logger.error(f"{message} - Error: {error_message}")
except Exception as e:
logger.error(f"Failed to log database operation: {str(e)}")
def log_api_request(endpoint, method, user_id=None, status_code=200, response_time=None,
error_message=None, extra_data=None):
"""
Log API requests
Args:
endpoint: API endpoint
method: HTTP method
user_id: User ID making request
status_code: HTTP status code
response_time: Response time in milliseconds
error_message: Error message if failed
extra_data: Additional data to log
"""
try:
message = f"API Request: {method} {endpoint} - Status: {status_code}"
if response_time:
message += f" - Time: {response_time}ms"
extra_info = {
'endpoint': endpoint,
'method': method,
'status_code': status_code,
'response_time': response_time,
'timestamp': datetime.utcnow().isoformat()
}
if request:
extra_info.update({
'ip_address': request.environ.get('HTTP_X_REAL_IP', request.remote_addr),
'user_agent': request.headers.get('User-Agent', ''),
'content_length': request.content_length
})
if error_message:
extra_info['error_message'] = error_message
if extra_data:
extra_info.update(extra_data)
# Determine log level based on status code
if status_code < 400:
log_level = 'INFO'
elif status_code < 500:
log_level = 'WARNING'
else:
log_level = 'ERROR'
# Log to database
SystemLog.log(
level=log_level,
message=message,
module='api',
user_id=user_id,
ip_address=request.environ.get('HTTP_X_REAL_IP', request.remote_addr) if request else None,
user_agent=request.headers.get('User-Agent', '') if request else None,
extra_data=extra_info
)
# Also log to application logger
if status_code < 400:
logger.info(message)
elif status_code < 500:
logger.warning(message)
else:
logger.error(f"{message} - Error: {error_message}")
except Exception as e:
logger.error(f"Failed to log API request: {str(e)}")
def log_daemon_event(event_type, message, status='INFO', error_message=None, extra_data=None):
"""
Log daemon-related events
Args:
event_type: Type of daemon event
message: Log message
status: Log level (INFO, WARNING, ERROR)
error_message: Error message if applicable
extra_data: Additional data to log
"""
try:
full_message = f"Daemon Event: {event_type} - {message}"
extra_info = {
'event_type': event_type,
'timestamp': datetime.utcnow().isoformat()
}
if error_message:
extra_info['error_message'] = error_message
if extra_data:
extra_info.update(extra_data)
# Log to database
SystemLog.log(
level=status,
message=full_message,
module='daemon',
extra_data=extra_info
)
# Also log to application logger
if status == 'INFO':
logger.info(full_message)
elif status == 'WARNING':
logger.warning(full_message)
else:
logger.error(f"{full_message} - Error: {error_message}")
except Exception as e:
logger.error(f"Failed to log daemon event: {str(e)}")
def log_upload_progress(upload_id, progress, status, user_id=None, match_id=None,
error_message=None, extra_data=None):
"""
Log upload progress events
Args:
upload_id: Upload ID
progress: Progress percentage
status: Upload status
user_id: User ID performing upload
match_id: Associated match ID
error_message: Error message if failed
extra_data: Additional data to log
"""
try:
message = f"Upload Progress: ID {upload_id} - {progress}% - Status: {status}"
extra_info = {
'upload_id': upload_id,
'progress': progress,
'status': status,
'timestamp': datetime.utcnow().isoformat()
}
if error_message:
extra_info['error_message'] = error_message
if extra_data:
extra_info.update(extra_data)
# Determine log level based on status
if status in ['completed', 'uploading']:
log_level = 'INFO'
elif status == 'failed':
log_level = 'ERROR'
else:
log_level = 'WARNING'
# Log to database
SystemLog.log(
level=log_level,
message=message,
module='upload',
user_id=user_id,
match_id=match_id,
upload_id=upload_id,
extra_data=extra_info
)
# Also log to application logger (only for significant events to avoid spam)
if progress % 25 == 0 or status in ['completed', 'failed']:
if log_level == 'INFO':
logger.info(message)
elif log_level == 'WARNING':
logger.warning(message)
else:
logger.error(f"{message} - Error: {error_message}")
except Exception as e:
logger.error(f"Failed to log upload progress: {str(e)}")
class RequestLogger:
"""Middleware for logging HTTP requests"""
def __init__(self, app=None):
self.app = app
if app is not None:
self.init_app(app)
def init_app(self, app):
"""Initialize request logger with Flask app"""
app.before_request(self.before_request)
app.after_request(self.after_request)
def before_request(self):
"""Log request start"""
request.start_time = datetime.utcnow()
def after_request(self, response):
"""Log request completion"""
try:
if hasattr(request, 'start_time'):
response_time = (datetime.utcnow() - request.start_time).total_seconds() * 1000
# Skip logging for static files and health checks
if not (request.endpoint and
(request.endpoint.startswith('static') or
request.endpoint == 'health_check')):
log_api_request(
endpoint=request.endpoint or request.path,
method=request.method,
user_id=getattr(request, 'user_id', None),
status_code=response.status_code,
response_time=round(response_time, 2)
)
except Exception as e:
logger.error(f"Request logging error: {str(e)}")
return response
def setup_logging(app):
"""Setup comprehensive logging for the application"""
# Initialize request logger
request_logger = RequestLogger(app)
# Configure application logger
if not app.debug:
# Production logging
import logging.handlers
file_handler = logging.handlers.RotatingFileHandler(
app.config.get('DAEMON_LOG_FILE', 'fixture-daemon.log'),
maxBytes=10240000, # 10MB
backupCount=10
)
file_handler.setFormatter(logging.Formatter(
'%(asctime)s %(levelname)s: %(message)s [in %(pathname)s:%(lineno)d]'
))
file_handler.setLevel(logging.INFO)
app.logger.addHandler(file_handler)
app.logger.setLevel(logging.INFO)
# Log application startup
app.logger.info('Fixture Daemon startup')
return request_logger
\ No newline at end of file
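A minimal sketch of how the helpers above would be wired into the Flask application factory; the factory body shown here is illustrative only, while the real create_app() in app/__init__.py also configures the database, extensions, and blueprints.

from flask import Flask
from app.utils.logging import setup_logging   # the module defined above

def create_app_sketch(config_name="production"):
    """Illustrative factory fragment, not the application's real create_app()."""
    app = Flask(__name__)
    # ... load configuration for config_name, init db, register blueprints ...
    setup_logging(app)   # installs RequestLogger and the rotating file handler
    return app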
import re
import secrets
import hashlib
import time
from datetime import datetime, timedelta
from functools import wraps
from flask import request, jsonify, current_app
from flask_login import current_user
try:
    import redis  # optional: only needed for Redis-backed (distributed) rate limiting
except ImportError:
    redis = None  # fall back to the in-memory storage below
import logging
logger = logging.getLogger(__name__)
# Rate limiting storage (in-memory fallback if Redis not available)
rate_limit_storage = {}
def get_redis_client():
    """Get Redis client for rate limiting (optional)"""
    if redis is None:
        return None
    try:
        return redis.Redis(
            host=current_app.config.get('REDIS_HOST', 'localhost'),
            port=current_app.config.get('REDIS_PORT', 6379),
            db=current_app.config.get('REDIS_DB', 0),
            decode_responses=True
        )
    except redis.ConnectionError:
        return None
def rate_limit_check(identifier, action, max_attempts=5, window_minutes=15):
"""
Check if an action is rate limited
Args:
identifier: IP address or user ID
action: Action type (login, register, etc.)
max_attempts: Maximum attempts allowed
window_minutes: Time window in minutes
Returns:
bool: True if action is allowed, False if rate limited
"""
try:
redis_client = get_redis_client()
key = f"rate_limit:{action}:{identifier}"
if redis_client:
# Use Redis for distributed rate limiting
current_time = int(time.time())
window_start = current_time - (window_minutes * 60)
# Remove old entries
redis_client.zremrangebyscore(key, 0, window_start)
# Count current attempts
current_attempts = redis_client.zcard(key)
if current_attempts >= max_attempts:
return False
# Add current attempt
redis_client.zadd(key, {str(current_time): current_time})
redis_client.expire(key, window_minutes * 60)
return True
else:
# Fallback to in-memory storage
current_time = datetime.utcnow()
window_start = current_time - timedelta(minutes=window_minutes)
if key not in rate_limit_storage:
rate_limit_storage[key] = []
# Remove old entries
rate_limit_storage[key] = [
timestamp for timestamp in rate_limit_storage[key]
if timestamp > window_start
]
if len(rate_limit_storage[key]) >= max_attempts:
return False
# Add current attempt
rate_limit_storage[key].append(current_time)
return True
except Exception as e:
logger.error(f"Rate limit check error: {str(e)}")
# Allow action if rate limiting fails
return True
def validate_password_strength(password):
"""
Validate password strength
Requirements:
- At least 8 characters long
- Contains uppercase letter
- Contains lowercase letter
- Contains digit
- Contains special character
- Not in common weak passwords list
Args:
password: Password to validate
Returns:
bool: True if password meets requirements
"""
if len(password) < 8:
return False
# Check for required character types
if not re.search(r'[A-Z]', password):
return False
if not re.search(r'[a-z]', password):
return False
if not re.search(r'\d', password):
return False
if not re.search(r'[!@#$%^&*(),.?":{}|<>]', password):
return False
# Check against common weak passwords
weak_passwords = {
'password', '12345678', 'qwerty123', 'admin123',
'password123', '123456789', 'welcome123', 'letmein123',
'monkey123', 'dragon123', 'master123', 'shadow123'
}
if password.lower() in weak_passwords:
return False
return True
def generate_secure_token(length=32):
"""
Generate cryptographically secure random token
Args:
length: Token length in bytes
Returns:
str: Hex-encoded secure token
"""
return secrets.token_hex(length)
def generate_csrf_token():
"""Generate CSRF token"""
return generate_secure_token(16)
def validate_csrf_token(token, session_token):
"""Validate CSRF token"""
return secrets.compare_digest(token, session_token)
def hash_file_content(file_path):
"""
Calculate SHA1 hash of file content
Args:
file_path: Path to file
Returns:
str: SHA1 hash in hexadecimal
"""
sha1_hash = hashlib.sha1()
try:
with open(file_path, 'rb') as f:
for chunk in iter(lambda: f.read(4096), b""):
sha1_hash.update(chunk)
return sha1_hash.hexdigest()
except Exception as e:
logger.error(f"File hash calculation error: {str(e)}")
return None
def validate_file_type(filename, allowed_extensions):
"""
Validate file type by extension
Args:
filename: Name of the file
allowed_extensions: Set of allowed extensions
Returns:
bool: True if file type is allowed
"""
if not filename or '.' not in filename:
    return False
extension = filename.rsplit('.', 1)[-1].lower()
return extension in allowed_extensions
def sanitize_filename(filename):
"""
Sanitize filename for safe storage
Args:
filename: Original filename
Returns:
str: Sanitized filename
"""
# Remove path components
filename = filename.split('/')[-1].split('\\')[-1]
# Replace dangerous characters
filename = re.sub(r'[^\w\-_\.]', '_', filename)
# Limit length
if len(filename) > 255:
name, ext = filename.rsplit('.', 1) if '.' in filename else (filename, '')
max_name_length = 255 - len(ext) - 1 if ext else 255
filename = name[:max_name_length] + ('.' + ext if ext else '')
return filename
def validate_ip_address(ip_address):
"""
Validate IP address format
Args:
ip_address: IP address string
Returns:
bool: True if valid IP address
"""
import ipaddress
try:
ipaddress.ip_address(ip_address)
return True
except ValueError:
return False
def is_safe_url(target):
"""
Check if URL is safe for redirect
Args:
target: Target URL
Returns:
bool: True if URL is safe
"""
from urllib.parse import urlparse, urljoin
from flask import request
ref_url = urlparse(request.host_url)
test_url = urlparse(urljoin(request.host_url, target))
return test_url.scheme in ('http', 'https') and ref_url.netloc == test_url.netloc
def require_admin(f):
"""
Decorator to require admin privileges
Args:
f: Function to decorate
Returns:
Decorated function
"""
@wraps(f)
def decorated_function(*args, **kwargs):
if not current_user.is_authenticated or not current_user.is_admin:
return jsonify({'error': 'Admin privileges required'}), 403
return f(*args, **kwargs)
return decorated_function
def require_active_user(f):
"""
Decorator to require active user
Args:
f: Function to decorate
Returns:
Decorated function
"""
@wraps(f)
def decorated_function(*args, **kwargs):
if not current_user.is_authenticated or not current_user.is_active:
return jsonify({'error': 'Active user account required'}), 403
return f(*args, **kwargs)
return decorated_function
def validate_file_size(file_size, max_size=None):
"""
Validate file size
Args:
file_size: Size of file in bytes
max_size: Maximum allowed size (defaults to config)
Returns:
bool: True if file size is acceptable
"""
if max_size is None:
max_size = current_app.config.get('MAX_CONTENT_LENGTH', 500 * 1024 * 1024)
return file_size <= max_size
def detect_malicious_content(file_path):
"""
Basic malicious content detection
Args:
file_path: Path to file to check
Returns:
bool: True if potentially malicious content detected
"""
try:
# Check file size (extremely large files might be suspicious)
import os
file_size = os.path.getsize(file_path)
if file_size > 1024 * 1024 * 1024: # 1GB
return True
# Check for executable signatures in first few bytes
with open(file_path, 'rb') as f:
header = f.read(1024)
# Common executable signatures
malicious_signatures = [
b'MZ', # Windows PE
b'\x7fELF', # Linux ELF
b'\xfe\xed\xfa', # macOS Mach-O
b'#!/bin/sh', # Shell script
b'#!/bin/bash', # Bash script
b'<?php', # PHP script
]
for signature in malicious_signatures:
if header.startswith(signature):
return True
return False
except Exception as e:
logger.error(f"Malicious content detection error: {str(e)}")
# Err on the side of caution
return True
def generate_api_key():
"""Generate API key for external integrations"""
return f"fxd_{generate_secure_token(24)}"
def validate_api_key(api_key):
"""
Validate API key format
Args:
api_key: API key to validate
Returns:
bool: True if API key format is valid
"""
return bool(re.match(r'^fxd_[a-f0-9]{48}$', api_key))
class SecurityHeaders:
"""Security headers middleware"""
@staticmethod
def add_security_headers(response):
"""Add security headers to response"""
response.headers['X-Content-Type-Options'] = 'nosniff'
response.headers['X-Frame-Options'] = 'DENY'
response.headers['X-XSS-Protection'] = '1; mode=block'
response.headers['Strict-Transport-Security'] = 'max-age=31536000; includeSubDomains'
response.headers['Content-Security-Policy'] = (
"default-src 'self'; "
"script-src 'self' 'unsafe-inline'; "
"style-src 'self' 'unsafe-inline'; "
"img-src 'self' data:; "
"font-src 'self'; "
"connect-src 'self'"
)
return response
\ No newline at end of file
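A minimal sketch of how these helpers are typically attached to the application: security headers on every response and rate limiting in a credential-handling view. The module path app.utils.security and the /example-login route are assumptions for illustration; the real auth blueprint is defined elsewhere in the app package.

from flask import Flask, jsonify, request
from app.utils.security import SecurityHeaders, rate_limit_check   # module path assumed

def register_security(app: Flask):
    """Attach security headers globally and demonstrate rate limiting."""
    app.after_request(SecurityHeaders.add_security_headers)

    @app.route("/example-login", methods=["POST"])   # illustrative route only
    def example_login():
        ip = request.environ.get("HTTP_X_REAL_IP", request.remote_addr)
        # Allow at most 5 attempts per IP within a 15-minute window
        if not rate_limit_check(ip, "login", max_attempts=5, window_minutes=15):
            return jsonify({"error": "Too many attempts, try again later"}), 429
        # ... verify credentials here ...
        return jsonify({"message": "ok"}), 200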
import os
from dotenv import load_dotenv
load_dotenv()
class Config:
"""Base configuration class"""
SECRET_KEY = os.environ.get('SECRET_KEY') or 'dev-secret-key-change-in-production'
# Database Configuration
MYSQL_HOST = os.environ.get('MYSQL_HOST') or 'localhost'
MYSQL_PORT = int(os.environ.get('MYSQL_PORT') or 3306)
MYSQL_USER = os.environ.get('MYSQL_USER') or 'root'
MYSQL_PASSWORD = os.environ.get('MYSQL_PASSWORD') or ''
MYSQL_DATABASE = os.environ.get('MYSQL_DATABASE') or 'fixture_manager'
SQLALCHEMY_DATABASE_URI = f"mysql+pymysql://{MYSQL_USER}:{MYSQL_PASSWORD}@{MYSQL_HOST}:{MYSQL_PORT}/{MYSQL_DATABASE}"
SQLALCHEMY_TRACK_MODIFICATIONS = False
SQLALCHEMY_ENGINE_OPTIONS = {
'pool_pre_ping': True,
'pool_recycle': 300,
'pool_timeout': 20,
'max_overflow': 0
}
# File Upload Configuration
UPLOAD_FOLDER = os.environ.get('UPLOAD_FOLDER') or '/var/lib/fixture-daemon/uploads'
MAX_CONTENT_LENGTH = int(os.environ.get('MAX_CONTENT_LENGTH') or 500 * 1024 * 1024) # 500MB
ALLOWED_FIXTURE_EXTENSIONS = {'csv', 'xlsx', 'xls'}
ALLOWED_ZIP_EXTENSIONS = {'zip'}
# Security Configuration
JWT_SECRET_KEY = os.environ.get('JWT_SECRET_KEY') or SECRET_KEY
JWT_ACCESS_TOKEN_EXPIRES = int(os.environ.get('JWT_ACCESS_TOKEN_EXPIRES') or 3600) # 1 hour
BCRYPT_LOG_ROUNDS = int(os.environ.get('BCRYPT_LOG_ROUNDS') or 12)
# Daemon Configuration
DAEMON_PID_FILE = os.environ.get('DAEMON_PID_FILE') or '/var/run/fixture-daemon.pid'
DAEMON_LOG_FILE = os.environ.get('DAEMON_LOG_FILE') or '/var/log/fixture-daemon.log'
DAEMON_WORKING_DIR = os.environ.get('DAEMON_WORKING_DIR') or '/var/lib/fixture-daemon'
# Web Server Configuration
HOST = os.environ.get('HOST') or '0.0.0.0'
PORT = int(os.environ.get('PORT') or 5000)
DEBUG = os.environ.get('DEBUG', 'False').lower() == 'true'
# Logging Configuration
LOG_LEVEL = os.environ.get('LOG_LEVEL') or 'INFO'
LOG_FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
# File Processing Configuration
CHUNK_SIZE = int(os.environ.get('CHUNK_SIZE') or 8192) # 8KB chunks for file processing
MAX_CONCURRENT_UPLOADS = int(os.environ.get('MAX_CONCURRENT_UPLOADS') or 5)
@staticmethod
def init_app(app):
"""Initialize application with configuration"""
# Create necessary directories
os.makedirs(Config.UPLOAD_FOLDER, exist_ok=True)
os.makedirs(os.path.dirname(Config.DAEMON_PID_FILE), exist_ok=True)
os.makedirs(os.path.dirname(Config.DAEMON_LOG_FILE), exist_ok=True)
os.makedirs(Config.DAEMON_WORKING_DIR, exist_ok=True)
class DevelopmentConfig(Config):
"""Development configuration"""
DEBUG = True
SQLALCHEMY_ECHO = True
class ProductionConfig(Config):
"""Production configuration"""
DEBUG = False
SQLALCHEMY_ECHO = False
@classmethod
def init_app(cls, app):
Config.init_app(app)
# Production-specific initialization
import logging
from logging.handlers import RotatingFileHandler
if not app.debug:
file_handler = RotatingFileHandler(
cls.DAEMON_LOG_FILE,
maxBytes=10240000,
backupCount=10
)
file_handler.setFormatter(logging.Formatter(cls.LOG_FORMAT))
file_handler.setLevel(logging.INFO)
app.logger.addHandler(file_handler)
class TestingConfig(Config):
"""Testing configuration"""
TESTING = True
SQLALCHEMY_DATABASE_URI = 'sqlite:///:memory:'
WTF_CSRF_ENABLED = False
config = {
'development': DevelopmentConfig,
'production': ProductionConfig,
'testing': TestingConfig,
'default': DevelopmentConfig
}
\ No newline at end of file
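A minimal sketch of how the config mapping above is consumed when building the app; the real factory is create_app() in the app package, and this fragment only illustrates the selection pattern.

import os
from flask import Flask

def make_app(config_name=None):
    """Illustrative only: pick a Config class by name and apply it."""
    config_name = config_name or os.environ.get("FLASK_CONFIG", "default")
    app = Flask(__name__)
    app.config.from_object(config[config_name])   # load the chosen Config class
    config[config_name].init_app(app)             # create runtime directories / handlers
    return app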
#!/usr/bin/env python3
"""
Fixture Manager Daemon
A comprehensive Python daemon system for Linux servers with secure web dashboard
and RESTful API with robust authentication mechanisms.
"""
import os
import sys
import signal
import time
import logging
import argparse
import threading
from pathlib import Path
from datetime import datetime
from daemon import DaemonContext
from daemon.pidfile import TimeoutPIDLockFile
import lockfile
import psutil
from flask import Flask
from werkzeug.serving import make_server
import click
# Add the project root to Python path
project_root = Path(__file__).parent
sys.path.insert(0, str(project_root))
from app import create_app, db
from app.database import init_database_manager
from app.utils.logging import setup_logging, log_daemon_event
from config import config
class FixtureDaemon:
"""Main daemon class for the Fixture Manager system"""
def __init__(self, config_name='production'):
self.config_name = config_name
self.app = None
self.server = None
self.running = False
self.shutdown_event = threading.Event()
self.logger = None
def setup_logging(self):
"""Setup daemon logging"""
log_file = config[self.config_name].DAEMON_LOG_FILE
log_level = getattr(logging, config[self.config_name].LOG_LEVEL.upper())
# Create log directory if it doesn't exist
os.makedirs(os.path.dirname(log_file), exist_ok=True)
# Configure logging
logging.basicConfig(
level=log_level,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
handlers=[
logging.FileHandler(log_file),
logging.StreamHandler(sys.stdout)
]
)
self.logger = logging.getLogger('fixture-daemon')
self.logger.info("Daemon logging initialized")
def create_flask_app(self):
"""Create and configure Flask application"""
try:
self.app = create_app(self.config_name)
# Initialize database manager
with self.app.app_context():
db_manager = init_database_manager(self.app)
# Initialize database if needed
schema_file = project_root / 'database' / 'schema.sql'
if schema_file.exists():
success = db_manager.initialize_database(str(schema_file))
if not success:
self.logger.error("Database initialization failed")
return False
else:
self.logger.warning("Schema file not found, using SQLAlchemy create_all")
db.create_all()
self.logger.info("Flask application created successfully")
return True
except Exception as e:
self.logger.error(f"Failed to create Flask application: {str(e)}")
return False
def create_server(self):
"""Create HTTP server"""
try:
host = self.app.config['HOST']
port = self.app.config['PORT']
self.server = make_server(
host, port, self.app,
threaded=True,
request_handler=None
)
self.logger.info(f"HTTP server created on {host}:{port}")
return True
except Exception as e:
self.logger.error(f"Failed to create HTTP server: {str(e)}")
return False
def signal_handler(self, signum, frame):
"""Handle shutdown signals"""
signal_names = {
signal.SIGTERM: 'SIGTERM',
signal.SIGINT: 'SIGINT',
signal.SIGHUP: 'SIGHUP'
}
signal_name = signal_names.get(signum, f'Signal {signum}')
self.logger.info(f"Received {signal_name}, initiating shutdown...")
if signum == signal.SIGHUP:
# Reload configuration on SIGHUP
self.reload_config()
else:
# Shutdown on SIGTERM/SIGINT
self.shutdown()
def reload_config(self):
"""Reload daemon configuration"""
try:
self.logger.info("Reloading configuration...")
# In a production system, you might want to reload config here
log_daemon_event('CONFIG_RELOAD', 'Configuration reloaded successfully')
except Exception as e:
self.logger.error(f"Configuration reload failed: {str(e)}")
log_daemon_event('CONFIG_RELOAD_FAILED', f'Configuration reload failed: {str(e)}', 'ERROR')
def shutdown(self):
"""Graceful shutdown"""
self.logger.info("Initiating graceful shutdown...")
self.running = False
self.shutdown_event.set()
if self.server:
self.logger.info("Shutting down HTTP server...")
self.server.shutdown()
log_daemon_event('DAEMON_SHUTDOWN', 'Daemon shutdown completed')
def run_server(self):
"""Run the HTTP server in a separate thread"""
try:
self.logger.info("Starting HTTP server...")
self.server.serve_forever()
except Exception as e:
if self.running: # Only log if not shutting down
self.logger.error(f"HTTP server error: {str(e)}")
log_daemon_event('SERVER_ERROR', f'HTTP server error: {str(e)}', 'ERROR')
def run_maintenance_tasks(self):
    """Run periodic maintenance tasks"""
    try:
        with self.app.app_context():
            from app.database import get_db_manager
            from app.upload.file_handler import get_file_upload_handler
            db_manager = get_db_manager()
            file_handler = get_file_upload_handler()
            now = time.time()
            # Clean up expired sessions (every hour)
            if now - getattr(self, '_last_session_cleanup', 0) >= 3600:
                self._last_session_cleanup = now
                expired_sessions = db_manager.cleanup_expired_sessions()
                if expired_sessions > 0:
                    self.logger.info(f"Cleaned up {expired_sessions} expired sessions")
            # Clean up failed uploads (every 6 hours)
            if now - getattr(self, '_last_upload_cleanup', 0) >= 21600:
                self._last_upload_cleanup = now
                file_handler.cleanup_failed_uploads()
                self.logger.info("Cleaned up failed uploads")
            # Clean up old logs (daily)
            if now - getattr(self, '_last_log_cleanup', 0) >= 86400:
                self._last_log_cleanup = now
                old_logs = db_manager.cleanup_old_logs(30)  # Keep 30 days
                if old_logs > 0:
                    self.logger.info(f"Cleaned up {old_logs} old log entries")
    except Exception as e:
        self.logger.error(f"Maintenance task error: {str(e)}")
def run(self):
"""Main daemon run loop"""
try:
self.setup_logging()
log_daemon_event('DAEMON_START', 'Fixture daemon starting up')
# Create Flask app
if not self.create_flask_app():
log_daemon_event('DAEMON_START_FAILED', 'Flask app creation failed', 'ERROR')
return False
# Create HTTP server
if not self.create_server():
log_daemon_event('DAEMON_START_FAILED', 'HTTP server creation failed', 'ERROR')
return False
# Setup signal handlers
signal.signal(signal.SIGTERM, self.signal_handler)
signal.signal(signal.SIGINT, self.signal_handler)
signal.signal(signal.SIGHUP, self.signal_handler)
# Start HTTP server in separate thread
server_thread = threading.Thread(target=self.run_server, daemon=True)
server_thread.start()
self.running = True
log_daemon_event('DAEMON_STARTED', f'Daemon started successfully on {self.app.config["HOST"]}:{self.app.config["PORT"]}')
# Main loop
while self.running:
try:
# Run maintenance tasks
self.run_maintenance_tasks()
# Sleep for 60 seconds or until shutdown
if self.shutdown_event.wait(timeout=60):
break
except KeyboardInterrupt:
self.logger.info("Keyboard interrupt received")
break
except Exception as e:
self.logger.error(f"Main loop error: {str(e)}")
time.sleep(5) # Brief pause before continuing
# Wait for server thread to finish
if server_thread.is_alive():
server_thread.join(timeout=10)
self.logger.info("Daemon shutdown complete")
return True
except Exception as e:
self.logger.error(f"Daemon run error: {str(e)}")
log_daemon_event('DAEMON_ERROR', f'Daemon error: {str(e)}', 'ERROR')
return False
def get_daemon_pid(pid_file):
"""Get daemon PID from PID file"""
try:
if os.path.exists(pid_file):
with open(pid_file, 'r') as f:
return int(f.read().strip())
except (ValueError, IOError):
pass
return None
def is_daemon_running(pid):
"""Check if daemon is running"""
if pid is None:
return False
try:
return psutil.pid_exists(pid)
except Exception:
return False
def start_daemon(config_name='production', foreground=False):
"""Start the daemon"""
daemon_config = config[config_name]
pid_file = daemon_config.DAEMON_PID_FILE
working_dir = daemon_config.DAEMON_WORKING_DIR
# Check if already running
existing_pid = get_daemon_pid(pid_file)
if is_daemon_running(existing_pid):
print(f"Daemon already running with PID {existing_pid}")
return False
# Create working directory
os.makedirs(working_dir, exist_ok=True)
os.makedirs(os.path.dirname(pid_file), exist_ok=True)
daemon = FixtureDaemon(config_name)
if foreground:
# Run in foreground
print("Starting daemon in foreground mode...")
return daemon.run()
else:
# Run as daemon
print("Starting daemon in background mode...")
# Create daemon context
pidfile = TimeoutPIDLockFile(pid_file, timeout=10)
context = DaemonContext(
pidfile=pidfile,
working_directory=working_dir,
umask=0o002,
prevent_core=True,
)
try:
with context:
return daemon.run()
except lockfile.AlreadyLocked:
print("Daemon is already running")
return False
except Exception as e:
print(f"Failed to start daemon: {str(e)}")
return False
def stop_daemon(config_name='production'):
"""Stop the daemon"""
daemon_config = config[config_name]
pid_file = daemon_config.DAEMON_PID_FILE
pid = get_daemon_pid(pid_file)
if not is_daemon_running(pid):
print("Daemon is not running")
return True
try:
print(f"Stopping daemon (PID {pid})...")
os.kill(pid, signal.SIGTERM)
# Wait for daemon to stop
for _ in range(30): # Wait up to 30 seconds
if not is_daemon_running(pid):
print("Daemon stopped successfully")
# Clean up PID file
try:
os.remove(pid_file)
except OSError:
pass
return True
time.sleep(1)
# Force kill if still running
print("Daemon did not stop gracefully, forcing termination...")
os.kill(pid, signal.SIGKILL)
time.sleep(2)
if not is_daemon_running(pid):
print("Daemon forcefully terminated")
try:
os.remove(pid_file)
except OSError:
pass
return True
else:
print("Failed to stop daemon")
return False
except ProcessLookupError:
print("Daemon process not found")
try:
os.remove(pid_file)
except OSError:
pass
return True
except PermissionError:
print("Permission denied - run as root or daemon owner")
return False
except Exception as e:
print(f"Error stopping daemon: {str(e)}")
return False
def restart_daemon(config_name='production'):
"""Restart the daemon"""
print("Restarting daemon...")
stop_daemon(config_name)
time.sleep(2)
return start_daemon(config_name)
def reload_daemon(config_name='production'):
"""Reload daemon configuration"""
daemon_config = config[config_name]
pid_file = daemon_config.DAEMON_PID_FILE
pid = get_daemon_pid(pid_file)
if not is_daemon_running(pid):
print("Daemon is not running")
return False
try:
print(f"Reloading daemon configuration (PID {pid})...")
os.kill(pid, signal.SIGHUP)
print("Reload signal sent successfully")
return True
except Exception as e:
print(f"Error reloading daemon: {str(e)}")
return False
def status_daemon(config_name='production'):
"""Check daemon status"""
daemon_config = config[config_name]
pid_file = daemon_config.DAEMON_PID_FILE
pid = get_daemon_pid(pid_file)
if is_daemon_running(pid):
try:
process = psutil.Process(pid)
print(f"Daemon is running (PID {pid})")
print(f" Status: {process.status()}")
print(f" CPU: {process.cpu_percent()}%")
print(f" Memory: {process.memory_info().rss / 1024 / 1024:.1f} MB")
print(f" Started: {datetime.fromtimestamp(process.create_time())}")
return True
except psutil.NoSuchProcess:
print("Daemon PID file exists but process not found")
return False
else:
print("Daemon is not running")
return False
@click.command()
@click.argument('action', type=click.Choice(['start', 'stop', 'restart', 'reload', 'status']))
@click.option('--config', '-c', default='production', help='Configuration name')
@click.option('--foreground', '-f', is_flag=True, help='Run in foreground (for start action)')
def main(action, config, foreground):
"""Fixture Manager Daemon Control Script"""
if action == 'start':
success = start_daemon(config, foreground)
sys.exit(0 if success else 1)
elif action == 'stop':
success = stop_daemon(config)
sys.exit(0 if success else 1)
elif action == 'restart':
success = restart_daemon(config)
sys.exit(0 if success else 1)
elif action == 'reload':
success = reload_daemon(config)
sys.exit(0 if success else 1)
elif action == 'status':
success = status_daemon(config)
sys.exit(0 if success else 1)
if __name__ == '__main__':
main()
\ No newline at end of file
-- Fixture Manager Database Schema
-- MySQL DDL Script for automated database creation
-- Create database if it doesn't exist
CREATE DATABASE IF NOT EXISTS fixture_manager
CHARACTER SET utf8mb4
COLLATE utf8mb4_unicode_ci;
USE fixture_manager;
-- Users table for authentication
CREATE TABLE IF NOT EXISTS users (
id INT AUTO_INCREMENT PRIMARY KEY,
username VARCHAR(80) NOT NULL UNIQUE,
email VARCHAR(120) NOT NULL UNIQUE,
password_hash VARCHAR(255) NOT NULL,
is_active BOOLEAN DEFAULT TRUE,
is_admin BOOLEAN DEFAULT FALSE,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
last_login TIMESTAMP NULL,
INDEX idx_username (username),
INDEX idx_email (email),
INDEX idx_active (is_active)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
-- Primary matches table storing core fixture data
CREATE TABLE IF NOT EXISTS matches (
id INT AUTO_INCREMENT PRIMARY KEY,
match_number INT NOT NULL UNIQUE COMMENT 'Match # from fixture file',
fighter1_township VARCHAR(255) NOT NULL COMMENT 'Fighter1 (Township)',
fighter2_township VARCHAR(255) NOT NULL COMMENT 'Fighter2 (Township)',
venue_kampala_township VARCHAR(255) NOT NULL COMMENT 'Venue (Kampala Township)',
-- System fields
start_time DATETIME NULL COMMENT 'Match start time',
end_time DATETIME NULL COMMENT 'Match end time',
result VARCHAR(255) NULL COMMENT 'Match result/outcome',
filename VARCHAR(1024) NOT NULL COMMENT 'Original fixture filename',
file_sha1sum VARCHAR(255) NOT NULL COMMENT 'SHA1 checksum of fixture file',
fixture_id VARCHAR(255) NOT NULL UNIQUE COMMENT 'Unique fixture identifier',
active_status BOOLEAN DEFAULT FALSE COMMENT 'Active status flag',
-- ZIP file related fields
zip_filename VARCHAR(1024) NULL COMMENT 'Associated ZIP filename',
zip_sha1sum VARCHAR(255) NULL COMMENT 'SHA1 checksum of ZIP file',
zip_upload_status ENUM('pending', 'uploading', 'completed', 'failed') DEFAULT 'pending',
zip_upload_progress DECIMAL(5,2) DEFAULT 0.00 COMMENT 'Upload progress percentage',
-- Metadata
created_by INT NULL COMMENT 'User who created this record',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
INDEX idx_match_number (match_number),
INDEX idx_fixture_id (fixture_id),
INDEX idx_active_status (active_status),
INDEX idx_file_sha1sum (file_sha1sum),
INDEX idx_zip_sha1sum (zip_sha1sum),
INDEX idx_zip_upload_status (zip_upload_status),
INDEX idx_created_by (created_by),
FOREIGN KEY (created_by) REFERENCES users(id) ON DELETE SET NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
-- Secondary outcomes table with foreign key relationships
CREATE TABLE IF NOT EXISTS match_outcomes (
id INT AUTO_INCREMENT PRIMARY KEY,
match_id INT NOT NULL COMMENT 'Foreign key to matches table',
column_name VARCHAR(255) NOT NULL COMMENT 'Result column name from fixture file',
float_value DECIMAL(10,2) NOT NULL COMMENT 'Float value with 2-decimal precision',
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
INDEX idx_match_id (match_id),
INDEX idx_column_name (column_name),
INDEX idx_float_value (float_value),
FOREIGN KEY (match_id) REFERENCES matches(id) ON DELETE CASCADE,
UNIQUE KEY unique_match_column (match_id, column_name)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
-- File uploads tracking table
CREATE TABLE IF NOT EXISTS file_uploads (
id INT AUTO_INCREMENT PRIMARY KEY,
filename VARCHAR(1024) NOT NULL,
original_filename VARCHAR(1024) NOT NULL,
file_path VARCHAR(2048) NOT NULL,
file_size BIGINT NOT NULL,
file_type ENUM('fixture', 'zip') NOT NULL,
mime_type VARCHAR(255) NOT NULL,
sha1sum VARCHAR(255) NOT NULL,
upload_status ENUM('uploading', 'completed', 'failed', 'processing') DEFAULT 'uploading',
upload_progress DECIMAL(5,2) DEFAULT 0.00,
error_message TEXT NULL,
-- Associated match (for ZIP files)
match_id INT NULL,
-- User tracking
uploaded_by INT NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
updated_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
INDEX idx_filename (filename),
INDEX idx_sha1sum (sha1sum),
INDEX idx_upload_status (upload_status),
INDEX idx_file_type (file_type),
INDEX idx_match_id (match_id),
INDEX idx_uploaded_by (uploaded_by),
FOREIGN KEY (match_id) REFERENCES matches(id) ON DELETE SET NULL,
FOREIGN KEY (uploaded_by) REFERENCES users(id) ON DELETE SET NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
-- System logs table for comprehensive logging
CREATE TABLE IF NOT EXISTS system_logs (
id INT AUTO_INCREMENT PRIMARY KEY,
level ENUM('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL') NOT NULL,
message TEXT NOT NULL,
module VARCHAR(255) NULL,
function_name VARCHAR(255) NULL,
line_number INT NULL,
-- Context information
user_id INT NULL,
match_id INT NULL,
upload_id INT NULL,
session_id VARCHAR(255) NULL,
ip_address VARCHAR(45) NULL,
user_agent TEXT NULL,
-- Additional metadata
extra_data JSON NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
INDEX idx_level (level),
INDEX idx_created_at (created_at),
INDEX idx_user_id (user_id),
INDEX idx_match_id (match_id),
INDEX idx_upload_id (upload_id),
INDEX idx_module (module),
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE SET NULL,
FOREIGN KEY (match_id) REFERENCES matches(id) ON DELETE SET NULL,
FOREIGN KEY (upload_id) REFERENCES file_uploads(id) ON DELETE SET NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
-- Session management table
CREATE TABLE IF NOT EXISTS user_sessions (
id INT AUTO_INCREMENT PRIMARY KEY,
session_id VARCHAR(255) NOT NULL UNIQUE,
user_id INT NOT NULL,
ip_address VARCHAR(45) NOT NULL,
user_agent TEXT NULL,
is_active BOOLEAN DEFAULT TRUE,
expires_at TIMESTAMP NOT NULL,
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
last_activity TIMESTAMP DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
INDEX idx_session_id (session_id),
INDEX idx_user_id (user_id),
INDEX idx_expires_at (expires_at),
INDEX idx_is_active (is_active),
FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
-- Create default admin user (password: admin123 - CHANGE IN PRODUCTION!)
INSERT INTO users (username, email, password_hash, is_admin)
VALUES (
'admin',
'admin@fixture-daemon.local',
'$2b$12$LQv3c1yqBWVHxkd0LHAkCOYz6TtxMQJqhN8/LewdBPj3bp.Gm.F5e', -- admin123
TRUE
) ON DUPLICATE KEY UPDATE username=username;
-- Create indexes for performance optimization
CREATE INDEX idx_matches_composite ON matches(active_status, zip_upload_status, created_at);
CREATE INDEX idx_outcomes_composite ON match_outcomes(match_id, column_name);
CREATE INDEX idx_uploads_composite ON file_uploads(upload_status, file_type, created_at);
CREATE INDEX idx_logs_composite ON system_logs(level, created_at, user_id);
-- Create views for common queries
CREATE OR REPLACE VIEW active_matches AS
SELECT
m.*,
COUNT(mo.id) as outcome_count,
GROUP_CONCAT(CONCAT(mo.column_name, ':', mo.float_value) SEPARATOR ';') as outcomes
FROM matches m
LEFT JOIN match_outcomes mo ON m.id = mo.match_id
WHERE m.active_status = TRUE
GROUP BY m.id;
CREATE OR REPLACE VIEW upload_summary AS
SELECT
DATE(created_at) as upload_date,
file_type,
upload_status,
COUNT(*) as count,
SUM(file_size) as total_size,
AVG(upload_progress) as avg_progress
FROM file_uploads
GROUP BY DATE(created_at), file_type, upload_status;
-- Set up proper permissions (adjust as needed for your environment)
-- GRANT SELECT, INSERT, UPDATE, DELETE ON fixture_manager.* TO 'fixture_user'@'localhost';
-- FLUSH PRIVILEGES;
-- Display schema creation completion
SELECT 'Database schema created successfully!' as status;
\ No newline at end of file
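A minimal sketch of querying the active_matches view defined above with PyMySQL (already in requirements.txt). The credentials here are placeholders; in the deployed system they come from config.env / database.conf.

import pymysql

def list_active_matches():
    """Return all active matches together with their aggregated outcomes."""
    conn = pymysql.connect(
        host="localhost",
        user="fixture_user",
        password="secure_password_here",   # placeholder credential
        database="fixture_manager",
        cursorclass=pymysql.cursors.DictCursor,
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT match_number, fighter1_township, fighter2_township, "
                "venue_kampala_township, outcome_count, outcomes "
                "FROM active_matches ORDER BY match_number"
            )
            return cur.fetchall()
    finally:
        conn.close()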
#!/bin/bash
# Fixture Manager Installation Script
# Comprehensive installation script for Linux servers
set -e # Exit on any error
# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color
# Configuration
PROJECT_NAME="fixture-manager"
SERVICE_NAME="fixture-daemon"
INSTALL_DIR="/opt/fixture-manager"
DATA_DIR="/var/lib/fixture-daemon"
LOG_DIR="/var/log"
CONFIG_DIR="/etc/fixture-manager"
USER="fixture"
GROUP="fixture"
# Functions
log_info() {
echo -e "${BLUE}[INFO]${NC} $1"
}
log_success() {
echo -e "${GREEN}[SUCCESS]${NC} $1"
}
log_warning() {
echo -e "${YELLOW}[WARNING]${NC} $1"
}
log_error() {
echo -e "${RED}[ERROR]${NC} $1"
}
check_root() {
if [[ $EUID -ne 0 ]]; then
log_error "This script must be run as root"
exit 1
fi
}
detect_os() {
if [[ -f /etc/os-release ]]; then
. /etc/os-release
OS=$NAME
VER=$VERSION_ID
else
log_error "Cannot detect operating system"
exit 1
fi
log_info "Detected OS: $OS $VER"
}
install_dependencies() {
log_info "Installing system dependencies..."
if [[ "$OS" == *"Ubuntu"* ]] || [[ "$OS" == *"Debian"* ]]; then
apt-get update
apt-get install -y \
python3 \
python3-pip \
python3-venv \
python3-dev \
mysql-server \
mysql-client \
libmysqlclient-dev \
nginx \
supervisor \
git \
curl \
wget \
unzip \
build-essential \
pkg-config
elif [[ "$OS" == *"CentOS"* ]] || [[ "$OS" == *"Red Hat"* ]] || [[ "$OS" == *"Rocky"* ]]; then
yum update -y
yum install -y \
python3 \
python3-pip \
python3-devel \
mysql-server \
mysql-devel \
nginx \
supervisor \
git \
curl \
wget \
unzip \
gcc \
gcc-c++ \
make \
pkgconfig
else
log_error "Unsupported operating system: $OS"
exit 1
fi
log_success "System dependencies installed"
}
create_user() {
log_info "Creating system user and group..."
# Create group if it doesn't exist
if ! getent group $GROUP > /dev/null 2>&1; then
groupadd --system $GROUP
log_success "Created group: $GROUP"
fi
# Create user if it doesn't exist
if ! getent passwd $USER > /dev/null 2>&1; then
useradd --system --gid $GROUP --home-dir $DATA_DIR --shell /bin/false $USER
log_success "Created user: $USER"
fi
}
create_directories() {
log_info "Creating directories..."
# Create main directories
mkdir -p $INSTALL_DIR
mkdir -p $DATA_DIR/{uploads,backups,logs}
mkdir -p $CONFIG_DIR
mkdir -p $LOG_DIR
# Set ownership and permissions
chown -R $USER:$GROUP $INSTALL_DIR
chown -R $USER:$GROUP $DATA_DIR
chown -R $USER:$GROUP $CONFIG_DIR
chmod 755 $INSTALL_DIR
chmod 750 $DATA_DIR
chmod 750 $CONFIG_DIR
chmod 755 $DATA_DIR/uploads
log_success "Directories created and configured"
}
install_application() {
log_info "Installing application files..."
# Copy application files
cp -r . $INSTALL_DIR/
# Create Python virtual environment
cd $INSTALL_DIR
python3 -m venv venv
source venv/bin/activate
# Upgrade pip
pip install --upgrade pip
# Install Python dependencies
pip install -r requirements.txt
# Set ownership
chown -R $USER:$GROUP $INSTALL_DIR
# Make daemon script executable
chmod +x $INSTALL_DIR/daemon.py
log_success "Application installed"
}
configure_database() {
log_info "Configuring MySQL database..."
# Start MySQL service
systemctl start mysql || systemctl start mysqld
systemctl enable mysql || systemctl enable mysqld
# Generate random password
DB_PASSWORD=$(openssl rand -base64 32)
# Create database and user
mysql -u root <<EOF
CREATE DATABASE IF NOT EXISTS fixture_manager CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
CREATE USER IF NOT EXISTS 'fixture_user'@'localhost' IDENTIFIED BY '$DB_PASSWORD';
GRANT ALL PRIVILEGES ON fixture_manager.* TO 'fixture_user'@'localhost';
FLUSH PRIVILEGES;
EOF
# Execute schema
mysql -u fixture_user -p$DB_PASSWORD fixture_manager < $INSTALL_DIR/database/schema.sql
# Save database credentials
cat > $CONFIG_DIR/database.conf <<EOF
MYSQL_HOST=localhost
MYSQL_PORT=3306
MYSQL_USER=fixture_user
MYSQL_PASSWORD=$DB_PASSWORD
MYSQL_DATABASE=fixture_manager
EOF
chmod 600 $CONFIG_DIR/database.conf
chown $USER:$GROUP $CONFIG_DIR/database.conf
log_success "Database configured"
}
create_config() {
log_info "Creating configuration files..."
# Generate secret keys
SECRET_KEY=$(openssl rand -base64 32)
JWT_SECRET_KEY=$(openssl rand -base64 32)
# Create main configuration
cat > $CONFIG_DIR/config.env <<EOF
# Database Configuration
MYSQL_HOST=localhost
MYSQL_PORT=3306
MYSQL_USER=fixture_user
MYSQL_PASSWORD=$(grep MYSQL_PASSWORD $CONFIG_DIR/database.conf | cut -d'=' -f2)
MYSQL_DATABASE=fixture_manager
# Security Configuration
SECRET_KEY=$SECRET_KEY
JWT_SECRET_KEY=$JWT_SECRET_KEY
BCRYPT_LOG_ROUNDS=12
# File Upload Configuration
UPLOAD_FOLDER=$DATA_DIR/uploads
MAX_CONTENT_LENGTH=524288000
CHUNK_SIZE=8192
MAX_CONCURRENT_UPLOADS=5
# Daemon Configuration
DAEMON_PID_FILE=/var/run/fixture-daemon.pid
DAEMON_LOG_FILE=$LOG_DIR/fixture-daemon.log
DAEMON_WORKING_DIR=$DATA_DIR
# Web Server Configuration
HOST=0.0.0.0
PORT=5000
DEBUG=false
# Logging Configuration
LOG_LEVEL=INFO
# JWT Configuration
JWT_ACCESS_TOKEN_EXPIRES=3600
EOF
chmod 600 $CONFIG_DIR/config.env
chown $USER:$GROUP $CONFIG_DIR/config.env
# Create symlink for application
ln -sf $CONFIG_DIR/config.env $INSTALL_DIR/.env
log_success "Configuration files created"
}
create_systemd_service() {
log_info "Creating systemd service..."
cat > /etc/systemd/system/$SERVICE_NAME.service <<EOF
[Unit]
Description=Fixture Manager Daemon
After=network.target mysql.service
Requires=mysql.service
[Service]
Type=forking
User=$USER
Group=$GROUP
WorkingDirectory=$INSTALL_DIR
Environment=PATH=$INSTALL_DIR/venv/bin:/usr/local/bin:/usr/bin:/bin
ExecStart=$INSTALL_DIR/venv/bin/python $INSTALL_DIR/daemon.py start --config production
ExecStop=$INSTALL_DIR/venv/bin/python $INSTALL_DIR/daemon.py stop --config production
ExecReload=$INSTALL_DIR/venv/bin/python $INSTALL_DIR/daemon.py reload --config production
PIDFile=/var/run/fixture-daemon.pid
Restart=always
RestartSec=10
# Security settings
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=strict
ProtectHome=true
ReadWritePaths=$DATA_DIR $LOG_DIR /var/run
[Install]
WantedBy=multi-user.target
EOF
# Reload systemd and enable service
systemctl daemon-reload
systemctl enable $SERVICE_NAME
log_success "Systemd service created"
}
configure_nginx() {
log_info "Configuring Nginx reverse proxy..."
cat > /etc/nginx/sites-available/$PROJECT_NAME <<EOF
server {
listen 80;
server_name _;
# Security headers
add_header X-Frame-Options DENY;
add_header X-Content-Type-Options nosniff;
add_header X-XSS-Protection "1; mode=block";
add_header Strict-Transport-Security "max-age=31536000; includeSubDomains";
# File upload size limit
client_max_body_size 500M;
location / {
proxy_pass http://127.0.0.1:5000;
proxy_set_header Host \$host;
proxy_set_header X-Real-IP \$remote_addr;
proxy_set_header X-Forwarded-For \$proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto \$scheme;
# Timeout settings for large file uploads
proxy_connect_timeout 300s;
proxy_send_timeout 300s;
proxy_read_timeout 300s;
}
# Static files (if any)
location /static {
alias $INSTALL_DIR/app/static;
expires 1y;
add_header Cache-Control "public, immutable";
}
# Health check endpoint
location /health {
proxy_pass http://127.0.0.1:5000/health;
access_log off;
}
}
EOF
# Enable site
ln -sf /etc/nginx/sites-available/$PROJECT_NAME /etc/nginx/sites-enabled/
# Remove default site
rm -f /etc/nginx/sites-enabled/default
# Test and reload nginx
nginx -t
systemctl enable nginx
systemctl restart nginx
log_success "Nginx configured"
}
setup_logrotate() {
log_info "Setting up log rotation..."
cat > /etc/logrotate.d/$SERVICE_NAME <<EOF
$LOG_DIR/fixture-daemon.log {
daily
missingok
rotate 30
compress
delaycompress
notifempty
create 644 $USER $GROUP
postrotate
systemctl reload $SERVICE_NAME > /dev/null 2>&1 || true
endscript
}
EOF
log_success "Log rotation configured"
}
setup_firewall() {
log_info "Configuring firewall..."
if command -v ufw &> /dev/null; then
# Ubuntu/Debian UFW
ufw allow 22/tcp
ufw allow 80/tcp
ufw allow 443/tcp
ufw --force enable
log_success "UFW firewall configured"
elif command -v firewall-cmd &> /dev/null; then
# CentOS/RHEL firewalld
firewall-cmd --permanent --add-service=ssh
firewall-cmd --permanent --add-service=http
firewall-cmd --permanent --add-service=https
firewall-cmd --reload
log_success "Firewalld configured"
else
log_warning "No firewall detected. Please configure manually."
fi
}
create_backup_script() {
log_info "Creating backup script..."
cat > $INSTALL_DIR/backup.sh <<'EOF'
#!/bin/bash
# Fixture Manager Backup Script
BACKUP_DIR="/var/lib/fixture-daemon/backups"
DATE=$(date +%Y%m%d_%H%M%S)
BACKUP_FILE="fixture_manager_backup_$DATE.tar.gz"
# Load database configuration
source /etc/fixture-manager/database.conf
# Create backup directory
mkdir -p $BACKUP_DIR
# Backup database
mysqldump -u $MYSQL_USER -p$MYSQL_PASSWORD $MYSQL_DATABASE > $BACKUP_DIR/database_$DATE.sql
# Backup uploads
tar -czf $BACKUP_DIR/$BACKUP_FILE \
--exclude='*.log' \
--exclude='backups' \
/var/lib/fixture-daemon/uploads \
/etc/fixture-manager \
$BACKUP_DIR/database_$DATE.sql
# Remove temporary database dump
rm $BACKUP_DIR/database_$DATE.sql
# Keep only last 7 backups
find $BACKUP_DIR -name "fixture_manager_backup_*.tar.gz" -mtime +7 -delete
echo "Backup completed: $BACKUP_DIR/$BACKUP_FILE"
EOF
chmod +x $INSTALL_DIR/backup.sh
chown $USER:$GROUP $INSTALL_DIR/backup.sh
# Add to crontab for daily backups
(crontab -u $USER -l 2>/dev/null; echo "0 2 * * * $INSTALL_DIR/backup.sh") | crontab -u $USER -
log_success "Backup script created and scheduled"
}
start_services() {
log_info "Starting services..."
# Start and enable services
systemctl start $SERVICE_NAME
systemctl status $SERVICE_NAME --no-pager
log_success "Services started"
}
print_summary() {
log_success "Installation completed successfully!"
echo
echo "=== Installation Summary ==="
echo "Application Directory: $INSTALL_DIR"
echo "Data Directory: $DATA_DIR"
echo "Configuration Directory: $CONFIG_DIR"
echo "Log File: $LOG_DIR/fixture-daemon.log"
echo "Service Name: $SERVICE_NAME"
echo "User/Group: $USER:$GROUP"
echo
echo "=== Service Management ==="
echo "Start service: systemctl start $SERVICE_NAME"
echo "Stop service: systemctl stop $SERVICE_NAME"
echo "Restart service: systemctl restart $SERVICE_NAME"
echo "View logs: journalctl -u $SERVICE_NAME -f"
echo "View app logs: tail -f $LOG_DIR/fixture-daemon.log"
echo
echo "=== Web Interface ==="
echo "URL: http://$(hostname -I | awk '{print $1}')"
echo "Default admin credentials:"
echo " Username: admin"
echo " Password: admin123"
echo
log_warning "IMPORTANT: Change the default admin password immediately!"
echo
echo "=== Configuration Files ==="
echo "Main config: $CONFIG_DIR/config.env"
echo "Database config: $CONFIG_DIR/database.conf"
echo "Nginx config: /etc/nginx/sites-available/$PROJECT_NAME"
echo
echo "=== Backup ==="
echo "Backup script: $INSTALL_DIR/backup.sh"
echo "Backup directory: $DATA_DIR/backups"
echo "Automatic daily backups at 2:00 AM"
}
# Main installation process
main() {
log_info "Starting Fixture Manager installation..."
check_root
detect_os
install_dependencies
create_user
create_directories
install_application
configure_database
create_config
create_systemd_service
configure_nginx
setup_logrotate
setup_firewall
create_backup_script
start_services
print_summary
}
# Run installation
main "$@"
\ No newline at end of file
Create a new Python project composed of 3 main interacting programs, and start by writing the first one: a daemon, to be run on a Linux server exposed to the internet, implementing a web dashboard and a REST API with authentication, connected to a MySQL database.
The dashboard offers a CSV or XLSX file upload and parsing; the upload form will call this file a "Fixture". The fixture data contains the following columns: "Match #", "Fighter1 (Township)", "Fighter2 (Township)", "Venue (Kampala Township)", plus a dynamic list of other optional columns that from now on I will refer to as "outcome results".
The parsing will populate a MySQL database (connection data is fetched from a configuration file) with a table where "Match #" is an integer and all the other mandatory data is a varchar(255). All the optional columns will be stored in a separate table with an id reference to the first table's row id (autoincrement), a column "result" as varchar(255) containing the uploaded file's optional column name, and a column "value" containing a float number with 2-digit precision.
The first table will also contain the following columns that are NULL by default: "start time" as datetime, "end time" as datetime, "result" as varchar(255), filename as varchar(1024), file sha1sum as varchar(255), and a fixture id unique for every fixture upload.
When a fixture file is uploaded, the dashboard should ask to upload a ZIP file (large file upload here with progress bar) for every row inserted in the database from the fixture file. When the upload is done, calculate the sha1sum of the ZIP file and update the database with the file name and sha1sum. Only when the ZIP file is uploaded and the database updated with the sha1sum is the record in the database "active"; you can use a flag in an additional column for it. Also create the SQL script to create the tables of the database.
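A minimal sketch of the column split described above, assuming pandas/openpyxl from requirements.txt: the four mandatory columns map to the matches table and every remaining column becomes an "outcome result" row. The real parser in the application adds validation, SHA1 handling, and database persistence on top of this.

import pandas as pd

MANDATORY = ["Match #", "Fighter1 (Township)", "Fighter2 (Township)", "Venue (Kampala Township)"]

def split_fixture(path):
    """Split a fixture file into matches rows and outcome-result rows."""
    df = pd.read_csv(path) if path.lower().endswith(".csv") else pd.read_excel(path)
    matches, outcomes = [], []
    for _, row in df.iterrows():
        matches.append({
            "match_number": int(row["Match #"]),
            "fighter1_township": str(row["Fighter1 (Township)"]),
            "fighter2_township": str(row["Fighter2 (Township)"]),
            "venue_kampala_township": str(row["Venue (Kampala Township)"]),
        })
        for col in df.columns:
            if col not in MANDATORY and pd.notna(row[col]):
                outcomes.append({
                    "match_number": int(row["Match #"]),
                    "column_name": col,
                    "float_value": round(float(row[col]), 2),   # 2-decimal precision
                })
    return matches, outcomes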
Flask==2.3.3
Flask-SQLAlchemy==3.0.5
Flask-Login==0.6.3
Flask-WTF==1.1.1
Flask-JWT-Extended==4.5.3
PyMySQL==1.1.0
cryptography==41.0.4
pandas==2.1.1
openpyxl==3.1.2
xlrd==2.0.1
python-daemon==3.0.1
lockfile==0.12.2
bcrypt==4.0.1
Werkzeug==2.3.7
WTForms==3.0.1
python-dotenv==1.0.0
gunicorn==21.2.0
psutil==5.9.5
watchdog==3.0.0
click==8.1.7
colorlog==6.7.0
marshmallow==3.20.1
\ No newline at end of file