Add /api/updates endpoint with fixture tracking and hybrid authentication (v1.2.0)

## New Features
- **New /api/updates endpoint**: Incremental fixture synchronization with optional timestamp filtering
  - Supports both GET (query params) and POST (JSON body) methods
  - Optional 'from' parameter for unix timestamp-based filtering
  - Returns fixtures updated after specified timestamp or last N fixtures
  - Includes ZIP download URLs in API response
  - Configurable default count via system settings

- **Fixture active time tracking**: Automatic timestamp management
  - Added fixture_active_time column (BigInteger, indexed) to matches table
  - Automatic timestamp setting when all matches in fixture become active
  - Preserves original activation time (no overwrites)

- **Hybrid authentication system**: Maximum compatibility
  - Supports JWT tokens (short-lived, from login) AND API tokens (long-lived, from web interface)
  - Tries JWT first, falls back to API tokens automatically
  - Multiple token header formats: Authorization Bearer, X-API-Token, query parameter

- **SHA1-based ZIP file naming**: Consistent file naming
  - Uses SHA1 hash of (unix_timestamp + original_filename)
  - Applied across all upload methods for consistency (naming scheme sketched below)
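
The scheme is small enough to reproduce outside the app; a minimal sketch (the `sha1_zip_name` helper name is illustrative, not part of the codebase, but the hashing steps mirror the upload handler in this change):

```python
import hashlib
import os
import time

def sha1_zip_name(original_filename):
    """Illustrative helper: stored name is SHA1(unix_timestamp + original
    filename), keeping the original extension (defaulting to .zip)."""
    hash_input = str(int(time.time())) + original_filename
    sha1_hash = hashlib.sha1(hash_input.encode('utf-8')).hexdigest()
    _, ext = os.path.splitext(original_filename)
    return sha1_hash + (ext or '.zip')

print(sha1_zip_name('fixture_week3.zip'))  # e.g. '1f0b...9d2e.zip'
```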

## Technical Implementation
- **Database Migration_005**: Added fixture_active_time column with proper indexing
- **System settings integration**: New api_updates_default_count setting (default: 10)
- **Backfill utility**: backfill_fixture_times.py for existing fixture data migration
- **Fallback mechanisms**: Graceful degradation for existing data without timestamps
- **Query optimization**: Proper limits and ordering for both default and filtered queries (selection logic sketched below)
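
The selection logic in brief (a sketch mirroring the endpoint code in this change; both branches respect the `api_updates_default_count` limit):

```python
from app.models import Match, SystemSettings

default_count = SystemSettings.get_setting('api_updates_default_count', 10)
from_timestamp = None  # or an int unix timestamp taken from the request

if from_timestamp is not None:
    # Filtered mode: fixtures activated after the given timestamp, oldest first
    fixtures = Match.get_fixtures_with_active_time(from_timestamp, limit=default_count)
else:
    # Default mode: the N most recently activated fixtures, newest first
    fixtures = Match.get_last_activated_fixtures(default_count)
```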

## Bug Fixes
- Fixed logging import errors in models.py (get_logger function)
- Fixed Python 2.7 compatibility issues in migration scripts
- Fixed type conversion issues in SystemSettings.get_setting()

## Documentation
- Created comprehensive CHANGELOG.md
- Updated README.md with new endpoint documentation and examples
- Added API usage examples for both GET and POST methods
- Updated version to 1.2.0 with feature highlights

## Files Modified
- app/api/routes.py: New /api/updates endpoint with hybrid auth
- app/models.py: Added fixture tracking methods and backfill utility
- app/database/migrations.py: Migration_005 for schema changes
- backfill_fixture_times.py: Data migration utility for existing fixtures
- README.md: Updated documentation and examples
- CHANGELOG.md: Comprehensive change log following keep-a-changelog format
# Changelog
All notable changes to the Fixture Manager daemon project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.2.0] - 2025-08-21
### Added
- **New `/api/updates` endpoint** for incremental fixture data synchronization
  - Supports both GET and POST HTTP methods
  - Optional `from` parameter for timestamp-based filtering
  - Returns fixtures updated after specified unix timestamp
  - Configurable default count when no `from` parameter provided
  - Includes ZIP download URLs in API response
- **Fixture active time tracking** with automatic timestamp management
  - New `fixture_active_time` column in matches table (BigInteger, indexed)
  - Automatic timestamp setting when all matches in fixture become active
  - Preserves original activation time (no overwrites)
- **Hybrid authentication system** for maximum compatibility
  - Supports JWT tokens (short-lived, from login)
  - Supports API tokens (long-lived, from web interface)
  - Tries JWT first, falls back to API tokens automatically
  - Multiple token header formats supported
- **System settings integration** for configurable behavior
  - New `api_updates_default_count` setting (default: 10)
  - Configurable via web interface
  - Proper type conversion and fallback handling
- **SHA1-based ZIP file naming** for consistency
  - Uses SHA1 hash of (unix_timestamp + original_filename)
  - Applied across all upload methods (regular and streaming)
  - Maintains compatibility with existing download infrastructure
- **Data backfill utility** for existing fixtures
  - `backfill_fixture_times.py` script for one-time data migration
  - Automatically sets fixture_active_time for existing active fixtures
  - Comprehensive logging and progress reporting
- **Fallback mechanisms** for robustness
  - Automatic fallback to active_status queries when fixture_active_time is null
  - Graceful degradation for existing data without timestamps
  - Error handling and logging for all edge cases

### Enhanced
- **Database migration system** with new Migration_005
  - Added fixture_active_time column with proper indexing
  - Python 2.7 compatibility fixes (no f-strings)
  - Custom migration runner script for easier execution
- **API authentication flexibility**
  - Authorization header (`Bearer token`)
  - X-API-Token header
  - Query parameter support (less secure, but available)
- **Query optimization and limits**
  - Both default and filtered queries respect configurable limits
  - Proper ordering (ascending for `from` queries, descending for default)
  - Efficient database queries with appropriate indexes
- **Logging improvements**
  - Fixed import errors in models.py
  - Consistent logging across all new features
  - Proper error handling and debug information

### Fixed
- **Import errors** in models.py (get_logger function)
- **Python 2.7 compatibility** in migration scripts
- **Type conversion issues** in SystemSettings.get_setting()
- **Query performance** with proper indexing and limits

### Technical Details
- **Database Schema**: Added fixture_active_time column (BigInteger, nullable, indexed)
- **API Endpoints**: New `/api/updates` with comprehensive functionality
- **Authentication**: Dual-mode JWT/API token support with automatic fallback
- **File Operations**: SHA1-based naming for all ZIP uploads
- **Configuration**: New system setting for API behavior customization
- **Migration**: Version 005 for schema updates with proper rollback support

### Security
- All authentication methods maintain existing security standards
- API token validation and JWT verification unchanged
- Proper input validation for all new parameters
- Rate limiting and access controls remain in effect
---
## [1.1.0] - 2025-08-18
### Added
- **API Token Management**: Complete user-generated token system
- **Enhanced Security**: SHA256 token hashing with usage tracking
- **Web Interface**: Professional token management UI
- **Multiple Auth Methods**: Bearer tokens, headers, and query parameters
- **Token Lifecycle**: Create, revoke, extend, and delete operations
- **Usage Monitoring**: Last used timestamps and IP tracking
- **Database Migration**: Automatic schema updates with versioning
- **REST API Endpoints**: Protected fixture and match data access
- **Documentation**: Comprehensive API and security guidelines
---
## [1.0.0] - 2025-08-01
### Added
- Initial release of Fixture Manager daemon
- Secure web dashboard with authentication
- RESTful API with JWT authentication
- MySQL database integration
- Advanced file upload system with progress tracking
- Dual-format support (CSV/XLSX)
- Two-stage upload workflow
- Daemon process management with systemd integration
- Security features and validation
- Comprehensive documentation
@@ -246,6 +246,39 @@ curl -X GET "http://your-server/api/match/123" \
  -H "Authorization: Bearer YOUR_API_TOKEN"
```
#### Get Fixture Updates (New!)
The `/api/updates` endpoint provides incremental synchronization for fixture data:
```bash
# Get last N fixtures (default behavior, N configured in system settings)
curl -X GET "http://your-server/api/updates" \
  -H "Authorization: Bearer YOUR_API_TOKEN"

# Get fixtures updated after specific unix timestamp
curl -X GET "http://your-server/api/updates?from=1704067200" \
  -H "Authorization: Bearer YOUR_API_TOKEN"

# POST method also supported with JSON body
curl -X POST "http://your-server/api/updates" \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"from": 1704067200}'

# Get recent fixtures without timestamp filter
curl -X POST "http://your-server/api/updates" \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{}'
```
**Features:**
- **Incremental Updates**: Use `from` parameter for efficient data synchronization (see the polling sketch below)
- **Flexible Methods**: Supports both GET (query params) and POST (JSON body)
- **Configurable Limits**: Respects system setting for maximum fixtures returned
- **ZIP Downloads**: Includes direct download URLs for completed match files
- **Hybrid Authentication**: Works with both JWT and API tokens automatically
- **Smart Fallback**: Gracefully handles existing data without active timestamps
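
A minimal polling client, as a sketch (assumes the third-party `requests` package; the server URL and token are placeholders, and the loop logic is illustrative rather than part of this project):

```python
import requests

BASE_URL = "http://your-server"      # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}
# The X-API-Token header or a query parameter would work as well.

def poll_updates(last_seen=None):
    """Fetch fixtures activated after last_seen; return (fixtures, new last_seen)."""
    params = {"from": last_seen} if last_seen is not None else {}
    resp = requests.get(BASE_URL + "/api/updates", headers=HEADERS,
                        params=params, timeout=30)
    resp.raise_for_status()
    payload = resp.json()  # {'fixtures': [...], 'count': N, 'from_timestamp': ...}
    for fixture in payload["fixtures"]:
        # Each fixture carries fixture_id, filename, fixture_active_time,
        # matches_count, and matches with zip_download_url entries.
        if fixture.get("fixture_active_time"):
            last_seen = max(last_seen or 0, fixture["fixture_active_time"])
    return payload["fixtures"], last_seen

fixtures, last_seen = poll_updates()           # first call: last N fixtures
fixtures, last_seen = poll_updates(last_seen)  # later calls: only new activations
```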
## File Format Requirements

### Fixture Files (CSV/XLSX)
@@ -458,6 +491,7 @@ curl -X DELETE http://your-server/profile/tokens/123/delete \
- `GET /api/fixtures` - List all fixtures with match counts
- `GET /api/matches` - List matches with pagination and filtering
- `GET /api/match/{id}` - Get match details with outcomes
- `GET|POST /api/updates` - **New!** Get fixture updates with incremental sync support

### Upload Endpoints
- `POST /upload/api/fixture` - Upload fixture file
@@ -643,11 +677,22 @@ curl -H "Authorization: Bearer $API_TOKEN" \
---
**Version**: 1.2.0
**Last Updated**: 2025-08-21
**Minimum Requirements**: Python 3.8+, MySQL 5.7+, Linux Kernel 3.10+

### Recent Updates (v1.2.0)
- **New `/api/updates` Endpoint**: Incremental fixture synchronization with timestamp-based filtering
- **Hybrid Authentication**: JWT and API token support with automatic fallback
- **Fixture Active Time Tracking**: Automatic timestamp management for fixture activation
- **SHA1-based ZIP Naming**: Consistent file naming across all upload methods
- **Configurable API Limits**: System setting for controlling API response sizes
- **Data Backfill Utility**: Migration tool for existing fixture data
- **Enhanced Database Schema**: New indexed columns and optimized queries
- **Flexible HTTP Methods**: Both GET and POST support for API endpoints
- **Fallback Mechanisms**: Graceful degradation for legacy data compatibility

### Previous Updates (v1.1.0)
- **API Token Management**: Complete user-generated token system
- **Enhanced Security**: SHA256 token hashing with usage tracking
- **Web Interface**: Professional token management UI
@@ -64,14 +64,14 @@ def create_app(config_name=None):
        migration_result = run_migrations()
        if migration_result['status'] == 'success':
            app.logger.info("Database migrations completed: {}".format(migration_result['message']))
        elif migration_result['status'] == 'partial':
            app.logger.warning("Database migrations partially completed: {}".format(migration_result['message']))
        elif migration_result['status'] == 'error':
            app.logger.error("Database migrations failed: {}".format(migration_result['message']))
    except Exception as e:
        app.logger.warning("Database initialization failed: {}".format(str(e)))
        app.logger.warning("Database may not exist or be accessible. Please check database configuration.")

    return app
@@ -549,3 +549,164 @@ def api_method_not_allowed(error):

@bp.errorhandler(500)
def api_internal_error(error):
    return jsonify({'error': 'Internal server error'}), 500

@bp.route('/updates', methods=['GET', 'POST'])
def api_get_updates():
    """Get fixtures with updates since a specific timestamp - supports both JWT and API tokens"""
    try:
        from app.models import Match, User
        from flask_jwt_extended import jwt_required, get_jwt_identity, verify_jwt_in_request
        from app.auth.jwt_utils import validate_api_token, extract_token_from_request

        user = None
        auth_method = None

        # Try JWT authentication first (short-lived session tokens)
        try:
            verify_jwt_in_request()
            user_id = get_jwt_identity()
            user = User.query.get(user_id)
            auth_method = "JWT"
            logger.info(f"API access via JWT token by user {user.username if user else 'unknown'}")
        except Exception as jwt_error:
            logger.debug(f"JWT authentication failed: {str(jwt_error)}")

            # If JWT fails, try API token authentication (long-lived tokens)
            try:
                token = extract_token_from_request()
                if not token:
                    return jsonify({
                        'error': 'Authentication required',
                        'message': 'Either JWT token or API token required'
                    }), 401

                user, api_token = validate_api_token(token)
                auth_method = f"API Token ({api_token.name if api_token else 'unknown'})"
                logger.info(f"API access via API token by user {user.username}")
            except Exception as api_error:
                logger.debug(f"API token authentication failed: {str(api_error)}")
                return jsonify({
                    'error': 'Authentication failed',
                    'message': 'Invalid JWT token or API token'
                }), 401

        if not user or not user.is_active:
            return jsonify({'error': 'User not found or inactive'}), 404

        logger.info(f"API updates accessed via {auth_method} by user {user.username} (ID: {user.id})")

        # Get the optional 'from' parameter (unix timestamp).
        # Support both GET (query params) and POST (JSON body)
        if request.method == 'GET':
            from_timestamp = request.args.get('from')
        else:  # POST
            data = request.get_json() or {}
            from_timestamp = data.get('from')

        # Get the default fixtures count for both cases
        from app.models import SystemSettings
        default_fixtures_count = SystemSettings.get_setting('api_updates_default_count', 10)

        if from_timestamp is not None:
            # If 'from' is provided, validate it and get fixtures after that timestamp
            try:
                from_timestamp = int(from_timestamp)
            except (ValueError, TypeError):
                return jsonify({'error': 'Parameter "from" must be a valid unix timestamp'}), 400

            # Get fixtures with active time after the given timestamp (limited to max count)
            fixtures_data = Match.get_fixtures_with_active_time(from_timestamp, limit=default_fixtures_count)
        else:
            # If 'from' is not provided, get the last N activated fixtures
            fixtures_data = Match.get_last_activated_fixtures(default_fixtures_count)

        # Fallback: if no fixtures have fixture_active_time set, query by active_status
        if not fixtures_data:
            logger.warning("No fixtures with fixture_active_time found, falling back to active_status")
            from sqlalchemy import func
            fallback_query = db.session.query(
                Match.fixture_id,
                Match.fixture_active_time,
                Match.filename,
                func.min(Match.created_at).label('created_at')
            ).filter(
                Match.active_status == True
            ).group_by(
                Match.fixture_id,
                Match.fixture_active_time,
                Match.filename
            )

            if from_timestamp is not None:
                # For 'from' queries, order by created_at ascending
                fallback_query = fallback_query.order_by(func.min(Match.created_at).asc())
            else:
                # For default queries, get the most recent fixtures first
                fallback_query = fallback_query.order_by(func.min(Match.created_at).desc())

            fixtures_data = fallback_query.limit(default_fixtures_count).all()

        if not fixtures_data:
            return jsonify({
                'fixtures': [],
                'count': 0,
                'from_timestamp': from_timestamp
            }), 200

        result_fixtures = []
        for fixture_row in fixtures_data:
            fixture_id = fixture_row.fixture_id
            fixture_active_time = fixture_row.fixture_active_time
            filename = fixture_row.filename
            created_at = fixture_row.created_at

            # Get all matches for this fixture
            matches = Match.query.filter_by(fixture_id=fixture_id).all()

            # Prepare match data with zip download links
            matches_data = []
            for match in matches:
                match_dict = match.to_dict(include_outcomes=True)

                # Add zip file download link if available
                if match.zip_filename and match.zip_upload_status == 'completed':
                    from flask import url_for, current_app
                    import os

                    # Construct download link; zip files are stored in the uploads directory
                    zip_path = os.path.join(current_app.config.get('UPLOAD_FOLDER', 'uploads'), match.zip_filename)
                    if os.path.exists(zip_path):
                        match_dict['zip_download_url'] = url_for('main.download_zip',
                                                                 match_id=match.id,
                                                                 _external=True)
                    else:
                        match_dict['zip_download_url'] = None
                else:
                    match_dict['zip_download_url'] = None

                matches_data.append(match_dict)

            # Create fixture summary
            fixture_summary = {
                'fixture_id': fixture_id,
                'filename': filename,
                'fixture_active_time': fixture_active_time,
                'created_at': created_at.isoformat() if created_at else None,
                'matches_count': len(matches_data),
                'matches': matches_data
            }
            result_fixtures.append(fixture_summary)

        return jsonify({
            'fixtures': result_fixtures,
            'count': len(result_fixtures),
            'from_timestamp': from_timestamp
        }), 200
    except Exception as e:
        logger.error(f"API updates error: {str(e)}")
        return jsonify({'error': 'Failed to retrieve updates'}), 500
@@ -258,6 +258,65 @@ class Migration_004_CreateSystemSettingsTable(Migration):
    def can_rollback(self) -> bool:
        return True

class Migration_005_AddFixtureActiveTime(Migration):
    """Add fixture active time column to matches table"""

    def __init__(self):
        super().__init__("005", "Add fixture active time unix timestamp column to matches table")

    def up(self):
        """Add fixture_active_time column"""
        try:
            # Check if column already exists
            inspector = inspect(db.engine)
            columns = [col['name'] for col in inspector.get_columns('matches')]
            if 'fixture_active_time' in columns:
                logger.info("fixture_active_time column already exists, skipping creation")
                return True

            # Add the column
            alter_table_sql = '''
                ALTER TABLE matches
                ADD COLUMN fixture_active_time BIGINT NULL COMMENT 'Unix timestamp when all matches in fixture became active'
            '''
            with db.engine.connect() as conn:
                conn.execute(text(alter_table_sql))
                conn.commit()

            # Create index for performance
            create_index_sql = '''
                CREATE INDEX idx_matches_fixture_active_time ON matches(fixture_active_time)
            '''
            with db.engine.connect() as conn:
                conn.execute(text(create_index_sql))
                conn.commit()

            logger.info("Added fixture_active_time column and index successfully")
            return True
        except Exception as e:
            logger.error(f"Migration 005 failed: {str(e)}")
            raise

    def down(self):
        """Drop fixture_active_time column"""
        try:
            # MySQL has no IF EXISTS for DROP INDEX / DROP COLUMN, so check first
            inspector = inspect(db.engine)
            columns = [col['name'] for col in inspector.get_columns('matches')]
            indexes = [idx['name'] for idx in inspector.get_indexes('matches')]
            with db.engine.connect() as conn:
                if 'idx_matches_fixture_active_time' in indexes:
                    conn.execute(text("DROP INDEX idx_matches_fixture_active_time ON matches"))
                if 'fixture_active_time' in columns:
                    conn.execute(text("ALTER TABLE matches DROP COLUMN fixture_active_time"))
                conn.commit()
            logger.info("Dropped fixture_active_time column and index")
            return True
        except Exception as e:
            logger.error(f"Rollback of migration 005 failed: {str(e)}")
            raise

    def can_rollback(self) -> bool:
        return True
class MigrationManager:
    """Manages database migrations and versioning"""
@@ -267,6 +326,7 @@ class MigrationManager:
            Migration_001_RemoveFixtureIdUnique(),
            Migration_003_CreateAPITokensTable(),
            Migration_004_CreateSystemSettingsTable(),
            Migration_005_AddFixtureActiveTime(),
        ]

    def ensure_version_table(self):
@@ -251,9 +251,23 @@ def delete_fixture(fixture_id):
    fixture_filename = matches[0].filename
    match_count = len(matches)

    # Delete all related data in correct order to handle foreign key constraints
    match_ids = [m.id for m in matches]

    # First, get file upload IDs that will be deleted
    file_uploads = FileUpload.query.filter(FileUpload.match_id.in_(match_ids)).all()
    upload_ids = [upload.id for upload in file_uploads]

    # Delete system logs that reference uploads (to avoid foreign key constraint)
    if upload_ids:
        from app.models import SystemLog
        SystemLog.query.filter(SystemLog.upload_id.in_(upload_ids)).delete(synchronize_session=False)

    # Delete system logs that reference matches
    if match_ids:
        from app.models import SystemLog
        SystemLog.query.filter(SystemLog.match_id.in_(match_ids)).delete(synchronize_session=False)

    # Delete match outcomes
    MatchOutcome.query.filter(MatchOutcome.match_id.in_(match_ids)).delete(synchronize_session=False)
@@ -1327,3 +1341,40 @@ def api_match_detail(match_id):
        return jsonify({'error': 'Failed to retrieve match details'}), 500
    return _api_match_detail()

@bp.route('/download/zip/<int:match_id>')
@login_required
@require_active_user
def download_zip(match_id):
    """Download ZIP file for a match"""
    try:
        from app.models import Match
        from flask import send_file, abort
        from werkzeug.exceptions import HTTPException

        # Get the match
        if current_user.is_admin:
            match = Match.query.get_or_404(match_id)
        else:
            match = Match.query.filter_by(id=match_id, created_by=current_user.id).first_or_404()

        # Check if ZIP file exists and is completed
        if not match.zip_filename or match.zip_upload_status != 'completed':
            flash('ZIP file not available for this match', 'error')
            abort(404)

        # Construct file path
        zip_path = os.path.join(current_app.config.get('UPLOAD_FOLDER', 'uploads'), match.zip_filename)
        if not os.path.exists(zip_path):
            flash('ZIP file not found on disk', 'error')
            abort(404)

        # Log the download
        logger.info(f"ZIP file downloaded: {match.zip_filename} by user {current_user.username}")

        return send_file(zip_path, as_attachment=True, download_name=match.zip_filename)
    except HTTPException:
        # Let deliberate 404s propagate instead of being swallowed
        # by the generic handler below
        raise
    except Exception as e:
        logger.error(f"ZIP download error: {str(e)}")
        flash('Error downloading ZIP file', 'error')
        abort(500)
@@ -123,6 +123,9 @@ class Match(db.Model):
    fixture_id = db.Column(db.String(255), nullable=False, index=True)
    active_status = db.Column(db.Boolean, default=False, index=True)

    # Fixture active time (unix timestamp when all matches in fixture become active)
    fixture_active_time = db.Column(db.BigInteger, nullable=True, index=True)

    # ZIP file related fields
    zip_filename = db.Column(db.String(1024))
    zip_sha1sum = db.Column(db.String(255), index=True)
@@ -159,9 +162,166 @@ class Match(db.Model):
        if self.zip_upload_status == 'completed' and self.zip_sha1sum:
            self.active_status = True
            db.session.commit()

            # Check if all matches in this fixture are now active
            self.update_fixture_active_time()
            return True
        return False
    def update_fixture_active_time(self):
        """Update fixture active time if all matches in fixture are now active"""
        try:
            import time

            # Get all matches in this fixture
            all_matches = Match.query.filter_by(fixture_id=self.fixture_id).all()

            # Check if all matches are active
            all_active = all([match.active_status for match in all_matches])

            if all_active:
                # Update fixture_active_time for all matches in this fixture if not already set
                current_time = int(time.time())

                # Only update if not already set (preserve the original activation time)
                matches_to_update = Match.query.filter_by(
                    fixture_id=self.fixture_id,
                    fixture_active_time=None
                ).all()

                for match in matches_to_update:
                    match.fixture_active_time = current_time

                if matches_to_update:
                    db.session.commit()
                    import logging
                    logger = logging.getLogger(__name__)
                    logger.info(f"Updated fixture {self.fixture_id} active time to {current_time}")
        except Exception as e:
            import logging
            logger = logging.getLogger(__name__)
            logger.error(f"Failed to update fixture active time for {self.fixture_id}: {str(e)}")

    @classmethod
    def get_fixtures_with_active_time(cls, from_timestamp=None, limit=None):
        """Get fixtures ordered by active time, optionally from a specific timestamp with limit"""
        try:
            from sqlalchemy import func, and_

            # Base query to get fixture data with active time
            query = db.session.query(
                cls.fixture_id,
                cls.fixture_active_time,
                cls.filename,
                func.min(cls.created_at).label('created_at')
            ).filter(
                cls.fixture_active_time.isnot(None)
            ).group_by(
                cls.fixture_id,
                cls.fixture_active_time,
                cls.filename
            )

            # Apply timestamp filter if provided
            if from_timestamp is not None:
                query = query.filter(cls.fixture_active_time > from_timestamp)

            # Order by active time
            query = query.order_by(cls.fixture_active_time.asc())

            # Apply limit if provided
            if limit is not None:
                query = query.limit(limit)

            return query.all()
        except Exception as e:
            import logging
            logger = logging.getLogger(__name__)
            logger.error(f"Failed to get fixtures with active time: {str(e)}")
            return []

    @classmethod
    def get_last_activated_fixtures(cls, limit=10):
        """Get the last N activated fixtures ordered by active time (descending)"""
        try:
            from sqlalchemy import func, desc

            # Get fixtures ordered by active time descending (most recent first)
            query = db.session.query(
                cls.fixture_id,
                cls.fixture_active_time,
                cls.filename,
                func.min(cls.created_at).label('created_at')
            ).filter(
                cls.fixture_active_time.isnot(None)
            ).group_by(
                cls.fixture_id,
                cls.fixture_active_time,
                cls.filename
            ).order_by(desc(cls.fixture_active_time)).limit(limit)

            return query.all()
        except Exception as e:
            import logging
            logger = logging.getLogger(__name__)
            logger.error(f"Failed to get last activated fixtures: {str(e)}")
            return []

    @classmethod
    def backfill_fixture_active_times(cls):
        """Backfill fixture_active_time for existing active fixtures that don't have it set"""
        try:
            import time
            from sqlalchemy import func, and_
            import logging
            logger = logging.getLogger(__name__)

            # Find fixtures where all matches are active but fixture_active_time is NULL
            fixtures_needing_update = db.session.query(
                cls.fixture_id
            ).filter(
                and_(
                    cls.active_status == True,
                    cls.fixture_active_time.is_(None)
                )
            ).group_by(cls.fixture_id).all()

            updated_count = 0
            current_time = int(time.time())

            for fixture_row in fixtures_needing_update:
                fixture_id = fixture_row.fixture_id

                # Check if ALL matches in this fixture are active
                all_matches = cls.query.filter_by(fixture_id=fixture_id).all()
                all_active = all([match.active_status for match in all_matches])

                if all_active:
                    # Update all matches in this fixture to have the current timestamp
                    matches_to_update = cls.query.filter_by(
                        fixture_id=fixture_id,
                        fixture_active_time=None
                    ).all()

                    for match in matches_to_update:
                        match.fixture_active_time = current_time
                    updated_count += len(matches_to_update)

            if updated_count > 0:
                db.session.commit()
                logger.info(f"Backfilled fixture_active_time for {updated_count} matches")

            return updated_count
        except Exception as e:
            import logging
            logger = logging.getLogger(__name__)
            logger.error(f"Failed to backfill fixture active times: {str(e)}")
            return 0
    def add_outcome(self, column_name, float_value):
        """Add an outcome to this match"""
        outcome = MatchOutcome(
@@ -195,6 +355,7 @@ class Match(db.Model):
            'zip_sha1sum': self.zip_sha1sum,
            'zip_upload_status': self.zip_upload_status,
            'zip_upload_progress': float(self.zip_upload_progress) if self.zip_upload_progress else 0.0,
            'fixture_active_time': self.fixture_active_time,
            'created_at': self.created_at.isoformat() if self.created_at else None,
            'updated_at': self.updated_at.isoformat() if self.updated_at else None
        }
@@ -625,6 +786,7 @@ class SystemSettings(db.Model):
            ('max_upload_size_mb', 2048, 'integer', 'Maximum file upload size in MB'),
            ('session_timeout_hours', 24, 'integer', 'User session timeout in hours'),
            ('api_rate_limit_per_minute', 60, 'integer', 'API rate limit per minute per IP'),
            ('api_updates_default_count', 10, 'integer', 'Default number of fixtures returned by /api/updates when no from parameter is provided'),
        ]

        for key, default_value, value_type, description in defaults:
@@ -768,14 +768,18 @@
    }, 3000);
}

// Prevent form submission only for file upload forms (we handle them with AJAX)
document.addEventListener('DOMContentLoaded', function() {
    // Only prevent submission for forms that contain file inputs (upload forms)
    const fileUploadForms = document.querySelectorAll('.upload-form');
    fileUploadForms.forEach(form => {
        const hasFileInput = form.querySelector('input[type="file"]');
        if (hasFileInput) {
            form.addEventListener('submit', function(e) {
                e.preventDefault();
                return false;
            });
        }
    });

    // Handle individual match file inputs
@@ -890,13 +890,16 @@
        });
    });

    // Only prevent submission for forms that contain file inputs (upload forms)
    const uploadForms = document.querySelectorAll('.upload-form');
    uploadForms.forEach(form => {
        const hasFileInput = form.querySelector('input[type="file"]');
        if (hasFileInput) {
            form.addEventListener('submit', function(e) {
                e.preventDefault();
                return false;
            });
        }
    });
});
@@ -208,8 +208,21 @@ class FileUploadHandler:
        # Generate secure filename
        original_filename = file.filename
        sanitized_name = sanitize_filename(original_filename)

        if file_type == 'zip':
            # For ZIP files, use SHA1 of unix timestamp + original filename
            import time
            unix_timestamp = str(int(time.time()))
            hash_input = unix_timestamp + original_filename
            sha1_hash = hashlib.sha1(hash_input.encode('utf-8')).hexdigest()

            # Extract file extension from original filename
            _, ext = os.path.splitext(sanitized_name)
            filename = sha1_hash + (ext or '.zip')
        else:
            # For other files, use timestamp-based naming
            timestamp = datetime.utcnow().strftime('%Y%m%d_%H%M%S')
            filename = "{}_{}".format(timestamp, sanitized_name)

        # Create upload directory if it doesn't exist
        if self.upload_folder is None:
@@ -40,9 +40,35 @@ class FixtureParser:
            return extension
        return 'csv'

    def find_header_row(self, df: pd.DataFrame) -> Optional[int]:
        """
        Find the row that contains 'Match #' in the first column

        Args:
            df: DataFrame to search in

        Returns:
            int or None: Row index of header row, or None if not found
        """
        try:
            for index, row in df.iterrows():
                # Check the first column (index 0)
                if len(row) > 0:
                    first_cell = str(row.iloc[0]).strip().lower()
                    if 'match #' in first_cell or 'match_number' in first_cell or 'match no' in first_cell:
                        logger.info(f"Found header row at index {index}: '{row.iloc[0]}'")
                        return index
            logger.warning("Header row with 'Match #' not found")
            return None
        except Exception as e:
            logger.error(f"Error finding header row: {str(e)}")
            return None

    def read_file(self, file_path: str) -> Optional[pd.DataFrame]:
        """
        Read file into pandas DataFrame with format detection and header row detection

        Args:
            file_path: Path to the file
@@ -57,9 +83,23 @@ class FixtureParser:
            # Try different encodings for CSV
            for encoding in self.encoding_options:
                try:
                    # First read without header to find the correct header row
                    df_raw = pd.read_csv(file_path, encoding=encoding, header=None)
                    logger.info(f"Successfully read CSV with encoding: {encoding}")

                    # Find the header row
                    header_row_index = self.find_header_row(df_raw)
                    if header_row_index is not None:
                        # Re-read with correct header row
                        df = pd.read_csv(file_path, encoding=encoding, header=header_row_index)
                        logger.info(f"Using header row at index {header_row_index}")
                        return df
                    else:
                        # Fallback to first row as header if no match found
                        df = pd.read_csv(file_path, encoding=encoding)
                        logger.warning("No 'Match #' header found, using first row as header")
                        return df
                except UnicodeDecodeError:
                    continue
                except Exception as e:
@@ -68,19 +108,40 @@ class FixtureParser:
            # If all encodings fail, try with error handling
            try:
                # encoding_errors requires pandas >= 1.3
                df_raw = pd.read_csv(file_path, encoding='utf-8', encoding_errors='replace', header=None)
                logger.warning("Read CSV with character replacement due to encoding issues")

                header_row_index = self.find_header_row(df_raw)
                if header_row_index is not None:
                    df = pd.read_csv(file_path, encoding='utf-8', encoding_errors='replace', header=header_row_index)
                    return df
                else:
                    df = pd.read_csv(file_path, encoding='utf-8', encoding_errors='replace')
                    return df
            except Exception as e:
                logger.error(f"Final CSV read attempt failed: {str(e)}")
                return None

        elif file_format in ['xlsx', 'xls']:
            try:
                # First read without header to find the correct header row
                df_raw = pd.read_excel(file_path, engine='openpyxl' if file_format == 'xlsx' else 'xlrd', header=None)
                logger.info(f"Successfully read {file_format.upper()} file")

                # Find the header row
                header_row_index = self.find_header_row(df_raw)
                if header_row_index is not None:
                    # Re-read with correct header row
                    df = pd.read_excel(file_path, engine='openpyxl' if file_format == 'xlsx' else 'xlrd', header=header_row_index)
                    logger.info(f"Using header row at index {header_row_index}")
                    return df
                else:
                    # Fallback to first row as header if no match found
                    df = pd.read_excel(file_path, engine='openpyxl' if file_format == 'xlsx' else 'xlrd')
                    logger.warning("No 'Match #' header found, using first row as header")
                    return df
            except Exception as e:
                logger.error(f"Excel read error: {str(e)}")
                return None
@@ -610,12 +610,23 @@ def api_upload_zip_stream(match_id):
        match.zip_upload_status = 'uploading'
        db.session.commit()

        # Generate secure filename using SHA1 for ZIP files
        from werkzeug.utils import secure_filename
        from datetime import datetime
        import time
        import hashlib

        sanitized_name = secure_filename(filename)

        # Use SHA1 naming for ZIP files (streaming uploads are typically ZIP files)
        unix_timestamp = str(int(time.time()))
        hash_input = unix_timestamp + filename
        sha1_hash = hashlib.sha1(hash_input.encode('utf-8')).hexdigest()

        # Extract file extension from original filename
        import os
        _, ext = os.path.splitext(sanitized_name)
        final_filename = sha1_hash + (ext or '.zip')

        # Create upload record
        file_handler = get_file_upload_handler()
#!/usr/bin/env python
"""
Backfill script to set fixture_active_time for existing active fixtures
Run this once to update existing data after adding the fixture_active_time column
"""
import sys
import os

# Add the project root to Python path
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

from app import create_app, db
from app.models import Match
import logging

def main():
    """Backfill fixture active times for existing data"""
    app = create_app()

    with app.app_context():
        print("Starting backfill of fixture_active_time...")

        # Run the backfill
        updated_count = Match.backfill_fixture_active_times()

        print(f"Backfill completed. Updated {updated_count} matches.")

        # Verify the results
        fixtures_with_time = db.session.query(Match.fixture_id).filter(
            Match.fixture_active_time.isnot(None)
        ).distinct().count()

        active_fixtures_total = db.session.query(Match.fixture_id).filter(
            Match.active_status == True
        ).distinct().count()

        print(f"Fixtures with active time: {fixtures_with_time}")
        print(f"Total active fixtures: {active_fixtures_total}")

        if fixtures_with_time > 0:
            print("✅ Backfill successful! The /api/updates endpoint should now return data.")
        else:
            print("⚠️ No fixtures were updated. Check if there are active fixtures in the database.")

if __name__ == '__main__':
    main()
#!/usr/bin/env python
import os
import sys

# Add the project directory to Python path
sys.path.insert(0, os.path.dirname(os.path.abspath(__file__)))

def run_migration():
    """Simple migration runner script"""
    try:
        # Set environment variable for Flask
        os.environ['FLASK_ENV'] = 'development'

        # Import Flask app
        from app import create_app, db

        # Create app context
        app = create_app()
        with app.app_context():
            print("Creating database tables...")
            db.create_all()

            print("Running migrations...")
            from app.database.migrations import run_migrations
            result = run_migrations()

            print("Migration result:", result)

            if result['status'] == 'success':
                print("SUCCESS: All migrations completed successfully!")
                return True
            elif result['status'] == 'partial':
                print("PARTIAL: Some migrations failed")
                print("Applied:", result.get('applied_count', 0))
                print("Failed:", result.get('failed_count', 0))
                return False
            else:
                print("ERROR: Migration failed")
                return False
    except Exception as e:
        print("ERROR: Migration failed with exception:", str(e))
        import traceback
        traceback.print_exc()
        return False

if __name__ == '__main__':
    success = run_migration()
    sys.exit(0 if success else 1)