Fix API timestamp format bug and complete fixtures management system (v1.2.3.1)

- Fixed critical API synchronization bug: send the fixture's Unix timestamp (e.g. '1755773200') instead of an ISO datetime string
- Updated _get_last_fixture_timestamp() to return str(fixture_active_time) so the server receives the long-integer value it expects
- Added comprehensive fixtures management system with dashboard integration
- Created fixtures list and detail pages with Bootstrap styling and admin controls
- Enhanced API client with proper timestamp handling and authentication for ZIP downloads
- Updated documentation (README.md, CHANGELOG.md, DOCUMENTATION.md) with new features
- Completed fixtures dashboard with real-time synchronization and administrative controls
- Fixed server communication to use proper Unix timestamp format: '1755773200' instead of '2025-08-21T14:31:3'
parent 800411c8
......@@ -25,6 +25,7 @@ All notable changes to this project will be documented in this file.
- **Path Resolution**: All paths now resolve to persistent user directories instead of temporary executable locations
- **Directory Creation**: Robust cross-platform directory creation with proper error handling and fallbacks
- **Database Location**: SQLite database now stored in persistent user data directory across all platforms
- **API Timestamp Format**: Fixed fixture synchronization timestamp format to send Unix timestamp as long integer instead of ISO datetime string
### Enhanced
- **User Data Management**: Automatic creation of logs/, data/, uploads/, and templates/ subdirectories
......
......@@ -358,6 +358,58 @@ python main.py --overlay-type native
- **sports**: Processes game scores and team information
- **custom**: User-defined processing logic
### Fixtures Management System
The application includes a comprehensive fixtures management system with real-time API synchronization and web dashboard integration:
#### Fixtures Dashboard
Access the fixtures management interface through the web dashboard:
- **Fixtures List**: View all synchronized fixtures with match counts and status
- **Fixture Details**: Detailed view of individual fixtures with all matches and outcomes
- **Admin Controls**: Reset functionality to clear all fixture data and ZIP files
- **Real-time Updates**: Live synchronization with server using proper timestamp handling
#### API Synchronization
The fixtures system synchronizes with the server API using Unix timestamp format:
```http
POST /api/updates
Authorization: Bearer <token>
Content-Type: application/json

{
  "from": "1755773200"
}
```
**Response Format:**
```json
{
  "fixtures": [
    {
      "fixture_id": "fixture_123",
      "fixture_active_time": 1755773200,
      "matches": [
        {
          "match_number": 101,
          "fighter1_township": "Kampala Central",
          "fighter2_township": "Nakawa",
          "fixture_id": "fixture_123",
          "fixture_active_time": 1755773200,
          "outcomes": {
            "round_1_score": 10.5,
            "round_2_score": 9.8
          }
        }
      ]
    }
  ]
}
```
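Both the request and the response carry `fixture_active_time` as a Unix timestamp (seconds since the epoch), with the request's `from` field serialized as a numeric string. A minimal sketch of moving between that wire format and Python `datetime` objects, reusing the example value above (helper names are illustrative, not part of the client API):

```python
from datetime import datetime, timezone

def to_wire(dt: datetime) -> str:
    """Serialize a datetime as the numeric-string Unix timestamp used in 'from'."""
    return str(int(dt.timestamp()))

def from_wire(value) -> datetime:
    """Parse a Unix timestamp (int or numeric string) into an aware UTC datetime."""
    return datetime.fromtimestamp(int(value), tz=timezone.utc)

stamp = from_wire("1755773200")
assert to_wire(stamp) == "1755773200"  # round-trips exactly
```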
### Match Data Management
The application includes comprehensive boxing match data management with database tables adapted from the mbetterd system:
......@@ -370,6 +422,7 @@ The application includes comprehensive boxing match data management with databas
- File metadata and SHA1 checksums
- ZIP upload tracking with progress
- User association and timestamps
- `fixture_active_time` field for server synchronization (Unix timestamp)
**match_outcomes table**: Detailed match results
- Foreign key relationships to matches
......
......@@ -25,6 +25,8 @@ A cross-platform multimedia client application with video playback, web dashboar
- **Boxing Match Database**: Added comprehensive `matches` and `match_outcomes` database tables adapted from mbetterd MySQL schema
- **Cross-Platform Persistence**: Complete PyInstaller executable persistence with platform-specific user directories
- **Match Data Management**: Full SQLAlchemy models for boxing match tracking with fighter townships, venues, and outcomes
- **Fixtures Management System**: Complete fixtures dashboard with API synchronization, database integration, and administrative controls
- **API Synchronization**: Real-time fixture and match data synchronization with server using proper Unix timestamp format
- **File Upload Tracking**: ZIP file upload management with progress tracking and status monitoring
- **Database Migration System**: Migration_008_AddMatchTables with comprehensive indexing and foreign key relationships
- **User Data Directories**: Automatic creation of persistent directories on Windows (%APPDATA%), macOS (~/Library/Application Support), and Linux (~/.local/share)
......
......@@ -6,6 +6,8 @@ import time
import logging
import json
import threading
import os
from pathlib import Path
from datetime import datetime, timedelta
from typing import Dict, Any, Optional, List, Union
from urllib.parse import urljoin, urlparse
......@@ -18,6 +20,8 @@ from ..core.message_bus import MessageBus, Message, MessageType, MessageBuilder
from ..config.settings import ApiConfig
from ..config.manager import ConfigManager
from ..database.manager import DatabaseManager
from ..database.models import MatchModel, MatchOutcomeModel
from ..config.settings import get_user_data_dir
logger = logging.getLogger(__name__)
......@@ -193,6 +197,217 @@ class SportsResponseHandler(ResponseHandler):
return self.handle_error(endpoint, e)
class UpdatesResponseHandler(ResponseHandler):
"""Response handler for /api/updates endpoint - synchronizes match data"""
def __init__(self, db_manager, user_data_dir, api_client=None):
self.db_manager = db_manager
self.user_data_dir = user_data_dir
self.api_client = api_client # Reference to parent API client for token access
self.zip_storage_dir = Path(user_data_dir) / "zip_files"
self.zip_storage_dir.mkdir(parents=True, exist_ok=True)
def handle_response(self, endpoint: APIEndpoint, response: requests.Response) -> Optional[Dict[str, Any]]:
try:
data = response.json()
processed_data = {
'source': endpoint.name,
'timestamp': datetime.utcnow().isoformat(),
'synchronized_matches': 0,
'downloaded_zips': 0,
'errors': []
}
# Extract fixtures from response - new API format has fixtures containing matches
fixtures = []
if isinstance(data, dict):
if 'fixtures' in data:
fixtures = data.get('fixtures', [])
elif 'matches' in data:
# Fallback for old format - treat as single fixture with matches
fixtures = [{'matches': data.get('matches', [])}]
elif isinstance(data, list):
# Direct array of matches - treat as single fixture
fixtures = [{'matches': data}]
if not fixtures:
processed_data['message'] = "No fixtures in response"
return processed_data
session = self.db_manager.get_session()
try:
for fixture_data in fixtures:
matches = fixture_data.get('matches', [])
for match_data in matches:
try:
# Add fixture-level data to match if available
if 'fixture_id' in fixture_data:
match_data['fixture_id'] = fixture_data['fixture_id']
if 'fixture_active_time' in fixture_data:
match_data['fixture_active_time'] = fixture_data['fixture_active_time']
# Synchronize match data to database
self._synchronize_match(session, match_data)
processed_data['synchronized_matches'] += 1
# Download ZIP file if available (check match-level zip_download_url)
if 'zip_download_url' in match_data:
match_data['zip_url'] = match_data['zip_download_url']
if self._download_zip_file(match_data):
processed_data['downloaded_zips'] += 1
except Exception as e:
error_msg = f"Failed to process match {match_data.get('match_number', 'unknown')} in fixture {fixture_data.get('fixture_id', 'unknown')}: {e}"
logger.error(error_msg)
processed_data['errors'].append(error_msg)
# Continue processing other matches even if this one fails
continue
session.commit()
finally:
session.close()
logger.info(f"Synchronized {processed_data['synchronized_matches']} matches, downloaded {processed_data['downloaded_zips']} ZIP files")
return processed_data
except Exception as e:
logger.error(f"Failed to process updates response: {e}")
return self.handle_error(endpoint, e)
def _synchronize_match(self, session, match_data: Dict[str, Any]):
"""Synchronize match data to database"""
try:
match_number = match_data.get('match_number')
fixture_id = match_data.get('fixture_id')
if not match_number or not fixture_id:
logger.warning(f"Skipping match with missing match_number ({match_number}) or fixture_id ({fixture_id})")
return
# Check if match already exists by both match_number AND fixture_id
existing_match = session.query(MatchModel).filter_by(
match_number=match_number,
fixture_id=fixture_id
).first()
if existing_match:
# Update existing match
for key, value in match_data.items():
if hasattr(existing_match, key) and key not in ['id', 'created_at']:
setattr(existing_match, key, value)
existing_match.updated_at = datetime.utcnow()
match = existing_match
logger.debug(f"Updated existing match {match_number} for fixture {fixture_id}")
else:
# Create new match
match = MatchModel(
match_number=match_data.get('match_number'),
fighter1_township=match_data.get('fighter1_township', ''),
fighter2_township=match_data.get('fighter2_township', ''),
venue_kampala_township=match_data.get('venue_kampala_township', ''),
start_time=self._parse_datetime(match_data.get('start_time')),
end_time=self._parse_datetime(match_data.get('end_time')),
result=match_data.get('result'),
filename=match_data.get('filename', ''),
file_sha1sum=match_data.get('file_sha1sum', ''),
fixture_id=fixture_id,
active_status=match_data.get('active_status', False),
zip_filename=match_data.get('zip_filename'),
zip_sha1sum=match_data.get('zip_sha1sum'),
zip_upload_status=match_data.get('zip_upload_status', 'pending'),
zip_upload_progress=match_data.get('zip_upload_progress', 0.0),
done=match_data.get('done', False),
running=match_data.get('running', False),
fixture_active_time=match_data.get('fixture_active_time')
)
session.add(match)
logger.debug(f"Created new match {match_number} for fixture {fixture_id}")
# Flush to get the match ID for outcomes
session.flush()
# Handle match outcomes
outcomes_data = match_data.get('outcomes', {})
if outcomes_data:
# Remove existing outcomes
session.query(MatchOutcomeModel).filter_by(match_id=match.id).delete()
# Add new outcomes
for column_name, float_value in outcomes_data.items():
outcome = MatchOutcomeModel(
match_id=match.id,
column_name=column_name,
float_value=float(float_value)
)
session.add(outcome)
except Exception as e:
logger.error(f"Failed to synchronize match: {e}")
# Rollback this session to clean state
session.rollback()
raise
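`_synchronize_match` keys its update-or-insert on the (`match_number`, `fixture_id`) pair. A stripped-down sketch of that upsert pattern, with a plain dict standing in for the SQLAlchemy session (all names here are illustrative):

```python
def upsert_match(store: dict, match_data: dict) -> dict:
    """Insert or update a match record keyed by (match_number, fixture_id)."""
    key = (match_data.get("match_number"), match_data.get("fixture_id"))
    if None in key:
        raise ValueError(f"missing match_number or fixture_id: {key}")
    existing = store.get(key)
    if existing:
        existing.update(match_data)  # update in place, as setattr() does on the model
        return existing
    store[key] = dict(match_data)    # create a new record
    return store[key]

store = {}
upsert_match(store, {"match_number": 101, "fixture_id": "fixture_123", "result": None})
upsert_match(store, {"match_number": 101, "fixture_id": "fixture_123", "result": "win"})
assert len(store) == 1                               # same key updated, not duplicated
assert store[(101, "fixture_123")]["result"] == "win"
```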
def _download_zip_file(self, match_data: Dict[str, Any]) -> bool:
"""Download ZIP file to persistent storage with API token authentication"""
try:
zip_url = match_data.get('zip_url')
zip_filename = match_data.get('zip_filename')
if not zip_url or not zip_filename:
return False
# Prepare headers with API token authentication
headers = {}
# Get API token from the API client's fastapi_main endpoint
if self.api_client and hasattr(self.api_client, 'endpoints'):
fastapi_endpoint = self.api_client.endpoints.get('fastapi_main')
if fastapi_endpoint and fastapi_endpoint.auth:
if fastapi_endpoint.auth.get('type') == 'bearer':
token = fastapi_endpoint.auth.get('token')
if token:
headers['Authorization'] = f"Bearer {token}"
logger.debug(f"Using API token for ZIP download: {zip_filename}")
if not headers:
logger.warning(f"No API token available for ZIP download: {zip_filename}")
# Download ZIP file with authentication
response = requests.get(zip_url, stream=True, timeout=30, headers=headers)
response.raise_for_status()
# Save to persistent storage
zip_path = self.zip_storage_dir / zip_filename
with open(zip_path, 'wb') as f:
for chunk in response.iter_content(chunk_size=8192):
f.write(chunk)
logger.info(f"Downloaded ZIP file: {zip_filename}")
return True
except Exception as e:
logger.error(f"Failed to download ZIP file: {e}")
return False
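The chunked write in `_download_zip_file` keeps memory usage flat regardless of ZIP size. The same pattern, sketched against an in-memory source so it runs without a network (the 8192-byte chunk size matches the code above; `save_stream` is a stand-in name):

```python
import io
import tempfile
from pathlib import Path

def save_stream(chunks, dest: Path) -> int:
    """Write an iterable of byte chunks to dest, returning total bytes written."""
    total = 0
    with open(dest, "wb") as f:
        for chunk in chunks:
            f.write(chunk)
            total += len(chunk)
    return total

# Stand-in for response.iter_content(chunk_size=8192) on a 20 kB payload
source = io.BytesIO(b"x" * 20000)
chunks = iter(lambda: source.read(8192), b"")

with tempfile.TemporaryDirectory() as workdir:
    dest = Path(workdir) / "fixture_123.zip"
    assert save_stream(chunks, dest) == 20000
    assert dest.stat().st_size == 20000
```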
def _parse_datetime(self, datetime_str) -> Optional[datetime]:
"""Parse datetime string to datetime object"""
if not datetime_str:
return None
try:
# Try ISO format first
return datetime.fromisoformat(datetime_str.replace('Z', '+00:00'))
except (ValueError, TypeError, AttributeError):
try:
# Try common formats
return datetime.strptime(datetime_str, '%Y-%m-%d %H:%M:%S')
except (ValueError, TypeError):
return None
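The `_parse_datetime` cascade can be exercised standalone. A self-contained version of the same fallback logic, useful for checking which formats the sync accepts:

```python
from datetime import datetime
from typing import Optional

def parse_datetime(value: Optional[str]) -> Optional[datetime]:
    """Parse an ISO 8601 or 'YYYY-MM-DD HH:MM:SS' string; return None on failure."""
    if not value:
        return None
    try:
        # ISO format first; normalize a trailing 'Z' to an explicit UTC offset
        return datetime.fromisoformat(value.replace("Z", "+00:00"))
    except (ValueError, TypeError):
        try:
            return datetime.strptime(value, "%Y-%m-%d %H:%M:%S")
        except (ValueError, TypeError):
            return None

assert parse_datetime("2025-08-21 14:31:30") == datetime(2025, 8, 21, 14, 31, 30)
assert parse_datetime("not a date") is None
assert parse_datetime(None) is None
```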
class APIClient(ThreadedComponent):
"""REST API Client component"""
......@@ -215,7 +430,8 @@ class APIClient(ThreadedComponent):
self.response_handlers = {
'default': DefaultResponseHandler(),
'news': NewsResponseHandler(),
'sports': SportsResponseHandler()
'sports': SportsResponseHandler(),
'updates': UpdatesResponseHandler(self.db_manager, get_user_data_dir(), self)
}
# Statistics
......@@ -289,13 +505,15 @@ class APIClient(ThreadedComponent):
def _get_default_endpoints(self) -> Dict[str, Dict[str, Any]]:
"""Get default API endpoints configuration"""
# Get FastAPI server URL, token, and interval from configuration
fastapi_url = "https://mbetter.nexlab.net/api/updates"
api_token = ""
api_interval = 1800 # 30 minutes default
try:
api_config = self.config_manager.get_section_config("api") or {}
fastapi_url = api_config.get("fastapi_url", fastapi_url)
api_token = api_config.get("api_token", api_token)
api_interval = api_config.get("api_interval", api_interval)
except Exception as e:
logger.warning(f"Could not load API configuration, using defaults: {e}")
......@@ -309,22 +527,22 @@ class APIClient(ThreadedComponent):
"type": "bearer",
"token": api_token.strip()
}
# When token is provided: use configurable interval, 10 retries every 30 seconds
interval = api_interval
retry_attempts = 10
retry_delay = 30 # 30 seconds
enabled = True
else:
# When no token: use configurable interval or default, fewer retries, disabled by default
interval = api_interval if api_interval != 1800 else 600 # Use configured or 10 minutes default
retry_attempts = 3
retry_delay = 5
enabled = False # Disabled when no authentication
return {
"fastapi_main": {
"url": fastapi_url, # Use the exact URL provided, don't modify it
"method": "POST", # Use POST for /api/updates endpoint
"headers": headers,
"auth": auth_config,
"interval": interval,
......@@ -332,7 +550,7 @@ class APIClient(ThreadedComponent):
"timeout": 30,
"retry_attempts": retry_attempts,
"retry_delay": retry_delay,
"response_handler": "updates" # Use updates handler for match synchronization
},
"news_example": {
"url": "https://newsapi.org/v2/top-headlines",
......@@ -428,6 +646,30 @@ class APIClient(ThreadedComponent):
if endpoint.should_execute():
self._execute_endpoint_request(endpoint)
def _get_last_fixture_timestamp(self) -> Optional[str]:
"""Get the server activation timestamp of the last active fixture in the database"""
try:
session = self.db_manager.get_session()
try:
# Get the most recent match with fixture_active_time set
last_active_match = session.query(MatchModel).filter(
MatchModel.fixture_active_time.isnot(None)
).order_by(MatchModel.fixture_active_time.desc()).first()
if last_active_match and last_active_match.fixture_active_time:
# Return Unix timestamp as string (long integer number)
return str(last_active_match.fixture_active_time)
else:
# No fixtures with activation time found - don't send 'from' parameter
return None
finally:
session.close()
except Exception as e:
logger.error(f"Failed to get last fixture activation timestamp: {e}")
return None
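Combined with the request logic that follows, `_get_last_fixture_timestamp` yields a payload of either `{"from": "<unix-ts>"}` or an empty object (first sync requests everything). A sketch of that decision (the helper name is a stand-in, not part of the class):

```python
import json
from typing import Optional

def build_updates_payload(last_timestamp: Optional[str]) -> dict:
    """Build the JSON body for POST /api/updates: send 'from' only when known."""
    return {"from": last_timestamp} if last_timestamp else {}

assert build_updates_payload("1755773200") == {"from": "1755773200"}
assert build_updates_payload(None) == {}  # no fixtures yet: request all data
# Compact serialization, as sent on the wire
assert json.dumps(build_updates_payload("1755773200"), separators=(",", ":")) == '{"from":"1755773200"}'
```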
def _execute_endpoint_request(self, endpoint: APIEndpoint):
"""Execute a single API request with custom retry logic for token-based endpoints"""
try:
......@@ -445,10 +687,38 @@ class APIClient(ThreadedComponent):
'timeout': endpoint.timeout
}
# Prepare data/params based on method
request_data = endpoint.params.copy() if endpoint.method == 'GET' else endpoint.data.copy()
# For FastAPI /api/updates endpoint, add 'from' parameter if we have fixtures
if endpoint.name == 'fastapi_main' and 'updates' in endpoint.url.lower():
last_timestamp = self._get_last_fixture_timestamp()
if last_timestamp:
# 'from' goes through request_data either way: query params for GET, JSON body for POST
request_data['from'] = last_timestamp
logger.debug(f"Adding 'from' parameter to {endpoint.name} request: {last_timestamp}")
else:
# When no fixtures exist, send empty request to get all data
logger.debug(f"No fixtures found, sending empty request to {endpoint.name}")
if endpoint.method == 'GET':
request_kwargs['params'] = request_data
else:
# For POST requests, always send JSON data (even if empty)
request_kwargs['json'] = request_data if request_data else {}
# Debug log the complete request data
logger.debug(f"Request method: {endpoint.method}")
logger.debug(f"Request headers: {request_kwargs['headers']}")
if endpoint.method == 'GET' and request_data:
logger.debug(f"Request params: {request_data}")
elif endpoint.method != 'GET' and request_data:
logger.debug(f"Request JSON data: {request_data}")
else:
logger.debug("Request data: (empty)")
# Add authentication if configured
if endpoint.auth:
......@@ -457,6 +727,10 @@ class APIClient(ThreadedComponent):
elif endpoint.auth.get('type') == 'basic':
request_kwargs['auth'] = (endpoint.auth.get('username'), endpoint.auth.get('password'))
# Generate curl command for easy debugging/replication (after auth is added)
curl_cmd = self._generate_curl_command(endpoint.method, endpoint.url, request_kwargs['headers'], request_data)
logger.debug(f"Curl equivalent: {curl_cmd}")
# Check if this is a token-based endpoint that needs custom retry logic
is_token_endpoint = (endpoint.auth and
endpoint.auth.get('type') == 'bearer' and
......@@ -472,6 +746,18 @@ class APIClient(ThreadedComponent):
response = self.session.request(**request_kwargs)
response.raise_for_status()
# Debug log the complete response
logger.debug(f"Response status code: {response.status_code}")
logger.debug(f"Response headers: {dict(response.headers)}")
try:
response_text = response.text
if len(response_text) > 1000:
logger.debug(f"Response body (truncated): {response_text[:1000]}...")
else:
logger.debug(f"Response body: {response_text}")
except Exception as e:
logger.debug(f"Could not read response body: {e}")
# Handle successful response
handler = self.response_handlers.get(endpoint.response_handler, self.response_handlers['default'])
processed_data = handler.handle_response(endpoint, response)
......@@ -515,6 +801,18 @@ class APIClient(ThreadedComponent):
response = self.session.request(**request_kwargs)
response.raise_for_status()
# Debug log the complete response (retry scenario)
logger.debug(f"Retry response status code: {response.status_code}")
logger.debug(f"Retry response headers: {dict(response.headers)}")
try:
response_text = response.text
if len(response_text) > 1000:
logger.debug(f"Retry response body (truncated): {response_text[:1000]}...")
else:
logger.debug(f"Retry response body: {response_text}")
except Exception as e:
logger.debug(f"Could not read retry response body: {e}")
# Handle successful response
handler = self.response_handlers.get(endpoint.response_handler, self.response_handlers['default'])
processed_data = handler.handle_response(endpoint, response)
......@@ -639,16 +937,20 @@ class APIClient(ThreadedComponent):
old_has_token = bool(old_token and old_token.strip())
new_has_token = bool(new_token and new_token.strip())
# Get current interval setting
current_api_config = self.config_manager.get_section_config("api") or {}
current_interval = current_api_config.get("api_interval", 1800) # 30 minutes default
if not old_has_token and new_has_token:
# Token was added - start timer
fastapi_endpoint.enabled = True
fastapi_endpoint.interval = current_interval
fastapi_endpoint.retry_attempts = 10
fastapi_endpoint.retry_delay = 30
fastapi_endpoint.consecutive_failures = 0 # Reset failure count
fastapi_endpoint.last_request = None # Reset to trigger immediate first request
logger.info(f"FastAPI timer started - token configured, {current_interval}-second interval enabled")
# Send immediate status update
status_message = Message(
......@@ -658,7 +960,7 @@ class APIClient(ThreadedComponent):
"status": "timer_started",
"endpoint": "fastapi_main",
"reason": "token_configured",
"interval_seconds": current_interval
}
)
self.message_bus.publish(status_message)
......@@ -826,3 +1128,36 @@ class APIClient(ThreadedComponent):
except Exception as e:
logger.error(f"APIClient shutdown error: {e}")
def _generate_curl_command(self, method, url, headers, data=None):
"""Generate a curl command equivalent to the current request for debugging purposes."""
import json
import shlex
# Start with basic curl command
cmd_parts = ['curl', '-X', method]
# Add headers
for key, value in headers.items():
cmd_parts.extend(['-H', f'{key}: {value}'])
# Add data based on method
if method == 'GET' and data:
# For GET requests, params are added to URL
from urllib.parse import urlencode
query_string = urlencode(data)
url = f"{url}?{query_string}"
elif method == 'POST':
# For POST requests, always add JSON data (even if empty)
json_str = json.dumps(data if data else {}, separators=(',', ':'))
cmd_parts.extend(['-d', json_str])
# Add URL (always last)
cmd_parts.append(url)
# Use shlex.join for proper shell escaping (Python 3.8+)
try:
return shlex.join(cmd_parts)
except AttributeError:
# Fallback for Python < 3.8
return ' '.join(shlex.quote(arg) for arg in cmd_parts)
\ No newline at end of file
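`_generate_curl_command` can be exercised on its own. A trimmed copy of the same logic, with example output for the /api/updates request (URL and token below are illustrative placeholders):

```python
import json
import shlex
from typing import Optional

def generate_curl(method: str, url: str, headers: dict,
                  data: Optional[dict] = None) -> str:
    """Build a copy-pasteable curl equivalent of an HTTP request."""
    parts = ["curl", "-X", method]
    for key, value in headers.items():
        parts.extend(["-H", f"{key}: {value}"])
    if method == "POST":
        # POST always carries a JSON body, even if empty
        parts.extend(["-d", json.dumps(data or {}, separators=(",", ":"))])
    parts.append(url)  # URL goes last
    return shlex.join(parts)  # proper shell escaping, Python 3.8+

cmd = generate_curl("POST", "https://example.com/api/updates",
                    {"Authorization": "Bearer TOKEN"}, {"from": "1755773200"})
assert cmd == ("curl -X POST -H 'Authorization: Bearer TOKEN' "
               "-d '{\"from\":\"1755773200\"}' https://example.com/api/updates")
```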
......@@ -457,19 +457,25 @@ class MessageBuilder:
)
@staticmethod
def api_request(sender: str, url: Optional[str] = None, endpoint: Optional[str] = None, method: str = "GET",
headers: Optional[Dict[str, str]] = None,
data: Optional[Dict[str, Any]] = None) -> Message:
"""Create API_REQUEST message"""
message_data = {
"method": method,
"headers": headers or {},
"data": data or {}
}
if endpoint:
message_data["endpoint"] = endpoint
if url:
message_data["url"] = url
return Message(
type=MessageType.API_REQUEST,
sender=sender,
data=message_data
)
@staticmethod
......
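The refactored `api_request` builder accepts either a raw `url` or a named `endpoint` and includes only the keys that were supplied. A sketch of the same keyword logic with a plain dict standing in for `Message` (the real message also carries `type` and `sender`):

```python
from typing import Any, Dict, Optional

def build_api_request(method: str = "GET",
                      url: Optional[str] = None,
                      endpoint: Optional[str] = None,
                      headers: Optional[Dict[str, str]] = None,
                      data: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
    """Assemble the message payload, adding url/endpoint keys only when given."""
    payload = {"method": method, "headers": headers or {}, "data": data or {}}
    if endpoint:
        payload["endpoint"] = endpoint
    if url:
        payload["url"] = url
    return payload

by_name = build_api_request(method="POST", endpoint="fastapi_main")
assert by_name["endpoint"] == "fastapi_main" and "url" not in by_name
```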
......@@ -519,6 +519,304 @@ class Migration_008_AddMatchTables(DatabaseMigration):
return False
class Migration_009_AddDoneFieldToMatches(DatabaseMigration):
"""Add done flag field to matches table"""
def __init__(self):
super().__init__("009", "Add done flag field to matches table")
def up(self, db_manager) -> bool:
"""Add done column to matches table"""
try:
with db_manager.engine.connect() as conn:
# Check if done column already exists
result = conn.execute(text("PRAGMA table_info(matches)"))
columns = [row[1] for row in result.fetchall()]
if 'done' not in columns:
# Add done column with default value False
conn.execute(text("""
ALTER TABLE matches
ADD COLUMN done BOOLEAN DEFAULT FALSE NOT NULL
"""))
# Add index for done column
conn.execute(text("""
CREATE INDEX IF NOT EXISTS ix_matches_done ON matches(done)
"""))
conn.commit()
logger.info("Done column added to matches table")
else:
logger.info("Done column already exists in matches table")
return True
except Exception as e:
logger.error(f"Failed to add done field to matches: {e}")
return False
def down(self, db_manager) -> bool:
"""Remove done column - SQLite doesn't support DROP COLUMN easily"""
logger.warning("SQLite doesn't support DROP COLUMN - done column will remain")
return True
class Migration_010_AddRunningFieldToMatches(DatabaseMigration):
"""Add running flag field to matches table"""
def __init__(self):
super().__init__("010", "Add running flag field to matches table")
def up(self, db_manager) -> bool:
"""Add running column to matches table"""
try:
with db_manager.engine.connect() as conn:
# Check if running column already exists
result = conn.execute(text("PRAGMA table_info(matches)"))
columns = [row[1] for row in result.fetchall()]
if 'running' not in columns:
# Add running column with default value False
conn.execute(text("""
ALTER TABLE matches
ADD COLUMN running BOOLEAN DEFAULT FALSE NOT NULL
"""))
# Add index for running column
conn.execute(text("""
CREATE INDEX IF NOT EXISTS ix_matches_running ON matches(running)
"""))
conn.commit()
logger.info("Running column added to matches table")
else:
logger.info("Running column already exists in matches table")
return True
except Exception as e:
logger.error(f"Failed to add running field to matches: {e}")
return False
def down(self, db_manager) -> bool:
"""Remove running column - SQLite doesn't support DROP COLUMN easily"""
logger.warning("SQLite doesn't support DROP COLUMN - running column will remain")
return True
class Migration_011_AddFixtureActiveTimeToMatches(DatabaseMigration):
"""Add fixture_active_time field to matches table for server activation timestamp tracking"""
def __init__(self):
super().__init__("011", "Add fixture_active_time field to matches table")
def up(self, db_manager) -> bool:
"""Add fixture_active_time field to matches table"""
try:
with db_manager.engine.connect() as conn:
# Check if fixture_active_time column already exists
result = conn.execute(text("PRAGMA table_info(matches)"))
columns = [row[1] for row in result.fetchall()]
if 'fixture_active_time' not in columns:
# Add fixture_active_time column (INTEGER for Unix timestamp)
conn.execute(text("""
ALTER TABLE matches
ADD COLUMN fixture_active_time INTEGER NULL
"""))
# Add index for fixture_active_time column
conn.execute(text("""
CREATE INDEX IF NOT EXISTS ix_matches_fixture_active_time
ON matches(fixture_active_time)
"""))
conn.commit()
logger.info("Fixture_active_time column added to matches table")
else:
logger.info("Fixture_active_time column already exists in matches table")
return True
except Exception as e:
logger.error(f"Failed to add fixture_active_time field to matches: {e}")
return False
def down(self, db_manager) -> bool:
"""Remove fixture_active_time column - SQLite doesn't support DROP COLUMN easily"""
logger.warning("SQLite doesn't support DROP COLUMN - fixture_active_time column will remain")
return True
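Migrations 009 through 011 all use the same SQLite idiom: consult `PRAGMA table_info` before `ALTER TABLE ... ADD COLUMN`, since SQLite raises an error on duplicate columns. The pattern isolated with the stdlib `sqlite3` module (function name is illustrative):

```python
import sqlite3

def add_column_if_missing(conn: sqlite3.Connection, table: str,
                          column: str, decl: str) -> bool:
    """Add a column unless PRAGMA table_info shows it already exists."""
    existing = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
    if column in existing:
        return False
    conn.execute(f"ALTER TABLE {table} ADD COLUMN {column} {decl}")
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE matches (id INTEGER PRIMARY KEY)")
assert add_column_if_missing(conn, "matches", "fixture_active_time", "INTEGER NULL")
assert not add_column_if_missing(conn, "matches", "fixture_active_time", "INTEGER NULL")  # idempotent
```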
class Migration_012_RemoveFixtureIdUniqueConstraint(DatabaseMigration):
"""Remove UNIQUE constraint from matches.fixture_id to allow multiple matches per fixture"""
def __init__(self):
super().__init__("012", "Remove UNIQUE constraint from matches.fixture_id")
def up(self, db_manager) -> bool:
"""Remove UNIQUE constraint from fixture_id column"""
try:
with db_manager.engine.connect() as conn:
# SQLite doesn't support ALTER TABLE DROP CONSTRAINT directly
# We need to recreate the table without the UNIQUE constraint
# Step 1: Create new table without UNIQUE constraint on fixture_id
conn.execute(text("""
CREATE TABLE IF NOT EXISTS matches_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
match_number INTEGER NOT NULL UNIQUE,
fighter1_township VARCHAR(255) NOT NULL,
fighter2_township VARCHAR(255) NOT NULL,
venue_kampala_township VARCHAR(255) NOT NULL,
start_time DATETIME NULL,
end_time DATETIME NULL,
result VARCHAR(255) NULL,
done BOOLEAN DEFAULT FALSE NOT NULL,
running BOOLEAN DEFAULT FALSE NOT NULL,
fixture_active_time INTEGER NULL,
filename VARCHAR(1024) NOT NULL,
file_sha1sum VARCHAR(255) NOT NULL,
fixture_id VARCHAR(255) NOT NULL,
active_status BOOLEAN DEFAULT FALSE,
zip_filename VARCHAR(1024) NULL,
zip_sha1sum VARCHAR(255) NULL,
zip_upload_status VARCHAR(20) DEFAULT 'pending',
zip_upload_progress REAL DEFAULT 0.0,
created_by INTEGER NULL REFERENCES users(id) ON DELETE SET NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
"""))
# Step 2: Copy data with explicit column lists - the live table gained
# done/running/fixture_active_time via ALTER TABLE (appended at the end),
# so its column order differs from matches_new and a positional
# SELECT * would silently misalign values
conn.execute(text("""
INSERT INTO matches_new (
id, match_number, fighter1_township, fighter2_township,
venue_kampala_township, start_time, end_time, result,
done, running, fixture_active_time,
filename, file_sha1sum, fixture_id, active_status,
zip_filename, zip_sha1sum, zip_upload_status, zip_upload_progress,
created_by, created_at, updated_at
)
SELECT id, match_number, fighter1_township, fighter2_township,
venue_kampala_township, start_time, end_time, result,
done, running, fixture_active_time,
filename, file_sha1sum, fixture_id, active_status,
zip_filename, zip_sha1sum, zip_upload_status, zip_upload_progress,
created_by, created_at, updated_at
FROM matches
"""))
# Step 3: Drop old table
conn.execute(text("DROP TABLE matches"))
# Step 4: Rename new table to original name
conn.execute(text("ALTER TABLE matches_new RENAME TO matches"))
# Step 5: Recreate indexes (without fixture_id unique constraint)
indexes = [
"CREATE INDEX IF NOT EXISTS ix_matches_match_number ON matches(match_number)",
"CREATE INDEX IF NOT EXISTS ix_matches_fixture_id ON matches(fixture_id)",
"CREATE INDEX IF NOT EXISTS ix_matches_active_status ON matches(active_status)",
"CREATE INDEX IF NOT EXISTS ix_matches_file_sha1sum ON matches(file_sha1sum)",
"CREATE INDEX IF NOT EXISTS ix_matches_zip_sha1sum ON matches(zip_sha1sum)",
"CREATE INDEX IF NOT EXISTS ix_matches_zip_upload_status ON matches(zip_upload_status)",
"CREATE INDEX IF NOT EXISTS ix_matches_created_by ON matches(created_by)",
"CREATE INDEX IF NOT EXISTS ix_matches_fixture_active_time ON matches(fixture_active_time)",
"CREATE INDEX IF NOT EXISTS ix_matches_done ON matches(done)",
"CREATE INDEX IF NOT EXISTS ix_matches_running ON matches(running)",
"CREATE INDEX IF NOT EXISTS ix_matches_composite ON matches(active_status, zip_upload_status, created_at)",
]
for index_sql in indexes:
conn.execute(text(index_sql))
conn.commit()
logger.info("UNIQUE constraint removed from matches.fixture_id column")
return True
except Exception as e:
logger.error(f"Failed to remove UNIQUE constraint from fixture_id: {e}")
return False
def down(self, db_manager) -> bool:
"""Add UNIQUE constraint back to fixture_id column"""
try:
with db_manager.engine.connect() as conn:
# Check if there are any duplicate fixture_ids that would prevent adding UNIQUE constraint
result = conn.execute(text("""
SELECT fixture_id, COUNT(*) as count
FROM matches
GROUP BY fixture_id
HAVING COUNT(*) > 1
"""))
duplicates = result.fetchall()
if duplicates:
logger.error(f"Cannot add UNIQUE constraint - duplicate fixture_ids found: {[row[0] for row in duplicates]}")
return False
# Recreate table with UNIQUE constraint on fixture_id
conn.execute(text("""
CREATE TABLE IF NOT EXISTS matches_new (
id INTEGER PRIMARY KEY AUTOINCREMENT,
match_number INTEGER NOT NULL UNIQUE,
fighter1_township VARCHAR(255) NOT NULL,
fighter2_township VARCHAR(255) NOT NULL,
venue_kampala_township VARCHAR(255) NOT NULL,
start_time DATETIME NULL,
end_time DATETIME NULL,
result VARCHAR(255) NULL,
done BOOLEAN DEFAULT FALSE NOT NULL,
running BOOLEAN DEFAULT FALSE NOT NULL,
fixture_active_time INTEGER NULL,
filename VARCHAR(1024) NOT NULL,
file_sha1sum VARCHAR(255) NOT NULL,
fixture_id VARCHAR(255) NOT NULL UNIQUE,
active_status BOOLEAN DEFAULT FALSE,
zip_filename VARCHAR(1024) NULL,
zip_sha1sum VARCHAR(255) NULL,
zip_upload_status VARCHAR(20) DEFAULT 'pending',
zip_upload_progress REAL DEFAULT 0.0,
created_by INTEGER NULL REFERENCES users(id) ON DELETE SET NULL,
created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
updated_at DATETIME DEFAULT CURRENT_TIMESTAMP
)
"""))
# Copy data from old table to new table
conn.execute(text("""
INSERT INTO matches_new
SELECT * FROM matches
"""))
# Drop old table and rename new table
conn.execute(text("DROP TABLE matches"))
conn.execute(text("ALTER TABLE matches_new RENAME TO matches"))
# Recreate indexes
indexes = [
"CREATE INDEX IF NOT EXISTS ix_matches_match_number ON matches(match_number)",
"CREATE INDEX IF NOT EXISTS ix_matches_fixture_id ON matches(fixture_id)",
"CREATE INDEX IF NOT EXISTS ix_matches_active_status ON matches(active_status)",
"CREATE INDEX IF NOT EXISTS ix_matches_file_sha1sum ON matches(file_sha1sum)",
"CREATE INDEX IF NOT EXISTS ix_matches_zip_sha1sum ON matches(zip_sha1sum)",
"CREATE INDEX IF NOT EXISTS ix_matches_zip_upload_status ON matches(zip_upload_status)",
"CREATE INDEX IF NOT EXISTS ix_matches_created_by ON matches(created_by)",
"CREATE INDEX IF NOT EXISTS ix_matches_fixture_active_time ON matches(fixture_active_time)",
"CREATE INDEX IF NOT EXISTS ix_matches_done ON matches(done)",
"CREATE INDEX IF NOT EXISTS ix_matches_running ON matches(running)",
"CREATE INDEX IF NOT EXISTS ix_matches_composite ON matches(active_status, zip_upload_status, created_at)",
]
for index_sql in indexes:
conn.execute(text(index_sql))
conn.commit()
logger.info("UNIQUE constraint added back to matches.fixture_id column")
return True
except Exception as e:
logger.error(f"Failed to add UNIQUE constraint back to fixture_id: {e}")
return False
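The migration above follows SQLite's standard workaround for the fact that column constraints cannot be altered in place: build a replacement table, copy the rows, swap the tables, and recreate the indexes. A minimal sketch of that rebuild-and-swap pattern, trimmed to two columns and using the stdlib `sqlite3` module directly rather than the project's SQLAlchemy connection:

```python
import sqlite3

def rebuild_matches_with_unique_fixture_id(conn):
    """Rebuild the matches table so fixture_id regains its UNIQUE constraint.

    SQLite cannot ALTER an existing column's constraints, so we create a new
    table with the desired schema, copy the rows, drop the old table, rename
    the new one into place, and recreate the indexes.
    """
    conn.execute("""CREATE TABLE matches_new (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        fixture_id VARCHAR(255) NOT NULL UNIQUE)""")
    # Column order must match between old and new table for SELECT * to work
    conn.execute("INSERT INTO matches_new SELECT * FROM matches")
    conn.execute("DROP TABLE matches")
    conn.execute("ALTER TABLE matches_new RENAME TO matches")
    conn.execute("CREATE INDEX IF NOT EXISTS ix_matches_fixture_id ON matches(fixture_id)")
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE matches (id INTEGER PRIMARY KEY, fixture_id VARCHAR(255) NOT NULL)")
conn.execute("INSERT INTO matches (fixture_id) VALUES ('fx-001')")
rebuild_matches_with_unique_fixture_id(conn)
print(conn.execute("SELECT fixture_id FROM matches").fetchall())
```

After the rebuild, a duplicate `fixture_id` insert raises `sqlite3.IntegrityError`, which is exactly the guarantee the downgrade path restores.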
# Registry of all migrations in order
MIGRATIONS: List[DatabaseMigration] = [
Migration_001_InitialSchema(),
......@@ -529,6 +827,10 @@ MIGRATIONS: List[DatabaseMigration] = [
Migration_006_AddUserRoles(),
Migration_007_CreateDefaultCashierUser(),
Migration_008_AddMatchTables(),
Migration_009_AddDoneFieldToMatches(),
Migration_010_AddRunningFieldToMatches(),
Migration_011_AddFixtureActiveTimeToMatches(),
Migration_012_RemoveFixtureIdUniqueConstraint(),
]
......
......@@ -455,9 +455,9 @@ class MatchModel(BaseModel):
Index('ix_matches_zip_sha1sum', 'zip_sha1sum'),
Index('ix_matches_zip_upload_status', 'zip_upload_status'),
Index('ix_matches_created_by', 'created_by'),
Index('ix_matches_fixture_active_time', 'fixture_active_time'),
Index('ix_matches_composite', 'active_status', 'zip_upload_status', 'created_at'),
UniqueConstraint('match_number', name='uq_matches_match_number'),
UniqueConstraint('fixture_id', name='uq_matches_fixture_id'),
)
# Core match data from fixture file
......@@ -470,11 +470,14 @@ class MatchModel(BaseModel):
start_time = Column(DateTime, comment='Match start time')
end_time = Column(DateTime, comment='Match end time')
result = Column(String(255), comment='Match result/outcome')
done = Column(Boolean, default=False, nullable=False, comment='Match completion flag (0=pending, 1=done)')
running = Column(Boolean, default=False, nullable=False, comment='Match running flag (0=not running, 1=running)')
fixture_active_time = Column(Integer, nullable=True, comment='Unix timestamp when fixture became active on server')
# File metadata
filename = Column(String(1024), nullable=False, comment='Original fixture filename')
file_sha1sum = Column(String(255), nullable=False, comment='SHA1 checksum of fixture file')
fixture_id = Column(String(255), nullable=False, unique=True, comment='Unique fixture identifier')
fixture_id = Column(String(255), nullable=False, comment='Fixture identifier (multiple matches can share same fixture)')
active_status = Column(Boolean, default=False, nullable=False, comment='Active status flag')
# ZIP file related fields
......
......@@ -151,6 +151,7 @@ def configuration():
# API settings
'fastapi_url': api_config.get('fastapi_url', 'https://mbetter.nexlab.net/api/updates'),
'api_token': api_config.get('api_token', ''),
'api_interval': api_config.get('api_interval', 1800),
'api_timeout': api_config.get('api_timeout', 30),
'api_enabled': api_config.get('api_enabled', True)
})
......@@ -167,6 +168,7 @@ def configuration():
'db_path': 'data/mbetterclient.db',
'fastapi_url': 'https://mbetter.nexlab.net/api/updates',
'api_token': '',
'api_interval': 1800,
'api_timeout': 30,
'api_enabled': True
}
......@@ -226,6 +228,51 @@ def api_tokens():
return render_template('errors/500.html'), 500
@main_bp.route('/fixtures')
@login_required
def fixtures():
"""Fixtures management page"""
try:
# Restrict cashier users from accessing fixtures page
if hasattr(current_user, 'role') and current_user.role == 'cashier':
flash("Access denied", "error")
return redirect(url_for('main.cashier_dashboard'))
elif hasattr(current_user, 'is_cashier_user') and current_user.is_cashier_user():
flash("Access denied", "error")
return redirect(url_for('main.cashier_dashboard'))
return render_template('dashboard/fixtures.html',
user=current_user,
page_title="Fixtures")
except Exception as e:
logger.error(f"Fixtures page error: {e}")
flash("Error loading fixtures", "error")
return render_template('errors/500.html'), 500
@main_bp.route('/fixtures/<int:match_id>')
@login_required
def fixture_details(match_id):
"""Fixture details page showing match and outcomes"""
try:
# Restrict cashier users from accessing fixture details page
if hasattr(current_user, 'role') and current_user.role == 'cashier':
flash("Access denied", "error")
return redirect(url_for('main.cashier_dashboard'))
elif hasattr(current_user, 'is_cashier_user') and current_user.is_cashier_user():
flash("Access denied", "error")
return redirect(url_for('main.cashier_dashboard'))
return render_template('dashboard/fixture_details.html',
user=current_user,
match_id=match_id,
page_title=f"Fixture Details - Match #{match_id}")
except Exception as e:
logger.error(f"Fixture details page error: {e}")
flash("Error loading fixture details", "error")
return render_template('errors/500.html'), 500
@main_bp.route('/logs')
@login_required
def logs():
......@@ -965,3 +1012,219 @@ def delete_template(template_name):
except Exception as e:
logger.error(f"Template deletion error: {e}")
return jsonify({"error": str(e)}), 500
@api_bp.route('/fixtures')
@api_bp.auth_manager.require_auth if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
def get_fixtures():
"""Get all fixtures/matches"""
try:
from ..database.models import MatchModel
session = api_bp.db_manager.get_session()
try:
matches = session.query(MatchModel).order_by(MatchModel.created_at.desc()).all()
fixtures_data = []
for match in matches:
match_data = match.to_dict()
fixtures_data.append(match_data)
return jsonify({
"success": True,
"fixtures": fixtures_data,
"total": len(fixtures_data)
})
finally:
session.close()
except Exception as e:
logger.error(f"API get fixtures error: {e}")
return jsonify({"error": str(e)}), 500
@api_bp.route('/fixtures/<int:match_id>')
@api_bp.auth_manager.require_auth if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
def get_fixture_details(match_id):
"""Get fixture details by match ID"""
try:
from ..database.models import MatchModel, MatchOutcomeModel
session = api_bp.db_manager.get_session()
try:
match = session.query(MatchModel).filter_by(id=match_id).first()
if not match:
return jsonify({"error": "Match not found"}), 404
match_data = match.to_dict()
# Get outcomes
outcomes = session.query(MatchOutcomeModel).filter_by(match_id=match_id).all()
match_data['outcomes'] = [outcome.to_dict() for outcome in outcomes]
return jsonify({
"success": True,
"match": match_data
})
finally:
session.close()
except Exception as e:
logger.error(f"API get fixture details error: {e}")
return jsonify({"error": str(e)}), 500
@api_bp.route('/fixtures/reset', methods=['POST'])
@api_bp.auth_manager.require_auth if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
@api_bp.auth_manager.require_admin if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
def reset_fixtures():
"""Reset all fixtures data (admin only) - clear matches, match_outcomes, and ZIP files"""
try:
from ..database.models import MatchModel, MatchOutcomeModel
from ..config.settings import get_user_data_dir
from pathlib import Path
import shutil
session = api_bp.db_manager.get_session()
try:
# Count existing data before reset
matches_count = session.query(MatchModel).count()
outcomes_count = session.query(MatchOutcomeModel).count()
# Clear all match outcomes first (due to foreign key constraints)
session.query(MatchOutcomeModel).delete()
session.commit()
# Clear all matches
session.query(MatchModel).delete()
session.commit()
# Clear ZIP files from persistent storage
zip_storage_dir = Path(get_user_data_dir()) / "zip_files"
zip_files_removed = 0
if zip_storage_dir.exists():
zip_files = list(zip_storage_dir.glob("*.zip"))
zip_files_removed = len(zip_files)
# Remove all ZIP files
for zip_file in zip_files:
try:
zip_file.unlink()
except Exception as e:
logger.warning(f"Failed to remove ZIP file {zip_file}: {e}")
logger.info(f"Removed {zip_files_removed} ZIP files from {zip_storage_dir}")
logger.info(f"Fixtures reset completed - Removed {matches_count} matches, {outcomes_count} outcomes, {zip_files_removed} ZIP files")
return jsonify({
"success": True,
"message": "Fixtures data reset successfully",
"removed": {
"matches": matches_count,
"outcomes": outcomes_count,
"zip_files": zip_files_removed
}
})
finally:
session.close()
except Exception as e:
logger.error(f"API fixtures reset error: {e}")
return jsonify({"error": str(e)}), 500
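The reset endpoint deletes `match_outcomes` before `matches` because of the foreign-key relationship between them; reversing the order would fail while child rows still reference their parents. A small stdlib `sqlite3` demonstration of why that ordering matters (table names match the project's schema; the two-column layout is simplified):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FK enforcement off by default
conn.execute("CREATE TABLE matches (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE match_outcomes (id INTEGER PRIMARY KEY,"
             " match_id INTEGER NOT NULL REFERENCES matches(id))")
conn.execute("INSERT INTO matches (id) VALUES (1)")
conn.execute("INSERT INTO match_outcomes (id, match_id) VALUES (1, 1)")

# Parent-first deletion violates the FK while a child row still points at it
try:
    conn.execute("DELETE FROM matches")
except sqlite3.IntegrityError as e:
    print("parent-first delete fails:", e)

# Children first, then parents — the order reset_fixtures uses
conn.execute("DELETE FROM match_outcomes")
conn.execute("DELETE FROM matches")
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM matches").fetchone()[0])
```

With SQLAlchemy sessions the same rule applies: committing the outcome deletion before touching `MatchModel` keeps the constraint satisfied at every step.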
@api_bp.route('/api-client/status')
@api_bp.auth_manager.require_auth if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
def get_api_client_status():
"""Get API client status and endpoint information"""
try:
# Check if we can get API client component status via message bus
status_data = {
"api_client_running": False,
"fastapi_endpoint": {},
"message_bus_available": bool(api_bp.message_bus),
"config_manager_available": bool(api_bp.config_manager)
}
# Try to get current API configuration
if api_bp.config_manager:
try:
api_config = api_bp.config_manager.get_section_config("api") or {}
status_data["api_config"] = {
"fastapi_url": api_config.get("fastapi_url", ""),
"api_token_set": bool(api_config.get("api_token", "").strip()),
"api_interval": api_config.get("api_interval", 1800),
"api_enabled": api_config.get("api_enabled", True)
}
except Exception as e:
status_data["config_error"] = str(e)
# Try to get API endpoints configuration
if api_bp.config_manager:
try:
endpoints_config = api_bp.config_manager.get_section_config("api_endpoints") or {}
fastapi_endpoint = endpoints_config.get("fastapi_main", {})
if fastapi_endpoint:
status_data["fastapi_endpoint"] = {
"url": fastapi_endpoint.get("url", ""),
"enabled": fastapi_endpoint.get("enabled", False),
"interval": fastapi_endpoint.get("interval", 0),
"auth_configured": bool(fastapi_endpoint.get("auth", {}).get("token", "")),
"last_request": fastapi_endpoint.get("last_request"),
"last_success": fastapi_endpoint.get("last_success"),
"consecutive_failures": fastapi_endpoint.get("consecutive_failures", 0),
"total_requests": fastapi_endpoint.get("total_requests", 0)
}
except Exception as e:
status_data["endpoints_error"] = str(e)
return jsonify({
"success": True,
"status": status_data
})
except Exception as e:
logger.error(f"API client status error: {e}")
return jsonify({"error": str(e)}), 500
@api_bp.route('/api-client/trigger', methods=['POST'])
@api_bp.auth_manager.require_auth if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
@api_bp.auth_manager.require_admin if hasattr(api_bp, 'auth_manager') and api_bp.auth_manager else login_required
def trigger_api_request():
"""Manually trigger an API request for testing"""
try:
data = request.get_json() or {}
endpoint_name = data.get('endpoint', 'fastapi_main')
# Send manual API request message
if api_bp.message_bus:
from ..core.message_bus import MessageBuilder, MessageType
api_request_message = MessageBuilder.api_request(
sender="web_dashboard",
endpoint=endpoint_name
)
api_bp.message_bus.publish(api_request_message)
logger.info(f"Manual API request triggered for endpoint: {endpoint_name}")
return jsonify({
"success": True,
"message": f"API request triggered for endpoint: {endpoint_name}"
})
else:
return jsonify({
"success": False,
"error": "Message bus not available"
}), 500
except Exception as e:
logger.error(f"API request trigger error: {e}")
return jsonify({"error": str(e)}), 500
\ No newline at end of file
......@@ -44,6 +44,12 @@
<i class="fas fa-layer-group me-1"></i>Templates
</a>
</li>
<li class="nav-item">
<a class="nav-link {% if request.endpoint == 'main.fixtures' %}active{% endif %}"
href="{{ url_for('main.fixtures') }}">
<i class="fas fa-list-ul me-1"></i>Fixtures
</a>
</li>
<li class="nav-item">
<a class="nav-link {% if request.endpoint == 'main.video_test' %}active{% endif %}"
href="{{ url_for('main.video_test') }}">
......
......@@ -44,6 +44,11 @@
<label for="api-token" class="form-label">API Token</label>
<input type="password" class="form-control" id="api-token" placeholder="Enter API token">
</div>
<div class="mb-3">
<label for="api-interval" class="form-label">Request Interval (seconds)</label>
<input type="number" class="form-control" id="api-interval" placeholder="1800" value="{{ config.api_interval or 1800 }}" min="30" max="86400">
<div class="form-text">How often to request updates from the MbetterD daemon (30 seconds to 24 hours)</div>
</div>
<button type="submit" class="btn btn-primary">Save Configuration</button>
</form>
</div>
......@@ -112,7 +117,8 @@
const config = {
api_host: document.getElementById('api-host').value,
api_port: parseInt(document.getElementById('api-port').value),
api_token: document.getElementById('api-token').value
api_token: document.getElementById('api-token').value,
api_interval: parseInt(document.getElementById('api-interval').value)
};
fetch('/api/config', {
......
......@@ -87,6 +87,12 @@
placeholder="Enter your API access token">
<div class="form-text">Authentication token for FastAPI server access</div>
</div>
<div class="mb-3">
<label for="api-interval" class="form-label">Request Interval (seconds)</label>
<input type="number" class="form-control" id="api-interval"
value="{{ config.api_interval or 1800 }}" min="30" max="86400">
<div class="form-text">Time between automatic API requests (30 seconds to 24 hours)</div>
</div>
<div class="mb-3">
<label for="api-timeout" class="form-label">Request Timeout (seconds)</label>
<input type="number" class="form-control" id="api-timeout"
......@@ -124,6 +130,24 @@
</form>
</div>
</div>
<!-- API Client Debug Section -->
<div class="card">
<div class="card-header">
<h5>API Client Debug</h5>
</div>
<div class="card-body">
<div class="mb-3">
<button type="button" class="btn btn-info" id="check-api-status">
Check API Client Status
</button>
<button type="button" class="btn btn-warning ms-2" id="trigger-api-request">
Trigger Manual Request
</button>
</div>
<div id="api-status-result" class="mt-3"></div>
</div>
</div>
</div>
</div>
</div>
......@@ -173,6 +197,7 @@
const config = {
fastapi_url: document.getElementById('fastapi-url').value,
api_token: document.getElementById('api-token').value,
api_interval: parseInt(document.getElementById('api-interval').value),
api_timeout: parseInt(document.getElementById('api-timeout').value),
api_enabled: document.getElementById('api-enabled').checked
};
......@@ -243,5 +268,92 @@
alert('Failed to save ' + section + ' configuration');
});
}
// Check API client status
document.getElementById('check-api-status').addEventListener('click', function() {
this.disabled = true;
this.textContent = 'Checking...';
fetch('/api/api-client/status')
.then(response => response.json())
.then(data => {
const resultDiv = document.getElementById('api-status-result');
if (data.success) {
const status = data.status;
let html = '<div class="alert alert-info"><strong>API Client Status:</strong><br>';
html += '<strong>Message Bus:</strong> ' + (status.message_bus_available ? 'Available' : 'Not Available') + '<br>';
html += '<strong>Config Manager:</strong> ' + (status.config_manager_available ? 'Available' : 'Not Available') + '<br>';
if (status.api_config) {
html += '<strong>API Configuration:</strong><br>';
html += '&nbsp;&nbsp;URL: ' + (status.api_config.fastapi_url || 'Not set') + '<br>';
html += '&nbsp;&nbsp;Token: ' + (status.api_config.api_token_set ? 'Set' : 'Not set') + '<br>';
html += '&nbsp;&nbsp;Interval: ' + status.api_config.api_interval + ' seconds<br>';
html += '&nbsp;&nbsp;Enabled: ' + status.api_config.api_enabled + '<br>';
}
if (status.fastapi_endpoint && Object.keys(status.fastapi_endpoint).length > 0) {
const ep = status.fastapi_endpoint;
html += '<strong>FastAPI Endpoint:</strong><br>';
html += '&nbsp;&nbsp;URL: ' + (ep.url || 'Not set') + '<br>';
html += '&nbsp;&nbsp;Enabled: ' + ep.enabled + '<br>';
html += '&nbsp;&nbsp;Interval: ' + ep.interval + ' seconds<br>';
html += '&nbsp;&nbsp;Auth Configured: ' + ep.auth_configured + '<br>';
html += '&nbsp;&nbsp;Total Requests: ' + ep.total_requests + '<br>';
html += '&nbsp;&nbsp;Consecutive Failures: ' + ep.consecutive_failures + '<br>';
html += '&nbsp;&nbsp;Last Request: ' + (ep.last_request || 'Never') + '<br>';
html += '&nbsp;&nbsp;Last Success: ' + (ep.last_success || 'Never') + '<br>';
} else {
html += '<strong>FastAPI Endpoint:</strong> Not configured<br>';
}
html += '</div>';
resultDiv.innerHTML = html;
} else {
resultDiv.innerHTML = '<div class="alert alert-danger">Error: ' + data.error + '</div>';
}
})
.catch(error => {
document.getElementById('api-status-result').innerHTML =
'<div class="alert alert-danger">Error checking status: ' + error.message + '</div>';
})
.finally(() => {
this.disabled = false;
this.textContent = 'Check API Client Status';
});
});
// Trigger manual API request
document.getElementById('trigger-api-request').addEventListener('click', function() {
this.disabled = true;
this.textContent = 'Triggering...';
fetch('/api/api-client/trigger', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({endpoint: 'fastapi_main'})
})
.then(response => response.json())
.then(data => {
const resultDiv = document.getElementById('api-status-result');
if (data.success) {
resultDiv.innerHTML = '<div class="alert alert-success">' + data.message + '</div>';
} else {
resultDiv.innerHTML = '<div class="alert alert-danger">Error: ' + data.error + '</div>';
}
})
.catch(error => {
document.getElementById('api-status-result').innerHTML =
'<div class="alert alert-danger">Error triggering request: ' + error.message + '</div>';
})
.finally(() => {
this.disabled = false;
this.textContent = 'Trigger Manual Request';
});
});
</script>
{% endblock %}
\ No newline at end of file
{% extends "base.html" %}
{% block title %}{{ page_title }}{% endblock %}
{% block content %}
<div class="container-fluid">
<div class="row">
<div class="col-12">
<!-- Navigation -->
<nav aria-label="breadcrumb">
<ol class="breadcrumb">
<li class="breadcrumb-item"><a href="{{ url_for('main.fixtures') }}">Fixtures</a></li>
<li class="breadcrumb-item active" aria-current="page">Match Details</li>
</ol>
</nav>
<div class="d-flex justify-content-between align-items-center mb-4">
<h1><i class="fas fa-boxing me-2"></i>Fixture Details</h1>
<a href="{{ url_for('main.fixtures') }}" class="btn btn-outline-secondary">
<i class="fas fa-arrow-left me-1"></i>Back to Fixtures
</a>
</div>
<!-- Loading Spinner -->
<div id="loading" class="text-center my-5">
<div class="spinner-border text-primary" role="status">
<span class="visually-hidden">Loading...</span>
</div>
<p class="mt-2">Loading fixture details...</p>
</div>
<!-- Error Message -->
<div id="error-message" class="alert alert-danger" style="display: none;">
<i class="fas fa-exclamation-triangle me-2"></i>
<span id="error-text"></span>
</div>
<!-- Fixture Details Content -->
<div id="fixture-content" style="display: none;">
<!-- Match Information Card -->
<div class="row">
<div class="col-lg-8">
<div class="card mb-4">
<div class="card-header">
<h5><i class="fas fa-info-circle me-2"></i>Match Information</h5>
</div>
<div class="card-body">
<div class="row">
<div class="col-md-6">
<table class="table table-borderless">
<tr>
<td><strong>Match Number:</strong></td>
<td><span class="badge bg-primary fs-6" id="match-number"></span></td>
</tr>
<tr>
<td><strong>Fighter 1:</strong></td>
<td><span class="fw-bold text-primary" id="fighter1"></span></td>
</tr>
<tr>
<td><strong>Fighter 2:</strong></td>
<td><span class="fw-bold text-primary" id="fighter2"></span></td>
</tr>
<tr>
<td><strong>Venue:</strong></td>
<td><span id="venue"></span></td>
</tr>
<tr>
<td><strong>Fixture ID:</strong></td>
<td><small class="text-muted" id="fixture-id"></small></td>
</tr>
</table>
</div>
<div class="col-md-6">
<table class="table table-borderless">
<tr>
<td><strong>Status:</strong></td>
<td><span id="status-badge"></span></td>
</tr>
<tr>
<td><strong>Active:</strong></td>
<td><span id="active-status"></span></td>
</tr>
<tr>
<td><strong>Start Time:</strong></td>
<td><span id="start-time" class="text-muted">Not set</span></td>
</tr>
<tr>
<td><strong>End Time:</strong></td>
<td><span id="end-time" class="text-muted">Not set</span></td>
</tr>
<tr>
<td><strong>Result:</strong></td>
<td><span id="result" class="text-muted">Not available</span></td>
</tr>
</table>
</div>
</div>
</div>
</div>
<!-- Match Outcomes Card -->
<div class="card mb-4">
<div class="card-header d-flex justify-content-between align-items-center">
<h5><i class="fas fa-chart-line me-2"></i>Match Outcomes</h5>
<span id="outcomes-count" class="badge bg-secondary">0 outcomes</span>
</div>
<div class="card-body">
<div id="no-outcomes" class="text-center text-muted py-4">
<i class="fas fa-chart-line fa-3x mb-3"></i>
<h5>No Outcomes Available</h5>
<p>No outcome data has been synchronized for this match yet.</p>
</div>
<div id="outcomes-table-container" style="display: none;">
<div class="table-responsive">
<table class="table table-hover">
<thead class="table-light">
<tr>
<th>Column Name</th>
<th>Value</th>
<th>Updated</th>
</tr>
</thead>
<tbody id="outcomes-tbody">
<!-- Outcomes will be loaded here -->
</tbody>
</table>
</div>
</div>
</div>
</div>
</div>
<!-- Side Panel -->
<div class="col-lg-4">
<!-- Upload Status Card -->
<div class="card mb-4">
<div class="card-header">
<h5><i class="fas fa-cloud-upload-alt me-2"></i>Upload Status</h5>
</div>
<div class="card-body">
<div class="mb-3">
<label class="form-label">ZIP Upload Status</label>
<div id="upload-status-badge"></div>
</div>
<div class="mb-3" id="progress-container" style="display: none;">
<label class="form-label">Upload Progress</label>
<div class="progress">
<div class="progress-bar" role="progressbar" id="progress-bar" style="width: 0%">0%</div>
</div>
</div>
<div class="mb-3" id="zip-file-info" style="display: none;">
<label class="form-label">ZIP File</label>
<div class="small text-muted">
<div><strong>Filename:</strong> <span id="zip-filename"></span></div>
<div><strong>SHA1:</strong> <span id="zip-sha1sum" class="font-monospace"></span></div>
</div>
</div>
</div>
</div>
<!-- File Information Card -->
<div class="card mb-4">
<div class="card-header">
<h5><i class="fas fa-file me-2"></i>File Information</h5>
</div>
<div class="card-body">
<div class="mb-3">
<label class="form-label">Fixture File</label>
<div class="small text-muted">
<div><strong>Filename:</strong> <span id="filename"></span></div>
<div><strong>SHA1:</strong> <span id="file-sha1sum" class="font-monospace"></span></div>
</div>
</div>
</div>
</div>
<!-- Timestamps Card -->
<div class="card">
<div class="card-header">
<h5><i class="fas fa-clock me-2"></i>Timestamps</h5>
</div>
<div class="card-body">
<div class="mb-2">
<label class="form-label small">Created</label>
<div class="text-muted small" id="created-at"></div>
</div>
<div>
<label class="form-label small">Last Updated</label>
<div class="text-muted small" id="updated-at"></div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
</div>
<script id="fixture-config" type="application/json">
{
"matchId": {{ match_id | tojson }}
}
</script>
<script>
const config = JSON.parse(document.getElementById('fixture-config').textContent);
const matchId = config.matchId;
// Load fixture details on page load
document.addEventListener('DOMContentLoaded', function() {
loadFixtureDetails();
});
function loadFixtureDetails() {
const loading = document.getElementById('loading');
const errorMessage = document.getElementById('error-message');
const content = document.getElementById('fixture-content');
loading.style.display = 'block';
errorMessage.style.display = 'none';
content.style.display = 'none';
fetch(`/api/fixtures/${matchId}`)
.then(response => response.json())
.then(data => {
if (data.success) {
renderFixtureDetails(data.match);
content.style.display = 'block';
} else {
showError(data.error || 'Failed to load fixture details');
}
})
.catch(error => {
console.error('Error:', error);
showError('Network error: ' + error.message);
})
.finally(() => {
loading.style.display = 'none';
});
}
function showError(message) {
document.getElementById('error-text').textContent = message;
document.getElementById('error-message').style.display = 'block';
}
function renderFixtureDetails(match) {
// Basic match information
document.getElementById('match-number').textContent = '#' + match.match_number;
document.getElementById('fighter1').textContent = match.fighter1_township;
document.getElementById('fighter2').textContent = match.fighter2_township;
document.getElementById('venue').textContent = match.venue_kampala_township;
document.getElementById('fixture-id').textContent = match.fixture_id;
// Status information
document.getElementById('status-badge').innerHTML = getStatusBadge(match);
document.getElementById('active-status').innerHTML = match.active_status
? '<span class="badge bg-success">Active</span>'
: '<span class="badge bg-secondary">Inactive</span>';
// Times and result
if (match.start_time) {
document.getElementById('start-time').textContent = new Date(match.start_time).toLocaleString();
document.getElementById('start-time').classList.remove('text-muted');
}
if (match.end_time) {
document.getElementById('end-time').textContent = new Date(match.end_time).toLocaleString();
document.getElementById('end-time').classList.remove('text-muted');
}
if (match.result) {
document.getElementById('result').textContent = match.result;
document.getElementById('result').classList.remove('text-muted');
}
// File information
document.getElementById('filename').textContent = match.filename;
document.getElementById('file-sha1sum').textContent = match.file_sha1sum;
// Upload status
renderUploadStatus(match);
// Outcomes
renderOutcomes(match.outcomes || []);
// Timestamps
document.getElementById('created-at').textContent = new Date(match.created_at).toLocaleString();
document.getElementById('updated-at').textContent = new Date(match.updated_at).toLocaleString();
}
function renderUploadStatus(match) {
const uploadBadge = document.getElementById('upload-status-badge');
const progressContainer = document.getElementById('progress-container');
const zipFileInfo = document.getElementById('zip-file-info');
uploadBadge.innerHTML = getUploadStatusBadge(match);
// Show progress bar if uploading
if (match.zip_upload_status === 'uploading') {
const progress = match.zip_upload_progress || 0;
const progressBar = document.getElementById('progress-bar');
progressBar.style.width = progress + '%';
progressBar.textContent = progress.toFixed(1) + '%';
progressContainer.style.display = 'block';
}
// Show ZIP file info if available
if (match.zip_filename) {
document.getElementById('zip-filename').textContent = match.zip_filename;
if (match.zip_sha1sum) {
document.getElementById('zip-sha1sum').textContent = match.zip_sha1sum;
}
zipFileInfo.style.display = 'block';
}
}
function renderOutcomes(outcomes) {
const noOutcomes = document.getElementById('no-outcomes');
const outcomesContainer = document.getElementById('outcomes-table-container');
const outcomesCount = document.getElementById('outcomes-count');
const tbody = document.getElementById('outcomes-tbody');
outcomesCount.textContent = outcomes.length + ' outcomes';
if (outcomes.length === 0) {
noOutcomes.style.display = 'block';
outcomesContainer.style.display = 'none';
return;
}
noOutcomes.style.display = 'none';
outcomesContainer.style.display = 'block';
tbody.innerHTML = '';
    const esc = value => String(value).replace(/[&<>"']/g, c => (
        {'&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;'}[c]));
    outcomes.forEach(outcome => {
        const row = document.createElement('tr');
        // Escape server-provided values before inserting them as HTML
        row.innerHTML = `
            <td><span class="badge bg-light text-dark">${esc(outcome.column_name)}</span></td>
            <td><strong class="text-primary">${esc(outcome.float_value)}</strong></td>
            <td><small class="text-muted">${new Date(outcome.updated_at).toLocaleString()}</small></td>
        `;
        tbody.appendChild(row);
    });
}
function getStatusBadge(fixture) {
if (fixture.done) {
return '<span class="badge bg-success"><i class="fas fa-check me-1"></i>Completed</span>';
} else if (fixture.running) {
return '<span class="badge bg-info"><i class="fas fa-play me-1"></i>Running</span>';
} else {
return '<span class="badge bg-warning"><i class="fas fa-clock me-1"></i>Pending</span>';
}
}
function getUploadStatusBadge(fixture) {
const status = fixture.zip_upload_status || 'pending';
const progress = fixture.zip_upload_progress || 0;
switch (status) {
case 'completed':
return '<span class="badge bg-success"><i class="fas fa-check me-1"></i>Completed</span>';
case 'uploading':
return `<span class="badge bg-info"><i class="fas fa-spinner fa-spin me-1"></i>Uploading (${progress.toFixed(1)}%)</span>`;
case 'failed':
return '<span class="badge bg-danger"><i class="fas fa-times me-1"></i>Failed</span>';
default:
return '<span class="badge bg-secondary"><i class="fas fa-clock me-1"></i>Pending</span>';
}
}
</script>
{% endblock %}
\ No newline at end of file
{% extends "base.html" %}
{% block title %}{{ page_title }}{% endblock %}
{% block content %}
<div class="container-fluid">
<div class="row">
<div class="col-12">
<div class="d-flex justify-content-between align-items-center mb-3">
<div>
<h1><i class="fas fa-list-ul me-2"></i>Fixtures</h1>
<p class="mb-0">View and manage synchronized boxing match fixtures.</p>
</div>
{% if current_user.is_admin %}
<div>
<button type="button" class="btn btn-danger" id="reset-fixtures-btn">
<i class="fas fa-trash-alt me-1"></i>Reset All Fixtures
</button>
</div>
{% endif %}
</div>
<!-- Filters and Search -->
<div class="card mb-4">
<div class="card-header">
<h5>Filter Options</h5>
</div>
<div class="card-body">
<div class="row">
<div class="col-md-3">
<label for="status-filter" class="form-label">Status</label>
<select class="form-select" id="status-filter">
<option value="">All Status</option>
<option value="pending">Pending</option>
<option value="running">Running</option>
<option value="done">Completed</option>
</select>
</div>
<div class="col-md-3">
<label for="upload-filter" class="form-label">Upload Status</label>
<select class="form-select" id="upload-filter">
<option value="">All Uploads</option>
<option value="pending">Pending</option>
<option value="uploading">Uploading</option>
<option value="completed">Completed</option>
<option value="failed">Failed</option>
</select>
</div>
<div class="col-md-4">
<label for="search-input" class="form-label">Search Fighters</label>
<input type="text" class="form-control" id="search-input" placeholder="Search by fighter names or venue">
</div>
<div class="col-md-2 d-flex align-items-end">
<button class="btn btn-primary" id="refresh-btn">
<i class="fas fa-sync-alt me-1"></i>Refresh
</button>
</div>
</div>
</div>
</div>
<!-- Loading Spinner -->
<div id="loading" class="text-center my-4" style="display: none;">
<div class="spinner-border text-primary" role="status">
<span class="visually-hidden">Loading...</span>
</div>
<p class="mt-2">Loading fixtures...</p>
</div>
<!-- Fixtures Summary Cards -->
<div class="row mb-4" id="summary-cards">
<div class="col-md-3">
<div class="card text-center">
<div class="card-body">
<h5 class="card-title text-primary">
<i class="fas fa-list me-2"></i>Total Fixtures
</h5>
<h3 id="total-count" class="text-primary">0</h3>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card text-center">
<div class="card-body">
<h5 class="card-title text-warning">
<i class="fas fa-clock me-2"></i>Pending
</h5>
<h3 id="pending-count" class="text-warning">0</h3>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card text-center">
<div class="card-body">
<h5 class="card-title text-info">
<i class="fas fa-play me-2"></i>Running
</h5>
<h3 id="running-count" class="text-info">0</h3>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card text-center">
<div class="card-body">
<h5 class="card-title text-success">
<i class="fas fa-check me-2"></i>Completed
</h5>
<h3 id="completed-count" class="text-success">0</h3>
</div>
</div>
</div>
</div>
<!-- Fixtures Table -->
<div class="card">
<div class="card-header d-flex justify-content-between align-items-center">
<h5>All Fixtures</h5>
<span id="filtered-count" class="badge bg-secondary">0 fixtures</span>
</div>
<div class="card-body p-0">
<div class="table-responsive">
<table class="table table-hover mb-0" id="fixtures-table">
<thead class="table-light">
<tr>
<th>Match #</th>
<th>Fighters</th>
<th>Venue</th>
<th>Status</th>
<th>Upload Status</th>
<th>Created</th>
<th>Actions</th>
</tr>
</thead>
<tbody id="fixtures-tbody">
<!-- Table content will be loaded here -->
</tbody>
</table>
</div>
</div>
</div>
<!-- Empty State -->
<div id="empty-state" class="text-center my-5" style="display: none;">
<i class="fas fa-inbox fa-4x text-muted mb-3"></i>
<h4 class="text-muted">No Fixtures Found</h4>
<p class="text-muted">No fixtures have been synchronized from the server yet. Check your API synchronization settings.</p>
</div>
</div>
</div>
</div>
<script>
let allFixtures = [];
// Load fixtures on page load
document.addEventListener('DOMContentLoaded', function() {
loadFixtures();
// Event listeners
document.getElementById('refresh-btn').addEventListener('click', loadFixtures);
document.getElementById('status-filter').addEventListener('change', filterFixtures);
document.getElementById('upload-filter').addEventListener('change', filterFixtures);
document.getElementById('search-input').addEventListener('input', filterFixtures);
// Reset fixtures button (admin only)
const resetBtn = document.getElementById('reset-fixtures-btn');
if (resetBtn) {
resetBtn.addEventListener('click', resetFixtures);
}
});
function loadFixtures() {
const loading = document.getElementById('loading');
const refreshBtn = document.getElementById('refresh-btn');
loading.style.display = 'block';
refreshBtn.disabled = true;
fetch('/api/fixtures')
.then(response => response.json())
.then(data => {
if (data.success) {
allFixtures = data.fixtures;
updateSummaryCards();
filterFixtures(); // This will also render the table
} else {
alert('Error loading fixtures: ' + (data.error || 'Unknown error'));
}
})
.catch(error => {
console.error('Error:', error);
alert('Failed to load fixtures: ' + error.message);
})
.finally(() => {
loading.style.display = 'none';
refreshBtn.disabled = false;
});
}
function updateSummaryCards() {
const totalCount = allFixtures.length;
const pendingCount = allFixtures.filter(f => !f.running && !f.done).length;
const runningCount = allFixtures.filter(f => f.running && !f.done).length;
const completedCount = allFixtures.filter(f => f.done).length;
document.getElementById('total-count').textContent = totalCount;
document.getElementById('pending-count').textContent = pendingCount;
document.getElementById('running-count').textContent = runningCount;
document.getElementById('completed-count').textContent = completedCount;
}
function filterFixtures() {
const statusFilter = document.getElementById('status-filter').value;
const uploadFilter = document.getElementById('upload-filter').value;
const searchTerm = document.getElementById('search-input').value.toLowerCase();
let filteredFixtures = allFixtures.filter(fixture => {
// Status filter
if (statusFilter) {
if (statusFilter === 'pending' && (fixture.running || fixture.done)) return false;
if (statusFilter === 'running' && (!fixture.running || fixture.done)) return false;
if (statusFilter === 'done' && !fixture.done) return false;
}
// Upload filter
if (uploadFilter && fixture.zip_upload_status !== uploadFilter) {
return false;
}
// Search filter
if (searchTerm) {
// Guard against missing fields so 'undefined' never leaks into the search text
const searchText = [fixture.fighter1_township, fixture.fighter2_township, fixture.venue_kampala_township]
.filter(Boolean).join(' ').toLowerCase();
if (!searchText.includes(searchTerm)) return false;
}
return true;
});
renderFixturesTable(filteredFixtures);
document.getElementById('filtered-count').textContent = filteredFixtures.length + ' fixtures';
// Show/hide empty state
const emptyState = document.getElementById('empty-state');
const fixturesTable = document.querySelector('.card .table-responsive').parentElement;
if (filteredFixtures.length === 0 && allFixtures.length === 0) {
emptyState.style.display = 'block';
fixturesTable.style.display = 'none';
} else {
emptyState.style.display = 'none';
fixturesTable.style.display = 'block';
}
}
function renderFixturesTable(fixtures) {
const tbody = document.getElementById('fixtures-tbody');
tbody.innerHTML = '';
fixtures.forEach(fixture => {
const row = document.createElement('tr');
row.innerHTML = `
<td><strong>#${fixture.match_number}</strong></td>
<td>
<div class="fw-bold">${fixture.fighter1_township}</div>
<small class="text-muted">vs</small>
<div class="fw-bold">${fixture.fighter2_township}</div>
</td>
<td>${fixture.venue_kampala_township}</td>
<td>${getStatusBadge(fixture)}</td>
<td>${getUploadStatusBadge(fixture)}</td>
<td>
<small class="text-muted">
${new Date(fixture.created_at).toLocaleString()}
</small>
</td>
<td>
<a href="/fixtures/${fixture.id}" class="btn btn-sm btn-outline-primary">
<i class="fas fa-eye me-1"></i>Details
</a>
</td>
`;
tbody.appendChild(row);
});
}
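// Hedged hardening sketch (not part of the original template): the rows above
// interpolate API-supplied fields (fighter names, venue) straight into
// innerHTML. If those fields can ever contain markup, escaping them first
// prevents HTML injection — e.g. wrap each interpolation as
// escapeHtml(fixture.fighter1_township).
function escapeHtml(value) {
// Order matters: '&' must be replaced first so entities are not double-escaped
return String(value)
.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;')
.replace(/"/g, '&quot;');
}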
function getStatusBadge(fixture) {
if (fixture.done) {
return '<span class="badge bg-success"><i class="fas fa-check me-1"></i>Completed</span>';
} else if (fixture.running) {
return '<span class="badge bg-info"><i class="fas fa-play me-1"></i>Running</span>';
} else {
return '<span class="badge bg-warning"><i class="fas fa-clock me-1"></i>Pending</span>';
}
}
function getUploadStatusBadge(fixture) {
const status = fixture.zip_upload_status || 'pending';
const progress = fixture.zip_upload_progress || 0;
switch (status) {
case 'completed':
return '<span class="badge bg-success"><i class="fas fa-check me-1"></i>Completed</span>';
case 'uploading':
return `<span class="badge bg-info"><i class="fas fa-spinner fa-spin me-1"></i>Uploading (${progress.toFixed(1)}%)</span>`;
case 'failed':
return '<span class="badge bg-danger"><i class="fas fa-times me-1"></i>Failed</span>';
default:
return '<span class="badge bg-secondary"><i class="fas fa-clock me-1"></i>Pending</span>';
}
}
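// Hedged sketch, assuming the server's switch to Unix-epoch timestamps
// (e.g. '1755773200' instead of an ISO datetime string, per the commit notes)
// may also apply to created_at: new Date('1755773200') yields an Invalid Date,
// so this hypothetical helper accepts both epoch seconds and ISO 8601 strings.
// Verify against the actual API payload before swapping it into the table above.
function formatTimestamp(value) {
// Purely numeric values are treated as Unix epoch seconds
if (/^\d+$/.test(String(value))) {
return new Date(Number(value) * 1000).toLocaleString();
}
// Anything else (e.g. '2025-08-21T14:31:03') is parsed as-is
return new Date(value).toLocaleString();
}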
function resetFixtures() {
const confirmMessage = 'WARNING: This will permanently delete ALL fixture data including:\n\n' +
'• All synchronized matches and outcomes\n' +
'• All downloaded ZIP files\n' +
'• This action cannot be undone!\n\n' +
'Are you sure you want to reset all fixtures data?';
if (!confirm(confirmMessage)) {
return;
}
const resetBtn = document.getElementById('reset-fixtures-btn');
const originalText = resetBtn.innerHTML;
resetBtn.disabled = true;
resetBtn.innerHTML = '<i class="fas fa-spinner fa-spin me-1"></i>Resetting...';
fetch('/api/fixtures/reset', {
method: 'POST',
headers: {
'Content-Type': 'application/json',
}
})
.then(response => response.json())
.then(data => {
if (data.success) {
alert(`Fixtures reset successfully!\n\nRemoved:\n• ${data.removed.matches} matches\n• ${data.removed.outcomes} outcomes\n• ${data.removed.zip_files} ZIP files`);
// Reload fixtures to show empty state
loadFixtures();
} else {
alert('Error resetting fixtures: ' + (data.error || 'Unknown error'));
}
})
.catch(error => {
console.error('Error:', error);
alert('Failed to reset fixtures: ' + error.message);
})
.finally(() => {
resetBtn.disabled = false;
resetBtn.innerHTML = originalText;
});
}
</script>
{% endblock %}