6 Commits

Author  SHA1        Message                                                Date
Javier  2a649fdbcc  V0.15.0 - Not done yet                                 2026-02-01 16:22:59 -06:00
Javier  89be88566f  Excel Template working better, still not finished.     2026-02-01 01:35:02 -06:00
Javier  1359e036d5  Update                                                 2026-01-31 22:20:10 -06:00
Javier  ad071438cc  Merge branch 'Refractor--Changing-how-counting-works'  2026-01-31 20:32:18 -06:00
Javier  5604686630  update: added files to gitignore                       2026-01-31 20:30:12 -06:00
Javier  2d333c16a3  v0.14.0 - Major Logic Overhaul & Real-Time Dashboard   2026-01-31 19:17:36 -06:00
    - Logic: Implemented "One User, One Bin" locking to prevent duplicate counting.
    - Integrity: Standardized is_deleted = 0 filtering and tightened "Matched" criteria to require zero weight variance.
    - Refresh: Added silent 30-second dashboard polling for all 6 status categories and the active counter list.
    - Tracking: Built user-specific activity tracking to identify who is counting where in real time.
    - Stability: Resolved persistent 500 errors by finalizing the active-counters-fragment structure.
19 changed files with 827 additions and 325 deletions
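The "One User, One Bin" locking called out in the v0.14.0 message amounts to a check-before-insert against the in-progress row. A standalone sketch under the schema visible in the diffs below (`acquire_bin` itself is an invented helper, not a function in the codebase):

```python
import sqlite3

def acquire_bin(conn, session_id, location_name, user_id):
    """Claim a bin for counting: resume it if this user already holds it,
    refuse it if another user has it in progress, otherwise create it."""
    row = conn.execute(
        """SELECT location_count_id, counted_by FROM LocationCounts
           WHERE session_id = ? AND location_name = ?
             AND status = 'in_progress' AND is_deleted = 0""",
        (session_id, location_name)).fetchone()
    if row is None:
        cur = conn.execute(
            """INSERT INTO LocationCounts
               (session_id, location_name, counted_by, status, is_deleted)
               VALUES (?, ?, ?, 'in_progress', 0)""",
            (session_id, location_name, user_id))
        return cur.lastrowid, 'started'
    if row[1] == user_id:
        return row[0], 'resumed'
    return None, 'locked'  # someone else is already counting this bin
```

With a second user asking for the same bin, the lock check returns `'locked'` instead of creating a duplicate count.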


@@ -4,20 +4,7 @@ You are helping build a project called **Scanlook**.
## Scanlook (current product summary)
Scanlook is a web app for warehouse counting workflows.
- Admin creates a **Count Session** (e.g., “Jan 24 2026 - First Shift”) and uploads a **Master Inventory list**.
- Staff select the active Count Session, enter a **Location/BIN**, and the app shows the **Expected** lots/items/weights that should be there (Cycle Count mode).
- Staff **scan lot numbers**, enter **weights**, and each scan moves from **Expected → Scanned**.
- System flags:
- duplicates
- wrong location
- “ghost” lots (physically found but not in system/master list)
- Staff can **Finalize** a BIN; once finalized, it should clearly report **missing items/lots**.
- Admin sees live progress in an **Admin Dashboard**.
- Multiple Count Sessions can exist even on the same day (e.g., First Shift vs Second Shift) and must be completely isolated.
There are two types of counts:
1) **Cycle Count**: shows Expected list for the BIN.
2) **Physical Inventory**: same workflow but **blind** (does NOT show Expected list; only scanned results, then missing is determined after).
Scanlook is modular.
Long-term goal: evolve into a WMS, but right now focus on making this workflow reliable.
@@ -44,7 +31,7 @@ Long-term goal: evolve into a WMS, but right now focus on making this workflow r
## Scanlook (current product summary)
Scanlook is a web app for warehouse counting workflows built with Flask + SQLite.
-**Current Version:** 0.13.0
+**Current Version:** 0.14.0
**Tech Stack:**
- Backend: Python/Flask, raw SQL (no ORM), openpyxl (Excel file generation)
@@ -79,13 +66,9 @@ Scanlook is a web app for warehouse counting workflows built with Flask + SQLite
- Consumption Sheets module (production lot tracking with Excel export)
- Database migration system (auto-applies schema changes on startup)
**Two count types:**
1. Cycle Count: shows Expected list for the BIN
2. Physical Inventory: blind count (no Expected list shown)
**Long-term goal:** Modular WMS with future modules for Shipping, Receiving, Transfers, Production.
-**Module System (v0.13.0):**
+**Module System (v0.14.0):**
- Modules table defines available modules (module_key used for routing)
- UserModules table tracks per-user access
- Home page (/home) shows module cards based on user's access
@@ -104,5 +87,5 @@ Scanlook is a web app for warehouse counting workflows built with Flask + SQLite
- Scanner viewport: 320px wide (MC9300)
- Mobile breakpoint: 360-767px
- Desktop: 768px+
-- Git remote: http://10.44.44.33:3000/stuff/ScanLook.git
+- Git remote: https://tsngit.tsnx.net/stuff/ScanLook.git
- Docker registry: 10.44.44.33:3000/stuff/scanlook

app.py (2 lines changed)

@@ -38,7 +38,7 @@ app.config['PERMANENT_SESSION_LIFETIME'] = timedelta(hours=1)
# 1. Define the version
-APP_VERSION = '0.13.2'
+APP_VERSION = '0.15.0'
# 2. Inject it into all templates automatically
@app.context_processor


@@ -29,35 +29,6 @@ def reopen_location(location_count_id):
    return jsonify({'success': True, 'message': 'Bin reopened for counting'})

@admin_locations_bp.route('/location/<int:location_count_id>/delete', methods=['POST'])
@login_required
def delete_location_count(location_count_id):
    """Delete all counts for a location (soft delete)"""
    # Verify ownership
    loc = query_db('SELECT * FROM LocationCounts WHERE location_count_id = ?', [location_count_id], one=True)
    if not loc:
        return jsonify({'success': False, 'message': 'Location not found'})
    if loc['counted_by'] != session['user_id'] and session['role'] not in ['owner', 'admin']:
        return jsonify({'success': False, 'message': 'Permission denied'})

    # Soft delete all scan entries for this location
    execute_db('''
        UPDATE ScanEntries
        SET is_deleted = 1
        WHERE location_count_id = ?
    ''', [location_count_id])

    # Delete the location count record
    execute_db('''
        DELETE FROM LocationCounts
        WHERE location_count_id = ?
    ''', [location_count_id])

    return jsonify({'success': True, 'message': 'Bin count deleted'})
@admin_locations_bp.route('/location/<int:location_count_id>/scans')
@login_required
def get_location_scans(location_count_id):
@@ -87,3 +58,39 @@ def get_location_scans(location_count_id):
    except Exception as e:
        return jsonify({'success': False, 'message': str(e)})
@admin_locations_bp.route('/location/<int:location_count_id>/delete', methods=['POST'])
@login_required
def soft_delete_location(location_count_id):
    """Admin-only: Soft delete a bin count and its associated data"""
    if session.get('role') not in ['owner', 'admin']:
        return jsonify({'success': False, 'message': 'Admin role required'}), 403

    # 1. Verify location exists
    loc = query_db('SELECT session_id, location_name FROM LocationCounts WHERE location_count_id = ?',
                   [location_count_id], one=True)
    if not loc:
        return jsonify({'success': False, 'message': 'Location not found'})

    # 2. Soft delete the bin count itself
    execute_db('''
        UPDATE LocationCounts
        SET is_deleted = 1
        WHERE location_count_id = ?
    ''', [location_count_id])

    # 3. Soft delete all scans in that bin
    execute_db('''
        UPDATE ScanEntries
        SET is_deleted = 1
        WHERE location_count_id = ?
    ''', [location_count_id])

    # 4. Remove any MissingLots records generated for this bin
    execute_db('''
        DELETE FROM MissingLots
        WHERE session_id = ? AND master_expected_location = ?
    ''', [loc['session_id'], loc['location_name']])

    return jsonify({'success': True, 'message': 'Bin count and associated data soft-deleted'})
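The route above relies on the `is_deleted` convention the v0.14.0 notes standardize: a soft-deleted row survives for auditing, and every read must filter it out. A minimal illustration with a hypothetical two-row fixture (not the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE ScanEntries (
        entry_id   INTEGER PRIMARY KEY,
        lot_number TEXT,
        is_deleted INTEGER DEFAULT 0
    );
    INSERT INTO ScanEntries (lot_number) VALUES ('LOT-1'), ('LOT-2');
""")
# Soft delete: the row stays in the table, but reads exclude it.
conn.execute("UPDATE ScanEntries SET is_deleted = 1 WHERE lot_number = 'LOT-1'")
visible = [r[0] for r in conn.execute(
    "SELECT lot_number FROM ScanEntries WHERE is_deleted = 0")]
total = conn.execute("SELECT COUNT(*) FROM ScanEntries").fetchone()[0]
```

The trade-off is that any query that forgets the `is_deleted = 0` predicate silently resurrects deleted scans, which is why several WHERE clauses in the counting diffs below gain it.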


@@ -8,18 +8,23 @@ cons_sheets_bp = Blueprint('cons_sheets', __name__)
@cons_sheets_bp.route('/admin/consumption-sheets')
@role_required('owner', 'admin')
def admin_processes():
"""List all consumption sheet process types"""
processes = query_db('''
SELECT cp.*, u.full_name as created_by_name,
(SELECT COUNT(*) FROM cons_process_fields
WHERE process_id = cp.id AND is_active = 1) as field_count
FROM cons_processes cp
LEFT JOIN Users u ON cp.created_by = u.user_id
WHERE cp.is_active = 1
ORDER BY cp.process_name
''')
"""List all consumption sheet process types (Active or Archived)"""
show_archived = request.args.get('archived') == '1'
is_active_val = 0 if show_archived else 1
return render_template('cons_sheets/admin_processes.html', processes=processes)
processes = query_db('''
SELECT cp.*,
u.full_name as created_by_name,
(SELECT COUNT(*) FROM cons_process_fields WHERE process_id = cp.id) as field_count
FROM cons_processes cp
LEFT JOIN users u ON cp.created_by = u.user_id
WHERE cp.is_active = ?
ORDER BY cp.process_name ASC
''', [is_active_val])
return render_template('cons_sheets/admin_processes.html',
processes=processes,
showing_archived=show_archived)
@cons_sheets_bp.route('/admin/consumption-sheets/create', methods=['GET', 'POST'])
@@ -144,6 +149,36 @@ def rename_column_in_detail_table(process_key, old_name, new_name):
conn.close()
@cons_sheets_bp.route('/admin/consumption-sheets/<int:process_id>/delete', methods=['POST'])
@role_required('owner', 'admin')
def delete_process(process_id):
    """Soft-delete a process type (Archive it)"""
    # Check if process exists
    process = query_db('SELECT * FROM cons_processes WHERE id = ?', [process_id], one=True)
    if not process:
        flash('Process not found', 'danger')
        return redirect(url_for('cons_sheets.admin_processes'))

    # Soft delete: set is_active = 0.
    # The admin_processes route filters on is_active,
    # so this hides the process from the active list.
    execute_db('UPDATE cons_processes SET is_active = 0 WHERE id = ?', [process_id])
    flash(f'Process "{process["process_name"]}" has been deleted.', 'success')
    return redirect(url_for('cons_sheets.admin_processes'))

@cons_sheets_bp.route('/admin/consumption-sheets/<int:process_id>/restore', methods=['POST'])
@role_required('owner', 'admin')
def restore_process(process_id):
    """Restore a soft-deleted process type"""
    execute_db('UPDATE cons_processes SET is_active = 1 WHERE id = ?', [process_id])
    flash('Process has been restored.', 'success')
    return redirect(url_for('cons_sheets.admin_processes', archived=1))
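The Active/Archived toggle these routes pair with reduces to mapping the `?archived=1` query parameter onto the `is_active` value the list query binds. Just that mapping, as a sketch (the helper name is invented for illustration):

```python
def is_active_filter(args):
    """?archived=1 lists soft-deleted (is_active = 0) processes;
    anything else lists active (is_active = 1) ones."""
    return 0 if args.get('archived') == '1' else 1
```

Because restore redirects with `archived=1`, the admin lands back on the archived list and can confirm the process has left it.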
@cons_sheets_bp.route('/admin/consumption-sheets/<int:process_id>')
@role_required('owner', 'admin')
def process_detail(process_id):
@@ -284,24 +319,35 @@ def update_template_settings(process_id):
rows_per_page = request.form.get('rows_per_page', 30)
detail_start_row = request.form.get('detail_start_row', 10)
page_height = request.form.get('page_height')
print_start_col = request.form.get('print_start_col', 'A').strip().upper()
print_end_col = request.form.get('print_end_col', '').strip().upper()
try:
rows_per_page = int(rows_per_page)
detail_start_row = int(detail_start_row)
# We enforce page_height is required now
page_height = int(page_height) if page_height and page_height.strip() else None
if not page_height:
flash('Page Height is required for the new strategy', 'danger')
return redirect(url_for('cons_sheets.process_template', process_id=process_id))
except ValueError:
flash('Invalid number values', 'danger')
return redirect(url_for('cons_sheets.process_template', process_id=process_id))
# Update query - We ignore detail_end_row (leave it as is or null)
execute_db('''
UPDATE cons_processes
SET rows_per_page = ?, detail_start_row = ?
SET rows_per_page = ?, detail_start_row = ?, page_height = ?,
print_start_col = ?, print_end_col = ?
WHERE id = ?
''', [rows_per_page, detail_start_row, process_id])
''', [rows_per_page, detail_start_row, page_height, print_start_col, print_end_col, process_id])
flash('Settings updated successfully!', 'success')
return redirect(url_for('cons_sheets.process_template', process_id=process_id))
@cons_sheets_bp.route('/admin/consumption-sheets/<int:process_id>/template/download')
@role_required('owner', 'admin')
def download_template(process_id):
@@ -905,22 +951,59 @@ def archive_session(session_id):
return jsonify({'success': True})
# --- BULK IMPORT ROUTES ---
@cons_sheets_bp.route('/cons-sheets/session/<int:session_id>/export')
@cons_sheets_bp.route('/cons-sheets/session/<int:session_id>/template')
@login_required
def export_session(session_id):
"""Export session to Excel using the process template"""
from flask import Response
def download_import_template(session_id):
"""Generate a blank Excel template for bulk import"""
from flask import Response # <--- ADDED THIS
from io import BytesIO
import openpyxl
from openpyxl.utils import get_column_letter, column_index_from_string
from copy import copy
from datetime import datetime
# Get session with process info
# Get Process ID
sess = query_db('SELECT process_id FROM cons_sessions WHERE id = ?', [session_id], one=True)
if not sess: return redirect(url_for('cons_sheets.index'))
# Get Detail Fields
fields = query_db('''
SELECT field_name, field_label
FROM cons_process_fields
WHERE process_id = ? AND table_type = 'detail' AND is_active = 1
ORDER BY sort_order
''', [sess['process_id']])
# Create Workbook
wb = openpyxl.Workbook()
ws = wb.active
ws.title = "Import Data"
# Write Header Row (Field Names)
headers = [f['field_name'] for f in fields]
ws.append(headers)
output = BytesIO()
wb.save(output)
output.seek(0)
return Response(
output.getvalue(),
mimetype='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
headers={'Content-Disposition': 'attachment; filename=import_template.xlsx'}
)
@cons_sheets_bp.route('/cons-sheets/session/<int:session_id>/import', methods=['POST'])
@login_required
def import_session_data(session_id):
"""Bulk import detail rows from Excel"""
# Import EVERYTHING locally to avoid NameErrors
import openpyxl
from datetime import datetime
from flask import request, flash, redirect, url_for, session
# 1. Get Session Info
sess = query_db('''
SELECT cs.*, cp.process_name, cp.process_key, cp.id as process_id,
cp.template_file, cp.template_filename, cp.rows_per_page, cp.detail_start_row
SELECT cs.*, cp.process_key
FROM cons_sessions cs
JOIN cons_processes cp ON cs.process_id = cp.id
WHERE cs.id = ?
@@ -930,11 +1013,124 @@ def export_session(session_id):
flash('Session not found', 'danger')
return redirect(url_for('cons_sheets.index'))
if not sess['template_file']:
flash('No template configured for this process', 'danger')
# 2. Check File
if 'file' not in request.files:
flash('No file uploaded', 'danger')
return redirect(url_for('cons_sheets.scan_session', session_id=session_id))
# Get header fields and values
file = request.files['file']
if file.filename == '':
flash('No file selected', 'danger')
return redirect(url_for('cons_sheets.scan_session', session_id=session_id))
try:
# 3. Read Excel
wb = openpyxl.load_workbook(file)
ws = wb.active
# Get headers from first row
headers = [cell.value for cell in ws[1]]
# Get valid field names for this process
valid_fields = query_db('''
SELECT field_name
FROM cons_process_fields
WHERE process_id = ? AND table_type = 'detail' AND is_active = 1
''', [sess['process_id']])
valid_field_names = [f['field_name'] for f in valid_fields]
# Map Excel Columns to DB Fields
col_mapping = {}
for idx, header in enumerate(headers):
if header and header in valid_field_names:
col_mapping[idx] = header
if not col_mapping:
flash('Error: No matching columns found in Excel. Please use the template.', 'danger')
return redirect(url_for('cons_sheets.scan_session', session_id=session_id))
# 4. Process Rows
table_name = f"cons_proc_{sess['process_key']}_details"
rows_inserted = 0
# Get User ID safely from session
user_id = session.get('user_id')
for row in ws.iter_rows(min_row=2, values_only=True):
if not any(row): continue
data = {}
for col_idx, value in enumerate(row):
if col_idx in col_mapping:
data[col_mapping[col_idx]] = value
if not data: continue
# Add Metadata
data['session_id'] = session_id
data['scanned_at'] = datetime.now().strftime('%Y-%m-%d %H:%M:%S')
data['scanned_by'] = user_id
# REMOVED: data['is_valid'] = 1 (This column does not exist)
data['is_deleted'] = 0
# Dynamic Insert SQL
columns = ', '.join(data.keys())
placeholders = ', '.join(['?'] * len(data))
values = list(data.values())
sql = f"INSERT INTO {table_name} ({columns}) VALUES ({placeholders})"
execute_db(sql, values)
rows_inserted += 1
flash(f'Successfully imported {rows_inserted} records!', 'success')
except Exception as e:
# This will catch any other errors and show them to you
flash(f'Import Error: {str(e)}', 'danger')
print(f"DEBUG IMPORT ERROR: {str(e)}") # Print to console for good measure
return redirect(url_for('cons_sheets.scan_session', session_id=session_id))
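The bulk import above builds its INSERT dynamically from whichever template columns matched. A standalone sketch of that technique (table and column names are illustrative); note that the table name and dict keys must come from a trusted whitelist, since identifiers cannot be bound as `?` parameters:

```python
import sqlite3

def dynamic_insert(conn, table, data):
    """Insert a dict as one row: keys become columns, values are bound.
    'table' and the keys of 'data' must be pre-validated identifiers."""
    columns = ', '.join(data.keys())
    placeholders = ', '.join(['?'] * len(data))
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    return conn.execute(sql, list(data.values()))
```

In the route, the whitelist is `valid_field_names` from `cons_process_fields`, which is what keeps arbitrary Excel headers out of the generated SQL.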
@cons_sheets_bp.route('/cons-sheets/session/<int:session_id>/export')
@login_required
def export_session(session_id):
"""Export session: Hide Rows Strategy + Manual Column Widths"""
from flask import Response
from io import BytesIO
import openpyxl
# Correct imports for newer openpyxl
from openpyxl.utils.cell import coordinate_from_string, get_column_letter
from openpyxl.worksheet.pagebreak import Break
from datetime import datetime
import math
# --- FIX 1: Update SQL to fetch the new columns ---
sess = query_db('''
SELECT cs.*, cp.process_name, cp.process_key, cp.id as process_id,
cp.template_file, cp.template_filename,
cp.rows_per_page, cp.detail_start_row, cp.page_height,
cp.print_start_col, cp.print_end_col
FROM cons_sessions cs
JOIN cons_processes cp ON cs.process_id = cp.id
WHERE cs.id = ?
''', [session_id], one=True)
if not sess or not sess['template_file']:
flash('Session or Template not found', 'danger')
return redirect(url_for('cons_sheets.index'))
# Validation
page_height = sess['page_height']
rows_per_page = sess['rows_per_page'] or 30
detail_start_row = sess['detail_start_row'] or 10
if not page_height:
flash('Configuration Error: Page Height is not set.', 'danger')
return redirect(url_for('cons_sheets.scan_session', session_id=session_id))
# Get Data
header_fields = query_db('''
SELECT cpf.field_name, cpf.excel_cell, cshv.field_value
FROM cons_process_fields cpf
@@ -942,7 +1138,6 @@ def export_session(session_id):
WHERE cpf.process_id = ? AND cpf.table_type = 'header' AND cpf.is_active = 1 AND cpf.excel_cell IS NOT NULL
''', [session_id, sess['process_id']])
# Get detail fields with their column mappings
detail_fields = query_db('''
SELECT field_name, excel_cell, field_type
FROM cons_process_fields
@@ -950,169 +1145,94 @@ def export_session(session_id):
ORDER BY sort_order, id
''', [sess['process_id']])
# Get all scanned details
table_name = get_detail_table_name(sess['process_key'])
table_name = f'cons_proc_{sess["process_key"]}_details'
scans = query_db(f'''
SELECT * FROM {table_name}
WHERE session_id = ? AND is_deleted = 0
ORDER BY scanned_at ASC
''', [session_id])
# Load the template
template_bytes = BytesIO(sess['template_file'])
wb = openpyxl.load_workbook(template_bytes)
# Setup Excel
wb = openpyxl.load_workbook(BytesIO(sess['template_file']))
ws = wb.active
rows_per_page = sess['rows_per_page'] or 30
detail_start_row = sess['detail_start_row'] or 11
# Clear existing breaks
ws.row_breaks.brk = []
ws.col_breaks.brk = []
# Calculate how many pages we need
total_scans = len(scans) if scans else 0
num_pages = max(1, (total_scans + rows_per_page - 1) // rows_per_page) if total_scans > 0 else 1
# Calculate Pages Needed
total_items = len(scans)
total_pages = math.ceil(total_items / rows_per_page) if total_items > 0 else 1
# Helper function to fill header values on a sheet
def fill_header(worksheet, header_fields):
# --- MAIN LOOP ---
for page_idx in range(total_pages):
# 1. Fill Header
for field in header_fields:
if field['excel_cell'] and field['field_value']:
try:
worksheet[field['excel_cell']] = field['field_value']
except:
pass # Skip invalid cell references
col_letter, row_str = coordinate_from_string(field['excel_cell'])
base_row = int(row_str)
target_row = base_row + (page_idx * page_height)
ws[f"{col_letter}{target_row}"] = field['field_value']
except: pass
# Helper function to clear detail rows on a sheet
def clear_details(worksheet, detail_fields, start_row, num_rows):
for i in range(num_rows):
row_num = start_row + i
for field in detail_fields:
if field['excel_cell']:
try:
col_letter = field['excel_cell'].upper().strip()
cell_ref = f"{col_letter}{row_num}"
worksheet[cell_ref] = None
except:
pass
# Helper function to fill detail rows on a sheet
def fill_details(worksheet, scans_subset, detail_fields, start_row):
for i, scan in enumerate(scans_subset):
row_num = start_row + i
for field in detail_fields:
if field['excel_cell']:
try:
col_letter = field['excel_cell'].upper().strip()
cell_ref = f"{col_letter}{row_num}"
value = scan[field['field_name']]
# Convert to appropriate type
if field['field_type'] == 'REAL' and value:
value = float(value)
elif field['field_type'] == 'INTEGER' and value:
value = int(value)
worksheet[cell_ref] = value
except Exception as e:
print(f"Error filling cell: {e}")
# Fill the first page
fill_header(ws, header_fields)
first_page_scans = scans[:rows_per_page] if scans else []
fill_details(ws, first_page_scans, detail_fields, detail_start_row)
# Create additional pages if needed
for page_num in range(2, num_pages + 1):
# Copy the worksheet within the same workbook
new_ws = wb.copy_worksheet(ws)
new_ws.title = f"Page {page_num}"
# Clear detail rows (they have Page 1 data)
clear_details(new_ws, detail_fields, detail_start_row, rows_per_page)
# Fill details for this page
start_idx = (page_num - 1) * rows_per_page
# 2. Fill Details
start_idx = page_idx * rows_per_page
end_idx = start_idx + rows_per_page
page_scans = scans[start_idx:end_idx]
fill_details(new_ws, page_scans, detail_fields, detail_start_row)
# Rename first sheet if we have multiple pages
if num_pages > 1:
ws.title = "Page 1"
for i, scan in enumerate(page_scans):
target_row = detail_start_row + (page_idx * page_height) + i
for field in detail_fields:
if field['excel_cell']:
try:
col_letter = field['excel_cell'].upper().strip()
cell_ref = f"{col_letter}{target_row}"
value = scan[field['field_name']]
if field['field_type'] == 'REAL' and value: value = float(value)
elif field['field_type'] == 'INTEGER' and value: value = int(value)
ws[cell_ref] = value
except: pass
# Save to BytesIO
# 3. Force Page Break (BEFORE the new header)
if page_idx < total_pages - 1:
next_page_start_row = ((page_idx + 1) * page_height) # No +1 here!
ws.row_breaks.append(Break(id=next_page_start_row))
# --- STEP 3: CLEANUP (Hide Unused Rows) ---
last_used_row = (total_pages * page_height)
SAFE_MAX_ROW = 5000
for row_num in range(last_used_row + 1, SAFE_MAX_ROW):
ws.row_dimensions[row_num].hidden = True
# --- FINAL POLISH (Manual Widths) ---
# --- FIX 2: Use bracket notation (sess['col']) instead of .get() ---
# We use 'or' to provide defaults if the DB value is None
start_col = sess['print_start_col'] or 'A'
if sess['print_end_col']:
end_col = sess['print_end_col']
else:
# Fallback to auto-detection if user left it blank
end_col = get_column_letter(ws.max_column)
# Set Print Area
ws.print_area = f"{start_col}1:{end_col}{last_used_row}"
if ws.sheet_properties.pageSetUpPr:
ws.sheet_properties.pageSetUpPr.fitToPage = False
# Save
output = BytesIO()
wb.save(output)
output.seek(0)
# Generate filename
timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')
base_filename = f"{sess['process_key']}_{session_id}_{timestamp}"
# Check if PDF export is requested
export_format = request.args.get('format', 'xlsx')
print(f"DEBUG: Export format requested: {export_format}")
if export_format == 'pdf':
# Use win32com to convert to PDF (requires Excel installed)
try:
import tempfile
import pythoncom
import win32com.client as win32
print("DEBUG: pywin32 imported successfully")
# Save Excel to temp file
temp_xlsx = tempfile.NamedTemporaryFile(suffix='.xlsx', delete=False)
temp_xlsx.write(output.getvalue())
temp_xlsx.close()
print(f"DEBUG: Temp Excel saved to: {temp_xlsx.name}")
temp_pdf = temp_xlsx.name.replace('.xlsx', '.pdf')
# Initialize COM for this thread
pythoncom.CoInitialize()
print("DEBUG: COM initialized")
try:
excel = win32.Dispatch('Excel.Application')
excel.Visible = False
excel.DisplayAlerts = False
print("DEBUG: Excel application started")
workbook = excel.Workbooks.Open(temp_xlsx.name)
print("DEBUG: Workbook opened")
workbook.ExportAsFixedFormat(0, temp_pdf) # 0 = PDF format
print(f"DEBUG: Exported to PDF: {temp_pdf}")
workbook.Close(False)
excel.Quit()
print("DEBUG: Excel closed")
finally:
pythoncom.CoUninitialize()
# Read the PDF
with open(temp_pdf, 'rb') as f:
pdf_data = f.read()
print(f"DEBUG: PDF read, size: {len(pdf_data)} bytes")
# Clean up temp files
import os
os.unlink(temp_xlsx.name)
os.unlink(temp_pdf)
print("DEBUG: Temp files cleaned up")
return Response(
pdf_data,
mimetype='application/pdf',
headers={'Content-Disposition': f'attachment; filename={base_filename}.pdf'}
)
except ImportError as e:
print(f"ERROR: Import failed - {e}")
# Fall back to Excel export
except Exception as e:
print(f"ERROR: PDF export failed - {e}")
import traceback
traceback.print_exc()
# Fall back to Excel export
# Default: return Excel file
print("DEBUG: Returning Excel file")
return Response(
output.getvalue(),
mimetype='application/vnd.openxmlformats-officedocument.spreadsheetml.sheet',
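The hide-rows export strategy above stacks every page on one worksheet: page p occupies rows p*page_height+1 through (p+1)*page_height, details land at detail_start_row within each slice, and a manual row break marks each boundary before unused rows are hidden. The arithmetic can be isolated as a sketch (`page_layout` is not a function in the codebase):

```python
import math

def page_layout(total_items, rows_per_page, detail_start_row, page_height):
    """Row placement for the single-sheet 'hide rows' export strategy."""
    total_pages = math.ceil(total_items / rows_per_page) if total_items else 1
    # Worksheet row for each detail item
    detail_rows = [detail_start_row + (i // rows_per_page) * page_height
                   + (i % rows_per_page)
                   for i in range(total_items)]
    # Manual break lands just before each new page's header
    breaks = [(p + 1) * page_height for p in range(total_pages - 1)]
    first_hidden_row = total_pages * page_height + 1
    return total_pages, detail_rows, breaks, first_hidden_row
```

For example, 65 scans at 30 rows per page with page_height 40 and detail_start_row 10 need 3 pages, with breaks at rows 40 and 80 and everything from row 121 down hidden.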


@@ -116,11 +116,18 @@ def my_counts(session_id):
FROM LocationCounts lc
LEFT JOIN ScanEntries se ON lc.location_count_id = se.location_count_id AND se.is_deleted = 0
WHERE lc.session_id = ?
AND lc.counted_by = ?
AND lc.status = 'in_progress'
AND lc.is_deleted = 0
AND (
lc.counted_by = ?
OR lc.location_count_id IN (
SELECT location_count_id FROM ScanEntries
WHERE scanned_by = ? AND is_deleted = 0
)
)
GROUP BY lc.location_count_id
ORDER BY lc.start_timestamp DESC
''', [session_id, session['user_id']])
''', [session_id, session['user_id'], session['user_id']])
# Get this user's completed bins
completed_bins = query_db('''
@@ -129,11 +136,17 @@ def my_counts(session_id):
FROM LocationCounts lc
LEFT JOIN ScanEntries se ON lc.location_count_id = se.location_count_id AND se.is_deleted = 0
WHERE lc.session_id = ?
AND lc.counted_by = ?
AND lc.status = 'completed'
AND (
lc.counted_by = ?
OR lc.location_count_id IN (
SELECT location_count_id FROM ScanEntries
WHERE scanned_by = ? AND is_deleted = 0
)
)
GROUP BY lc.location_count_id
ORDER BY lc.end_timestamp DESC
''', [session_id, session['user_id']])
ORDER BY lc.start_timestamp DESC
''', [session_id, session['user_id'], session['user_id']])
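The widened filter in these two queries returns bins the user owns plus bins they merely scanned in. The same OR-with-subquery shape in isolation, against a hypothetical two-bin fixture:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE LocationCounts (location_count_id INTEGER PRIMARY KEY, counted_by INTEGER);
    CREATE TABLE ScanEntries (location_count_id INTEGER, scanned_by INTEGER, is_deleted INTEGER DEFAULT 0);
    INSERT INTO LocationCounts (counted_by) VALUES (1), (2);
    INSERT INTO ScanEntries VALUES (2, 1, 0);  -- user 1 scanned inside user 2's bin
""")
mine = [r[0] for r in conn.execute("""
    SELECT lc.location_count_id FROM LocationCounts lc
    WHERE lc.counted_by = ?
       OR lc.location_count_id IN (
            SELECT location_count_id FROM ScanEntries
            WHERE scanned_by = ? AND is_deleted = 0)
""", (1, 1))]
```

User 1 now sees both bins on My Counts, which is why the parameter list grows to bind `session['user_id']` twice.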
return render_template('counts/my_counts.html',
count_session=sess,
@@ -144,7 +157,7 @@ def my_counts(session_id):
@counting_bp.route('/session/<int:session_id>/start-bin', methods=['POST'])
@login_required
def start_bin_count(session_id):
"""Start counting a new bin"""
"""Start counting a new bin or resume an existing in-progress one"""
sess = get_active_session(session_id)
if not sess:
flash('Session not found or archived', 'warning')
@@ -159,6 +172,20 @@ def start_bin_count(session_id):
flash('Bin number is required', 'danger')
return redirect(url_for('counting.my_counts', session_id=session_id))
    # --- NEW LOGIC: Check for existing in-progress bin ---
    existing_bin = query_db('''
        SELECT location_count_id
        FROM LocationCounts
        WHERE session_id = ? AND location_name = ? AND status = 'in_progress'
    ''', [session_id, location_name], one=True)

    if existing_bin:
        flash(f'Resuming bin: {location_name}', 'info')
        return redirect(url_for('counting.count_location',
                                session_id=session_id,
                                location_count_id=existing_bin['location_count_id']))
    # --- END NEW LOGIC ---
# Count expected lots from MASTER baseline for this location
expected_lots = query_db('''
SELECT COUNT(DISTINCT lot_number) as count
@@ -168,7 +195,7 @@ def start_bin_count(session_id):
expected_count = expected_lots['count'] if expected_lots else 0
# Create new location count
# Create new location count if none existed
conn = get_db()
cursor = conn.cursor()
@@ -184,7 +211,6 @@ def start_bin_count(session_id):
flash(f'Started counting bin: {location_name}', 'success')
return redirect(url_for('counting.count_location', session_id=session_id, location_count_id=location_count_id))
@counting_bp.route('/location/<int:location_count_id>/complete', methods=['POST'])
@login_required
def complete_location(location_count_id):
@@ -512,7 +538,7 @@ def scan_lot(session_id, location_count_id):
def delete_scan(entry_id):
"""Soft delete a scan and recalculate duplicate statuses"""
# Get the scan being deleted
scan = query_db('SELECT * FROM ScanEntries WHERE entry_id = ?', [entry_id], one=True)
scan = query_db('SELECT * FROM ScanEntries WHERE entry_id = ? AND is_deleted = 0', [entry_id], one=True)
if not scan:
return jsonify({'success': False, 'message': 'Scan not found'})
@@ -572,7 +598,7 @@ def update_scan(entry_id):
comment = data.get('comment', '')
# Get the scan
scan = query_db('SELECT * FROM ScanEntries WHERE entry_id = ?', [entry_id], one=True)
scan = query_db('SELECT * FROM ScanEntries WHERE entry_id = ? AND is_deleted = 0', [entry_id], one=True)
if not scan:
return jsonify({'success': False, 'message': 'Scan not found'})
@@ -593,7 +619,7 @@ def update_scan(entry_id):
actual_weight = ?,
comment = ?,
modified_timestamp = CURRENT_TIMESTAMP
WHERE entry_id = ?
WHERE entry_id = ? and is_deleted = 0
''', [item, weight, comment, entry_id])
return jsonify({'success': True, 'message': 'Scan updated'})
@@ -625,7 +651,7 @@ def recalculate_duplicate_status(session_id, lot_number, current_location):
duplicate_info = NULL,
comment = NULL,
modified_timestamp = CURRENT_TIMESTAMP
WHERE entry_id = ?
WHERE entry_id = ? and is_deleted = 0
''', [scan['entry_id']])
updated_entries.append({
'entry_id': scan['entry_id'],
@@ -670,7 +696,7 @@ def recalculate_duplicate_status(session_id, lot_number, current_location):
duplicate_info = ?,
comment = ?,
modified_timestamp = CURRENT_TIMESTAMP
WHERE entry_id = ?
WHERE entry_id = ? and is_deleted = 0
''', [duplicate_status, duplicate_info, duplicate_info, scan['entry_id']])
# Update our tracking list
@@ -689,7 +715,7 @@ def recalculate_duplicate_status(session_id, lot_number, current_location):
duplicate_info = ?,
comment = ?,
modified_timestamp = CURRENT_TIMESTAMP
WHERE entry_id = ?
WHERE entry_id = ? and is_deleted = 0
''', [duplicate_status, duplicate_info, duplicate_info, prev_scan['entry_id']])
# Update tracking for previous scans
@@ -755,3 +781,56 @@ def finish_location(session_id, location_count_id):
'success': True,
'redirect': url_for('counting.count_session', session_id=session_id)
})
@counting_bp.route('/session/<int:session_id>/finalize-all', methods=['POST'])
@login_required
def finalize_all_locations(session_id):
    """Finalize all 'in_progress' locations in a session"""
    if session.get('role') not in ['owner', 'admin']:
        return jsonify({'success': False, 'message': 'Permission denied'}), 403

    # 1. Get all in_progress locations for this session
    locations = query_db('''
        SELECT location_count_id, location_name
        FROM LocationCounts
        WHERE session_id = ?
          AND status = 'in_progress'
          AND is_deleted = 0
    ''', [session_id])

    if not locations:
        return jsonify({'success': True, 'message': 'No open bins to finalize.'})

    # 2. Loop through and run the finalize logic for each
    for loc in locations:
        # Reuses the logic from the existing finish_location route
        execute_db('''
            UPDATE LocationCounts
            SET status = 'completed', end_timestamp = CURRENT_TIMESTAMP
            WHERE location_count_id = ?
        ''', [loc['location_count_id']])

        # Identify missing lots from MASTER baseline
        expected_lots = query_db('''
            SELECT lot_number, item, description, system_quantity
            FROM BaselineInventory_Master
            WHERE session_id = ? AND system_bin = ?
        ''', [session_id, loc['location_name']])

        scanned_lots = query_db('''
            SELECT DISTINCT lot_number
            FROM ScanEntries
            WHERE location_count_id = ? AND is_deleted = 0
        ''', [loc['location_count_id']])
        scanned_lot_numbers = {s['lot_number'] for s in scanned_lots}

        for expected in expected_lots:
            if expected['lot_number'] not in scanned_lot_numbers:
                execute_db('''
                    INSERT INTO MissingLots (session_id, lot_number, master_expected_location, item, master_expected_quantity, marked_by)
                    VALUES (?, ?, ?, ?, ?, ?)
                ''', [session_id, expected['lot_number'], loc['location_name'],
                      expected['item'], expected['system_quantity'], session['user_id']])

    return jsonify({'success': True, 'message': f'Successfully finalized {len(locations)} bins.'})
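Per bin, the finalize step reduces to a set difference between the master baseline and what was scanned; the inner loop above is equivalent to this sketch (`missing_lots` is an invented helper):

```python
def missing_lots(expected, scanned_lot_numbers):
    """Expected baseline rows whose lot was never scanned in the bin."""
    return [e for e in expected if e['lot_number'] not in scanned_lot_numbers]
```

Each row this returns becomes one INSERT into MissingLots, which the session-detail and status-report queries later read back.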


@@ -24,7 +24,7 @@ def create_session():
flash(f'Session "{session_name}" created successfully!', 'success')
return redirect(url_for('sessions.session_detail', session_id=session_id))
return render_template('create_session.html')
return render_template('/counts/create_session.html')
@sessions_bp.route('/session/<int:session_id>')
@@ -54,24 +54,38 @@ def session_detail(session_id):
''', [session_id], one=True)
# Get location progress
# We add a subquery to count the actual missing lots for each bin
locations = query_db('''
SELECT lc.*, u.full_name as counter_name
SELECT
lc.*,
u.full_name as counter_name,
(SELECT COUNT(*) FROM MissingLots ml
WHERE ml.session_id = lc.session_id
AND ml.master_expected_location = lc.location_name) as lots_missing_calc
FROM LocationCounts lc
LEFT JOIN Users u ON lc.counted_by = u.user_id
WHERE lc.session_id = ?
AND lc.is_deleted = 0
ORDER BY lc.status DESC, lc.location_name
''', [session_id])
# Get active counters
active_counters = query_db('''
SELECT DISTINCT u.full_name, lc.location_name, lc.start_timestamp
SELECT
u.full_name,
u.user_id,
MAX(lc.start_timestamp) AS start_timestamp, -- aliased so ORDER BY below can reference it
lc.location_name
FROM LocationCounts lc
JOIN Users u ON lc.counted_by = u.user_id
WHERE lc.session_id = ? AND lc.status = 'in_progress'
ORDER BY lc.start_timestamp DESC
WHERE lc.session_id = ?
AND lc.status = 'in_progress'
AND lc.is_deleted = 0
GROUP BY u.user_id
ORDER BY start_timestamp DESC
''', [session_id])
return render_template('session_detail.html',
return render_template('/counts/session_detail.html',
count_session=sess,
stats=stats,
locations=locations,
@@ -98,6 +112,7 @@ def get_status_details(session_id, status):
WHERE se.session_id = ?
AND se.master_status = 'match'
AND se.duplicate_status = '00'
AND se.master_variance_lbs = 0
AND se.is_deleted = 0
ORDER BY se.scan_timestamp DESC
''', [session_id])
@@ -184,20 +199,21 @@ def get_status_details(session_id, status):
# Missing lots (in master but not scanned)
items = query_db('''
SELECT
bim.lot_number,
bim.item,
ml.lot_number,
ml.item,
bim.description,
bim.system_bin,
bim.system_quantity
FROM BaselineInventory_Master bim
WHERE bim.session_id = ?
AND bim.lot_number NOT IN (
SELECT lot_number
FROM ScanEntries
WHERE session_id = ? AND is_deleted = 0
)
ORDER BY bim.system_bin, bim.lot_number
''', [session_id, session_id])
ml.master_expected_location as system_bin,
ml.master_expected_quantity as system_quantity
FROM MissingLots ml
LEFT JOIN BaselineInventory_Master bim ON
ml.lot_number = bim.lot_number AND
ml.item = bim.item AND
ml.master_expected_location = bim.system_bin AND
ml.session_id = bim.session_id
WHERE ml.session_id = ?
GROUP BY ml.lot_number, ml.item, ml.master_expected_location
ORDER BY ml.master_expected_location, ml.lot_number
''', [session_id])
else:
return jsonify({'success': False, 'message': 'Invalid status'})
@@ -242,3 +258,39 @@ def activate_session(session_id):
execute_db('UPDATE CountSessions SET status = ? WHERE session_id = ?', ['active', session_id])
return jsonify({'success': True, 'message': 'Session activated successfully'})
@sessions_bp.route('/session/<int:session_id>/get_stats')
@role_required('owner', 'admin')
def get_session_stats(session_id):
stats = query_db('''
SELECT
COUNT(DISTINCT se.entry_id) FILTER (WHERE se.master_status = 'match' AND se.duplicate_status = '00' AND se.master_variance_lbs = 0 AND se.is_deleted = 0 AND ABS(se.actual_weight - se.master_expected_weight) < 0.01) as matched,
COUNT(DISTINCT se.lot_number) FILTER (WHERE se.duplicate_status IN ('01', '03', '04') AND se.is_deleted = 0) as duplicates,
COUNT(DISTINCT se.entry_id) FILTER (WHERE se.master_status = 'match' AND se.duplicate_status = '00' AND se.is_deleted = 0 AND ABS(se.actual_weight - se.master_expected_weight) >= 0.01) as discrepancy,
COUNT(DISTINCT se.entry_id) FILTER (WHERE se.master_status = 'wrong_location' AND se.is_deleted = 0) as wrong_location,
COUNT(DISTINCT se.entry_id) FILTER (WHERE se.master_status = 'ghost_lot' AND se.is_deleted = 0) as ghost_lots,
COUNT(DISTINCT ml.missing_id) as missing
FROM CountSessions cs
LEFT JOIN ScanEntries se ON cs.session_id = se.session_id
LEFT JOIN MissingLots ml ON cs.session_id = ml.session_id
WHERE cs.session_id = ?
''', [session_id], one=True)
return jsonify(success=True, stats=dict(stats))
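`get_session_stats` computes all six dashboard numbers in one pass using SQLite's `COUNT(...) FILTER (WHERE ...)` aggregate, which requires SQLite 3.30 or newer. A self-contained sketch against an in-memory database with a deliberately simplified `ScanEntries` schema (the toy rows are assumptions):

```python
import sqlite3

# Demonstrates the FILTER aggregates used by get_session_stats on a
# simplified ScanEntries table. Requires SQLite >= 3.30 for FILTER.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE ScanEntries (
    entry_id INTEGER PRIMARY KEY,
    master_status TEXT, duplicate_status TEXT, is_deleted INTEGER)""")
conn.executemany(
    "INSERT INTO ScanEntries (master_status, duplicate_status, is_deleted) VALUES (?, ?, ?)",
    [("match", "00", 0),      # clean match -> counted as matched
     ("match", "01", 0),      # duplicate   -> excluded by duplicate_status
     ("ghost_lot", "00", 0),  # ghost lot
     ("match", "00", 1)],     # soft-deleted -> excluded everywhere
)
row = conn.execute("""
    SELECT
      COUNT(*) FILTER (WHERE master_status = 'match'
                         AND duplicate_status = '00'
                         AND is_deleted = 0) AS matched,
      COUNT(*) FILTER (WHERE master_status = 'ghost_lot'
                         AND is_deleted = 0) AS ghost_lots
    FROM ScanEntries
""").fetchone()
print(row)  # (1, 1)
```

Each `FILTER` clause scopes its own `WHERE` to a single aggregate, so one table scan replaces six separate count queries.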
@sessions_bp.route('/session/<int:session_id>/active-counters-fragment')
@role_required('owner', 'admin')
def active_counters_fragment(session_id):
# One row per user: the latest in-progress bin for each active counter
active_counters = query_db('''
SELECT
u.full_name,
MAX(lc.start_timestamp) AS start_timestamp,
lc.location_name
FROM LocationCounts lc
JOIN Users u ON lc.counted_by = u.user_id
WHERE lc.session_id = ? AND lc.status = 'in_progress' AND lc.is_deleted = 0
GROUP BY u.user_id
''', [session_id])
# Renders only the counter-list fragment, not the full page
return render_template('counts/partials/_active_counters.html', active_counters=active_counters)

View File

@@ -198,6 +198,57 @@ def migration_005_add_cons_process_fields_duplicate_key():
conn.commit()
conn.close()
def migration_006_add_is_deleted_to_locationcounts():
"""Add is_deleted column to LocationCounts table"""
conn = get_db()
if table_exists('LocationCounts'):
if not column_exists('LocationCounts', 'is_deleted'):
conn.execute('ALTER TABLE LocationCounts ADD COLUMN is_deleted INTEGER DEFAULT 0')
print(" Added is_deleted column to LocationCounts")
conn.commit()
conn.close()
def migration_007_add_detail_end_row():
"""Add detail_end_row column to cons_processes table"""
conn = get_db()
if table_exists('cons_processes'):
if not column_exists('cons_processes', 'detail_end_row'):
conn.execute('ALTER TABLE cons_processes ADD COLUMN detail_end_row INTEGER')
print(" Added detail_end_row column to cons_processes")
conn.commit()
conn.close()
def migration_008_add_page_height():
"""Add page_height column to cons_processes table"""
conn = get_db()
if table_exists('cons_processes'):
if not column_exists('cons_processes', 'page_height'):
conn.execute('ALTER TABLE cons_processes ADD COLUMN page_height INTEGER')
print(" Added page_height column to cons_processes")
conn.commit()
conn.close()
def migration_009_add_print_columns():
"""Add print_start_col and print_end_col to cons_processes"""
conn = get_db()
if table_exists('cons_processes'):
if not column_exists('cons_processes', 'print_start_col'):
conn.execute("ALTER TABLE cons_processes ADD COLUMN print_start_col TEXT DEFAULT 'A'")
print(" Added print_start_col")
if not column_exists('cons_processes', 'print_end_col'):
conn.execute('ALTER TABLE cons_processes ADD COLUMN print_end_col TEXT')
print(" Added print_end_col")
conn.commit()
conn.close()
# List of all migrations in order
MIGRATIONS = [
@@ -206,6 +257,10 @@ MIGRATIONS = [
(3, 'add_default_modules', migration_003_add_default_modules),
(4, 'assign_modules_to_admins', migration_004_assign_modules_to_admins),
(5, 'add_cons_process_fields_duplicate_key', migration_005_add_cons_process_fields_duplicate_key),
(6, 'add_is_deleted_to_locationcounts', migration_006_add_is_deleted_to_locationcounts),
(7, 'add_detail_end_row', migration_007_add_detail_end_row),
(8, 'add_page_height', migration_008_add_page_height),
(9, 'add_print_columns', migration_009_add_print_columns),
]
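Every migration above guards its `ALTER TABLE` behind `table_exists` / `column_exists` checks, which makes the whole `MIGRATIONS` list safe to re-run. A self-contained sketch of that guard pattern (the `column_exists` body here is an assumption, implemented via `PRAGMA table_info`; the project's own helper may differ):

```python
import sqlite3

def column_exists(conn, table, column):
    # PRAGMA table_info yields one row per column; the name sits at index 1
    return any(row[1] == column for row in conn.execute(f"PRAGMA table_info({table})"))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE LocationCounts (location_count_id INTEGER PRIMARY KEY)")

# Running the migration twice is harmless: the guard makes it idempotent,
# so a second pass skips the ALTER instead of raising "duplicate column name"
for _ in range(2):
    if not column_exists(conn, "LocationCounts", "is_deleted"):
        conn.execute("ALTER TABLE LocationCounts ADD COLUMN is_deleted INTEGER DEFAULT 0")

print(column_exists(conn, "LocationCounts", "is_deleted"))  # True
```

Without the guard, re-running migration 006 against an already-migrated database would abort the whole migration chain on the duplicate-column error.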

View File

@@ -1,3 +1,4 @@
Flask==3.1.2
Werkzeug==3.1.5
openpyxl
Pillow

View File

@@ -2283,3 +2283,22 @@ body {
background: var(--color-primary);
color: var(--color-bg);
}
/* ==================== ICON BUTTONS ==================== */
.btn-icon-only {
background: transparent;
border: none;
color: var(--color-text-muted);
cursor: pointer;
padding: 4px 8px;
transition: var(--transition);
font-size: 0.9rem;
display: inline-flex;
align-items: center;
justify-content: center;
}
.btn-icon-only:hover {
color: var(--color-danger);
transform: scale(1.1);
}

View File

@@ -11,8 +11,21 @@
</a>
<div>
<h1 class="page-title" style="margin-bottom: 0;">Consumption Sheets</h1>
<p class="page-subtitle" style="margin-bottom: 0;">Manage process types and templates</p>
<h1 class="page-title" style="margin-bottom: 0; {{ 'color: var(--color-danger);' if showing_archived else '' }}">
{{ 'Archived Processes' if showing_archived else 'Consumption Sheets' }}
</h1>
<p class="page-subtitle" style="margin-bottom: var(--space-xs);">Manage process types and templates</p>
{% if showing_archived %}
<a href="{{ url_for('cons_sheets.admin_processes') }}" style="font-size: 0.85rem; color: var(--color-primary); display: inline-flex; align-items: center; gap: 6px;">
<i class="fa-solid fa-eye"></i> Return to Active List
</a>
{% else %}
<a href="{{ url_for('cons_sheets.admin_processes', archived=1) }}" style="font-size: 0.85rem; color: var(--color-text-muted); display: inline-flex; align-items: center; gap: 6px;">
<i class="fa-solid fa-box-archive"></i> View Archived Processes
</a>
{% endif %}
</div>
</div>
@@ -25,13 +38,34 @@
<div class="sessions-grid">
{% for process in processes %}
<div class="session-card">
<div class="session-card-header">
<div class="session-card-header" style="display: flex; justify-content: space-between; align-items: flex-start;">
<div>
<h3 class="session-name">{{ process.process_name }}</h3>
<span class="session-type-badge">
{{ process.field_count or 0 }} fields
</span>
</div>
{% if showing_archived %}
<form method="POST"
action="{{ url_for('cons_sheets.restore_process', process_id=process.id) }}"
style="margin: 0;">
<button type="submit" class="btn-icon-only" title="Restore Process" style="color: var(--color-success);">
<i class="fa-solid fa-trash-arrow-up"></i>
</button>
</form>
{% else %}
<form method="POST"
action="{{ url_for('cons_sheets.delete_process', process_id=process.id) }}"
onsubmit="return confirm('Are you sure you want to delete {{ process.process_name }}?');"
style="margin: 0;">
<button type="submit" class="btn-icon-only" title="Delete Process">
<i class="fa-solid fa-trash"></i>
</button>
</form>
{% endif %}
</div>
<div class="session-meta">
<div class="meta-item">
<span class="meta-label">Key:</span>
@@ -57,6 +91,8 @@
</a>
</div>
</div>
{% endfor %}
</div>
{% else %}

View File

@@ -50,16 +50,40 @@
<form method="POST" action="{{ url_for('cons_sheets.update_template_settings', process_id=process.id) }}">
<div class="form-group">
<label for="rows_per_page" class="form-label">Rows Per Page</label>
<label for="rows_per_page" class="form-label">Rows Per Page (Capacity)</label>
<input type="number" id="rows_per_page" name="rows_per_page"
value="{{ process.rows_per_page or 30 }}" min="1" max="500" class="form-input">
<p class="form-hint">Max detail rows before starting a new page</p>
value="{{ process.rows_per_page or 30 }}" min="1" max="5000" class="form-input">
<p class="form-hint">How many items fit in the grid before we need a new page?</p>
</div>
<div class="form-group" style="flex: 1;">
<label for="print_start_col" class="form-label">Print Start Column</label>
<input type="text" id="print_start_col" name="print_start_col"
value="{{ process.print_start_col or 'A' }}" class="form-input"
placeholder="e.g. A" pattern="[A-Za-z]+" title="Letters only">
<p class="form-hint">First column to print.</p>
</div>
<div class="form-group" style="flex: 1;">
<label for="print_end_col" class="form-label">Print End Column</label>
<input type="text" id="print_end_col" name="print_end_col"
value="{{ process.print_end_col or '' }}" class="form-input"
placeholder="e.g. K" pattern="[A-Za-z]+" title="Letters only">
<p class="form-hint">Last column to print (defines width).</p>
</div>
<div class="form-group">
<label for="page_height" class="form-label">Page Height (Total Rows)</label>
<input type="number" id="page_height" name="page_height"
value="{{ process.page_height or '' }}" min="1" class="form-input">
<p class="form-hint">The exact distance (in Excel rows) from the top of Page 1 to the top of Page 2.</p>
</div>
<div class="form-group">
<label for="detail_start_row" class="form-label">Detail Start Row</label>
<input type="number" id="detail_start_row" name="detail_start_row"
value="{{ process.detail_start_row or 10 }}" min="1" max="500" class="form-input">
value="{{ process.detail_start_row or 10 }}" min="1" max="5000" class="form-input">
<p class="form-hint">Excel row number where detail data begins</p>
</div>

View File

@@ -101,6 +101,11 @@
<div class="scans-header">
<h3 class="scans-title">Scanned Items (<span id="scanListCount">{{ scans|length }}</span>)</h3>
</div>
<div style="margin-top: 10px;">
<button type="button" class="btn btn-secondary btn-sm" onclick="document.getElementById('importModal').style.display='flex'">
<i class="fa-solid fa-file-import"></i> Bulk Import Excel
</button>
</div>
<div id="scansList" class="scans-grid" style="--field-count: {{ detail_fields|length }};">
{% for scan in scans %}
<div class="scan-row scan-row-{{ scan.duplicate_status }}"
@@ -137,6 +142,39 @@
</div>
</div>
<div id="importModal" class="modal">
<div class="modal-content">
<div class="modal-header-bar">
<h3 class="modal-title">Bulk Import Data</h3>
<button type="button" class="btn-close-modal" onclick="document.getElementById('importModal').style.display='none'">&times;</button>
</div>
<div class="modal-body" style="text-align: center;">
<p style="color: var(--color-text-muted); margin-bottom: 20px;">
Upload an Excel file (.xlsx) to automatically populate this session.
<br><strong>Warning:</strong> This bypasses all validation checks.
</p>
<div style="margin-bottom: 30px; padding: 15px; background: var(--color-bg); border-radius: 8px;">
<p style="font-size: 0.9rem; margin-bottom: 10px;">Step 1: Get the correct format</p>
<a href="{{ url_for('cons_sheets.download_import_template', session_id=session['id']) }}" class="btn btn-secondary btn-sm">
<i class="fa-solid fa-download"></i> Download Template
</a>
</div>
<form action="{{ url_for('cons_sheets.import_session_data', session_id=session['id']) }}" method="POST" enctype="multipart/form-data">
<div style="margin-bottom: 20px;">
<input type="file" name="file" accept=".xlsx" class="file-input" required style="width: 100%;">
</div>
<button type="submit" class="btn btn-primary btn-block">
<i class="fa-solid fa-upload"></i> Upload & Process
</button>
</form>
</div>
</div>
</div>
<style>
.header-values { display: flex; flex-wrap: wrap; gap: var(--space-sm); margin: var(--space-sm) 0; }
.header-pill { background: var(--color-surface-elevated); padding: var(--space-xs) var(--space-sm); border-radius: var(--radius-sm); font-size: 0.8rem; color: var(--color-text-muted); }

View File

@@ -163,9 +163,7 @@
<a href="{{ url_for('counting.my_counts', session_id=session_id) }}" class="btn btn-secondary btn-block btn-lg">
← Back to My Counts
</a>
<button id="finishBtn" class="btn btn-success btn-block btn-lg" onclick="finishLocation()">
✓ Finish Location
</button>
{# Finish button moved to Admin Dashboard #}
</div>
</div>
<button class="scroll-to-top" onclick="window.scrollTo({top: 0, behavior: 'smooth'})">

View File

@@ -49,14 +49,6 @@
<a href="{{ url_for('counting.count_location', session_id=count_session.session_id, location_count_id=bin.location_count_id) }}" class="btn btn-primary btn-block">
Resume Counting
</a>
<div class="bin-actions-row">
<button class="btn btn-secondary" onclick="markComplete('{{ bin.location_count_id }}')">
✓ Mark Complete
</button>
<button class="btn btn-danger" onclick="deleteBinCount('{{ bin.location_count_id }}', '{{ bin.location_name }}')">
🗑️ Delete
</button>
</div>
</div>
</div>
{% endfor %}

View File

@@ -0,0 +1,12 @@
<div class="counter-list">
{% for counter in active_counters %}
<div class="counter-item">
<div class="counter-avatar">{{ counter.full_name[0] }}</div>
<div class="counter-info">
<div class="counter-name">{{ counter.full_name }}</div>
<div class="counter-location">📍 {{ counter.location_name }}</div>
</div>
<div class="counter-time">{{ counter.start_timestamp[11:16] }}</div>
</div>
{% endfor %}
</div>

View File

@@ -72,42 +72,42 @@
</div>
<!-- Statistics Section -->
<div class="section-card">
<div class="section-card">
<h2 class="section-title">Real-Time Statistics</h2>
<div class="stats-grid">
<div class="stat-card stat-match" onclick="showStatusDetails('match')">
<div class="stat-number">{{ stats.matched or 0 }}</div>
<div class="stat-number" id="count-matched">{{ stats.matched or 0 }}</div>
<div class="stat-label">✓ Matched</div>
</div>
<div class="stat-card stat-duplicate" onclick="showStatusDetails('duplicates')">
<div class="stat-number">{{ stats.duplicates or 0 }}</div>
<div class="stat-number" id="count-duplicates">{{ stats.duplicates or 0 }}</div>
<div class="stat-label">🔵 Duplicates</div>
</div>
<div class="stat-card stat-weight-disc" onclick="showStatusDetails('weight_discrepancy')">
<div class="stat-number">{{ stats.weight_discrepancy or 0 }}</div>
<div class="stat-number" id="count-discrepancy">{{ stats.weight_discrepancy or 0 }}</div>
<div class="stat-label">⚖️ Weight Discrepancy</div>
</div>
<div class="stat-card stat-wrong" onclick="showStatusDetails('wrong_location')">
<div class="stat-number">{{ stats.wrong_location or 0 }}</div>
<div class="stat-number" id="count-wrong">{{ stats.wrong_location or 0 }}</div>
<div class="stat-label">⚠ Wrong Location</div>
</div>
<div class="stat-card stat-ghost" onclick="showStatusDetails('ghost_lot')">
<div class="stat-number">{{ stats.ghost_lots or 0 }}</div>
<div class="stat-number" id="count-ghost">{{ stats.ghost_lots or 0 }}</div>
<div class="stat-label">🟣 Ghost Lots</div>
</div>
<div class="stat-card stat-missing" onclick="showStatusDetails('missing')">
<div class="stat-number">{{ stats.missing_lots or 0 }}</div>
<div class="stat-number" id="count-missing">{{ stats.missing_lots or 0 }}</div>
<div class="stat-label">🔴 Missing</div>
</div>
</div>
</div>
</div>
<!-- Active Counters Section -->
{% if active_counters %}
<div class="section-card">
<h2 class="section-title">Active Counters</h2>
<div class="counter-list">
<div id="active-counters-container"> <div class="counter-list">
{% for counter in active_counters %}
<div class="counter-item">
<div class="counter-avatar">{{ counter.full_name[0] }}</div>
@@ -125,7 +125,12 @@
<!-- Location Progress Section -->
{% if locations %}
<div class="section-card">
<h2 class="section-title">Location Progress</h2>
<div style="display: flex; justify-content: space-between; align-items: center; margin-bottom: var(--space-md);">
<h2 class="section-title" style="margin-bottom: 0;">Location Progress</h2>
<button class="btn btn-danger btn-sm" onclick="showFinalizeAllConfirm()">
⚠️ Finalize All Bins
</button>
</div>
<div class="location-table">
<table>
<thead>
@@ -157,7 +162,7 @@
<td>{{ loc.counter_name or '-' }}</td>
<td>{{ loc.expected_lots_master }}</td>
<td>{{ loc.lots_found }}</td>
<td>{{ loc.lots_missing }}</td>
<td>{{ loc.lots_missing_calc }}</td>
</tr>
{% endfor %}
</tbody>
@@ -198,6 +203,9 @@
<button id="reopenLocationBtn" class="btn btn-warning btn-sm" style="display: none;" onclick="showReopenConfirm()">
🔓 Reopen Location
</button>
<button id="deleteLocationBtn" class="btn btn-danger btn-sm" style="display: none;" onclick="showDeleteBinConfirm()">
🗑️ Delete Bin
</button>
<button class="btn btn-secondary btn-sm" onclick="exportLocationToCSV()">
📥 Export CSV
</button>
@@ -460,6 +468,9 @@ function showLocationDetails(locationCountId, locationName, status) {
// Show finalize or reopen button based on status
const finalizeBtn = document.getElementById('finalizeLocationBtn');
const reopenBtn = document.getElementById('reopenLocationBtn');
const deleteBtn = document.getElementById('deleteLocationBtn');
deleteBtn.style.display = 'block';
if (status === 'in_progress') {
finalizeBtn.style.display = 'block';
@@ -621,8 +632,8 @@ function closeFinalizeConfirm() {
}
function confirmFinalize() {
// Note: The /complete endpoint is handled by blueprints/counting.py
fetch(`/location/${currentLocationId}/complete`, {
// Correctly points to the /finish route to trigger Missing Lot calculations
fetch(`/count/${CURRENT_SESSION_ID}/location/${currentLocationId}/finish`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
@@ -633,16 +644,18 @@ function confirmFinalize() {
if (data.success) {
closeFinalizeConfirm();
closeLocationModal();
location.reload(); // Reload to show updated status
location.reload(); // Reload to show updated status and Missing counts
} else {
alert(data.message || 'Error finalizing location');
}
})
.catch(error => {
console.error('Finalize Error:', error);
alert('Error: ' + error.message);
});
}
function showReopenConfirm() {
document.getElementById('reopenBinName').textContent = currentLocationName;
document.getElementById('reopenConfirmModal').style.display = 'flex';
@@ -717,5 +730,78 @@ function activateSession() {
}
});
}
function showFinalizeAllConfirm() {
if (confirm("⚠️ WARNING: This will finalize ALL open bins in this session and calculate missing items. This cannot be undone. Are you sure?")) {
fetch(`/session/${CURRENT_SESSION_ID}/finalize-all`, {
method: 'POST',
headers: {'Content-Type': 'application/json'}
})
.then(r => r.json())
.then(data => {
if (data.success) {
alert(data.message);
location.reload();
} else {
alert("Error: " + data.message);
}
});
}
}
function showDeleteBinConfirm() {
if (confirm(`⚠️ DANGER: Are you sure you want to delete ALL data for ${currentLocationName}? This will hide the bin from staff and wipe any missing lot flags.`)) {
fetch(`/location/${currentLocationId}/delete`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' }
})
.then(response => response.json())
.then(data => {
if (data.success) {
closeLocationModal();
location.reload();
} else {
alert(data.message || 'Error deleting bin');
}
})
.catch(error => {
alert('Error: ' + error.message);
});
}
}
function refreshDashboardStats() {
const sessionId = CURRENT_SESSION_ID;
fetch(`/session/${sessionId}/get_stats`)
.then(response => response.json())
.then(data => {
if (data.success) {
const s = data.stats;
// Element IDs must match the stat-card markup; keys match get_session_stats in sessions.py
if (document.getElementById('count-matched')) document.getElementById('count-matched').innerText = s.matched;
if (document.getElementById('count-duplicates')) document.getElementById('count-duplicates').innerText = s.duplicates;
if (document.getElementById('count-discrepancy')) document.getElementById('count-discrepancy').innerText = s.discrepancy;
if (document.getElementById('count-wrong')) document.getElementById('count-wrong').innerText = s.wrong_location;
if (document.getElementById('count-ghost')) document.getElementById('count-ghost').innerText = s.ghost_lots;
if (document.getElementById('count-missing')) document.getElementById('count-missing').innerText = s.missing;
}
})
.catch(err => console.error('Error refreshing stats:', err));
fetch(`/session/${sessionId}/active-counters-fragment`)
.then(response => response.text())
.then(html => {
const container = document.getElementById('active-counters-container');
if (container) container.innerHTML = html;
})
.catch(err => console.error('Error refreshing counters:', err));
}
// Poll the stats and counter endpoints every 30 seconds
setInterval(refreshDashboardStats, 30000);
// Run once immediately so the first update does not wait 30 seconds
refreshDashboardStats();
</script>
{% endblock %}