diff --git a/AI Prompt.txt b/AI Prompt.txt index 4a90af5..1968b58 100644 --- a/AI Prompt.txt +++ b/AI Prompt.txt @@ -3,89 +3,108 @@ You are **Carl** — a proud, detail-oriented software engineer who LOVES progra You are helping build a project called **Scanlook**. ## Scanlook (current product summary) -Scanlook is a web app for warehouse counting workflows. -Scanlook is modular. +Scanlook is a modular inventory management platform for warehouse operations. -Long-term goal: evolve into a WMS, but right now focus on making this workflow reliable. +Long-term goal: evolve into a full WMS, but right now focus on making workflows reliable and the module system robust. ## Operating rules (must follow) 1) **Be accurate, not fast.** Double-check code, SQL, and commands before sending. 2) **No assumptions about files/environment.** If you need code, schema, logs, config, versions, or screenshots, ask me to paste/upload them. -3) **Step-by-step only.** I’m a beginner: give ONE small step at a time, then wait for my result before continuing. -4) **No command dumps.** Don’t give long chains of commands. One command (or tiny set) per step. +3) **Step-by-step only.** I'm a beginner: give ONE small step at a time, then wait for my result before continuing. +4) **No command dumps.** Don't give long chains of commands. One command (or tiny set) per step. 5) **Keep it to the point.** Default to short answers. Only explain more if I ask. 6) **Verify safety.** Warn me before destructive actions (delete/overwrite/migrations). Offer a safer alternative. 7) **Evidence-based debugging.** Ask for exact error text/logs and versions before guessing. 8) **CSS changes:** Ask which device(s) the change is for (desktop/mobile/scanner) before editing. Each has its own file. -9) **Docker deployment:** Production runs in Docker on Linux (PortainerVM). Volume mounts only /app/database to preserve data between updates. -10) Database changes: Never tell user to "manually run SQL". 
Always add changes to migrations.py so they auto-apply on deployment. - +9) **Docker deployment:** Production runs in Docker with Gunicorn on Linux (PortainerVM). Volume mounts only /app/database to preserve data between updates. +10) **Database changes:** Never tell user to "manually run SQL". Always add changes to migrations.py so they auto-apply on deployment. ## How you should respond -- Start by confirming which mode we’re working on: Cycle Count or Physical Inventory. - Ask for the minimum needed info (3–6 questions max), then propose the next single step. - When writing code: keep it small, readable, and consistent with Flask best practices. - When writing SQL: be explicit about constraints/indexes that matter for lots/bins/sessions. - When talking workflow: always keep session isolation (shift-based counts) as a hard requirement. -## Scanlook (current product summary) -Scanlook is a web app for warehouse counting workflows built with Flask + SQLite. +## Scanlook Architecture -**Current Version:** 0.15.0 +**Current Version:** 0.17.1 **Tech Stack:** -- Backend: Python/Flask, raw SQL (no ORM), openpyxl (Excel file generation) +- Backend: Python 3.13, Flask, Gunicorn (production WSGI server) - Database: SQLite (located in /database/scanlook.db) - Frontend: Jinja2 templates, vanilla JS, custom CSS - CSS Architecture: Desktop-first with device-specific overrides - style.css (base/desktop) - mobile.css (phones, 360-767px) - scanner.css (MC9300 scanners, max-width 359px) -- Deployment: Docker container, Gitea for version control + container registry +- Deployment: Docker container with Gunicorn, Gitea for version control + container registry **Project Structure:** -- app.py (main Flask app, routes for auth + dashboard) -- /blueprints/ (modular routes: counting.py, sessions.py, users.py, data_imports.py, admin_locations.py) -- /templates/ (Jinja2 HTML templates) +- app.py (main Flask app, core routes, module loading) +- /blueprints/users.py (user management 
blueprint - non-modular) +- /modules/ (modular applications - invcount, conssheets) + - Each module has: __init__.py, routes.py, migrations.py, manifest.json, templates/ +- /templates/ (core templates: login.html, home.html, base.html, admin_dashboard.html, module_manager.html) - /static/css/ (style.css, mobile.css, scanner.css) - /database/ (scanlook.db, init_db.py) -- db.py (database helper functions: query_db, execute_db) +- db.py (database helper functions: query_db, execute_db, get_db) - utils.py (decorators: login_required, role_required) -- migrations.py (database migration system) +- migrations.py (core database migrations) +- module_manager.py (ModuleManager class - handles module lifecycle) +- Dockerfile (Python 3.13-slim, Gunicorn with 4 workers) +- docker-compose.yml (orchestrates scanlook container with volume for database) +- gunicorn_config.py (Gunicorn hooks for module loading in workers) -**Key Features (implemented):** -- Count Sessions with archive/activate functionality -- Master baseline upload (CSV) -- Current baseline upload (optional, for comparison) +**Module System (v0.17.0+):** +- **Modular Architecture:** Each module is a self-contained plugin with its own routes, templates, migrations +- **Module Structure:** + - manifest.json (metadata: name, version, author, icon, description) + - __init__.py (creates blueprint via create_blueprint()) + - routes.py (defines register_routes(bp) function) + - migrations.py (get_schema(), get_migrations()) + - templates/{module_key}/ (module-specific templates) +- **Module Manager UI:** /admin/modules - install/uninstall/activate/deactivate modules +- **Module Upload:** Drag-and-drop ZIP upload to add new modules +- **Module Installation:** Creates database tables, registers in Modules table, grants access to users +- **Module Uninstall:** Triple-confirmation flow, always deletes data (deactivate preserves data) +- **Auto-restart:** After module install, server restarts to load new routes + - Dev 
(Flask): Thread-based restart via os.execv() + - Production (Gunicorn): HUP signal to master for graceful worker reload +- **Database Tables:** + - Modules (module_id, name, module_key, version, author, description, icon, is_active, is_installed) + - UserModules (user_id, module_id) - grants access per user + +**Current Modules:** +1. **Inventory Counts (invcount)** - Cycle counts and physical inventory + - Routes: /invcount/ + - Tables: LocationCounts, ScanEntries, Sessions, etc. +2. **Consumption Sheets (conssheets)** - Production lot tracking with Excel export + - Routes: /conssheets/ + - Tables: cons_processes, cons_sessions, cons_process_fields, etc. + +**Key Features:** +- Modular plugin architecture with hot-reload capability +- Module Manager with drag-and-drop upload +- Session-based counting workflows with archive/activate +- Master/current baseline upload (CSV) - Staff scanning interface optimized for MC9300 Zebra scanners - Scan statuses: Match, Duplicate, Wrong Location, Ghost Lot, Weight Discrepancy -- Location/BIN workflow with Expected → Scanned flow -- Session isolation (archived sessions blocked from access) - Role-based access: owner, admin, staff - Auto-initialize database on first run -- Consumption Sheets module (production lot tracking with Excel export) - Database migration system (auto-applies schema changes on startup) +- Production-ready with Gunicorn multi-worker support -**Long-term goal:** Modular WMS with future modules for Shipping, Receiving, Transfers, Production. 
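The manifest contract described in the module system above (manifest.json with metadata fields) can be sketched as a small validator. This is an illustrative helper, not code from this repo — the field list comes from the spec, the function name is ours:

```python
import json

# Required fields per the module manifest spec
REQUIRED_MANIFEST_FIELDS = {"module_key", "name", "version", "author", "description", "icon"}

def validate_manifest(raw_json):
    """Return (ok, missing_fields) for a manifest.json payload."""
    data = json.loads(raw_json)
    missing = sorted(REQUIRED_MANIFEST_FIELDS - data.keys())
    return (not missing, missing)

# Example: the conssheets manifest from this diff
manifest = '''{
  "module_key": "conssheets",
  "name": "Consumption Sheets",
  "version": "1.1.0",
  "author": "STUFF",
  "description": "Production lot tracking and consumption reporting with Excel export",
  "icon": "fa-clipboard-list"
}'''
ok, missing = validate_manifest(manifest)  # → (True, [])
```

A check like this could run during ZIP upload so a malformed module is rejected before installation touches the database.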
- -**Module System:** -- Modules table defines available modules (module_key used for routing) -- UserModules table tracks per-user access -- Home page (/home) shows module cards based on user's access -- Each module needs: database entry, route with access check, home page card -- New modules should go in /modules/{module_name}/ with: - - __init__.py (blueprint registration) - - routes.py (all routes) - - templates/ (module-specific templates) -- Current modules: - - Inventory Counts (counting) - - Consumption Sheets (cons_sheets) - +**Development vs Production:** +- **Dev:** Windows, Flask dev server (python app.py), auto-reload on file changes +- **Production:** Linux Docker container, Gunicorn with 4 workers, graceful reloads via HUP signal ## Quick Reference -- Database: SQLite at /database/scanlook.db +- Database: SQLite at /database/scanlook.db (volume-mounted in Docker) - Scanner viewport: 320px wide (MC9300) - Mobile breakpoint: 360-767px - Desktop: 768px+ - Git remote: https://tsngit.tsnx.net/stuff/ScanLook.git -- Docker registry: 10.44.44.33:3000/stuff/scanlook \ No newline at end of file +- Docker registry: tsngit.tsnx.net/stuff/scanlook +- Production server: Gunicorn with 4 workers, --timeout 120 +- Module folders: /modules/{module_key}/ +- Module manifest required fields: module_key, name, version, author, description, icon \ No newline at end of file diff --git a/app.py b/app.py index 9688439..8b8b8fa 100644 --- a/app.py +++ b/app.py @@ -28,7 +28,7 @@ app.config['PERMANENT_SESSION_LIFETIME'] = timedelta(hours=1) # 1. Define the version -APP_VERSION = '0.17.3' # Bumped version for modular architecture +APP_VERSION = '0.18.0' # 2. 
Inject it into all templates automatically @app.context_processor diff --git a/global_actions.py b/global_actions.py new file mode 100644 index 0000000..ad917b8 --- /dev/null +++ b/global_actions.py @@ -0,0 +1,125 @@ +from db import query_db, execute_db +from datetime import datetime + +def execute_pipeline(actions, barcode, context): + """ + Executes the chain of actions defined in the Rule. + Returns: {'success': bool, 'message': str, 'data': dict} + """ + field_values = {} + should_save = False + + for action in actions: + atype = action.get('type') + + # --- MAP (Extract) --- + if atype == 'map': + start = int(action.get('start', 1)) - 1 + end = int(action.get('end', len(barcode))) + target = action.get('field') + if target: + safe_end = min(end, len(barcode)) + if start < len(barcode): + field_values[target] = barcode[start:safe_end] + + # --- CLEAN (Format) --- + elif atype == 'clean': + target = action.get('field') + func = action.get('func') + if target in field_values: + val = str(field_values[target]) + if func == 'TRIM': field_values[target] = val.strip() + elif func == 'REMOVE_SPACES': field_values[target] = val.replace(" ", "") + elif func == 'UPPERCASE': field_values[target] = val.upper() + elif func == 'REMOVE_LEADING_ZEROS': field_values[target] = val.lstrip('0') + + # --- DUPLICATE CHECK (The Gatekeeper) --- + elif atype == 'duplicate': + target = action.get('field') + behavior = action.get('behavior', 'WARN') # Default to WARN + val = field_values.get(target) + + if val: + # 1. Check DB + same_sess = query_db(f"SELECT id FROM {context['table_name']} WHERE {target} = ? AND session_id = ? AND is_deleted=0", [val, context['session_id']], one=True) + other_sess = query_db(f"SELECT id FROM {context['table_name']} WHERE {target} = ? 
AND is_deleted=0", [val], one=True) + + is_dup = False + dup_msg = "" + + if same_sess: + is_dup = True + dup_msg = f"Already scanned in THIS session ({val})" + field_values['duplicate_status'] = 'dup_same_session' + field_values['duplicate_info'] = 'Duplicate in same session' + elif other_sess: + is_dup = True + dup_msg = f"Previously scanned in another session ({val})" + field_values['duplicate_status'] = 'dup_other_session' + field_values['duplicate_info'] = 'Duplicate from history' + else: + field_values['duplicate_status'] = 'normal' + field_values['duplicate_info'] = None + + # 2. Enforce Behavior + if is_dup: + if behavior == 'BLOCK': + # STRICT MODE: Stop immediately. + return { + 'success': False, + 'message': f"⛔ STRICT MODE: {dup_msg}. Entry denied.", + 'data': field_values + } + + elif behavior == 'WARN': + # WARN MODE: Ask user, unless they already clicked "Yes" + if not context.get('confirm_duplicate'): + return { + 'success': False, + 'needs_confirmation': True, + 'message': f"⚠️ {dup_msg}", + 'data': field_values + } + # --- USER INPUT (The Gatekeeper) --- + elif atype == 'input': + # 1. Check if we received the manual data (Weight) from the Save button + incoming_data = context.get('extra_data') + + # 2. If data exists, MERGE it and CONTINUE (Don't stop!) + if incoming_data: + # Update our main data list with the user's input (e.g. weight=164) + field_values.update(incoming_data) + continue # <--- RESUME PIPELINE (Goes to next rule, usually SAVE) + + # 3. 
If no data, STOP and ask for it + return { + 'success': False, + 'needs_input': True, + 'message': 'Opening Details Form...', + 'data': field_values + } + + # --- SAVE MARKER --- + elif atype == 'save': + should_save = True + + # --- RESULT --- + if should_save: + try: + # Commit to DB + cols = ['session_id', 'scanned_by', 'scanned_at'] + vals = [context['session_id'], context['user_id'], datetime.now()] + + for k, v in field_values.items(): + cols.append(k) + vals.append(v) + + placeholders = ', '.join(['?'] * len(cols)) + sql = f"INSERT INTO {context['table_name']} ({', '.join(cols)}) VALUES ({placeholders})" + execute_db(sql, vals) + + return {'success': True, 'message': 'Saved Successfully', 'data': field_values} + except Exception as e: + return {'success': False, 'message': f"Database Error: {str(e)}", 'data': field_values} + else: + return {'success': True, 'message': f"✅ Parsed: {field_values} (No Save Action)", 'data': field_values} \ No newline at end of file diff --git a/module_manager.py b/module_manager.py index 8b480a6..e761308 100644 --- a/module_manager.py +++ b/module_manager.py @@ -321,45 +321,63 @@ class ModuleManager: return {'success': True, 'message': f'Module {module["name"]} deactivated'} def load_active_modules(self, app): - """ - Load all active modules and register their blueprints with Flask app. - Called during app startup. 
- """ - modules = self.scan_available_modules() - active_modules = [m for m in modules if m['is_installed'] and m['is_active']] - - print(f"\n🔌 Loading {len(active_modules)} active module(s)...") - - for module in active_modules: - try: - # Import module's __init__.py - init_path = Path(module['path']) / '__init__.py' - if not init_path.exists(): - print(f" ⚠️ {module['name']}: Missing __init__.py") - continue - - spec = importlib.util.spec_from_file_location( - f"modules.{module['module_key']}", - init_path - ) - module_package = importlib.util.module_from_spec(spec) - spec.loader.exec_module(module_package) - - # Get blueprint from create_blueprint() - if hasattr(module_package, 'create_blueprint'): - blueprint = module_package.create_blueprint() - app.register_blueprint(blueprint) - print(f" ✅ {module['name']} loaded at {module.get('routes_prefix', '/unknown')}") - else: - print(f" ⚠️ {module['name']}: Missing create_blueprint() function") + """ + Load all active modules, run their migrations, and register blueprints. + Called during app startup. + """ + modules = self.scan_available_modules() + active_modules = [m for m in modules if m['is_installed'] and m['is_active']] - except Exception as e: - print(f" ❌ Failed to load {module['name']}: {e}") - import traceback - traceback.print_exc() - - print("✅ Module loading complete\n") + print(f"\n🔌 Loading {len(active_modules)} active module(s)...") + + for module in active_modules: + try: + # --- NEW: Run Migrations on Startup --- + migrations_path = Path(module['path']) / 'migrations.py' + if migrations_path.exists(): + # 1. Dynamically load the migrations.py file + spec_mig = importlib.util.spec_from_file_location(f"{module['module_key']}_mig", migrations_path) + mig_mod = importlib.util.module_from_spec(spec_mig) + spec_mig.loader.exec_module(mig_mod) + + # 2. 
Run the migrations + if hasattr(mig_mod, 'get_migrations'): + conn = get_db() + for version, name, func in mig_mod.get_migrations(): + # Your migrations are written safely (checking IF EXISTS), + # so running them on every boot is the correct Dev workflow. + func(conn) + conn.commit() # <--- CRITICAL: Saves the changes to the DB + conn.close() + # -------------------------------------- + # Import module's __init__.py + init_path = Path(module['path']) / '__init__.py' + if not init_path.exists(): + print(f" ⚠️ {module['name']}: Missing __init__.py") + continue + + spec = importlib.util.spec_from_file_location( + f"modules.{module['module_key']}", + init_path + ) + module_package = importlib.util.module_from_spec(spec) + spec.loader.exec_module(module_package) + + # Get blueprint from create_blueprint() + if hasattr(module_package, 'create_blueprint'): + blueprint = module_package.create_blueprint() + app.register_blueprint(blueprint) + print(f" ✅ {module['name']} loaded at {module.get('routes_prefix', '/unknown')}") + else: + print(f" ⚠️ {module['name']}: Missing create_blueprint() function") + + except Exception as e: + print(f" ❌ Failed to load {module['name']}: {e}") + import traceback + traceback.print_exc() + + print("✅ Module loading complete\n") # Global instance manager = ModuleManager() diff --git a/modules/conssheets/manifest.json b/modules/conssheets/manifest.json index c9224d8..238a590 100644 --- a/modules/conssheets/manifest.json +++ b/modules/conssheets/manifest.json @@ -1,7 +1,7 @@ { "module_key": "conssheets", "name": "Consumption Sheets", - "version": "1.0.0", + "version": "1.1.0", "author": "STUFF", "description": "Production lot tracking and consumption reporting with Excel export", "icon": "fa-clipboard-list", diff --git a/modules/conssheets/migrations.py b/modules/conssheets/migrations.py index 1c8ee04..d7ce353 100644 --- a/modules/conssheets/migrations.py +++ b/modules/conssheets/migrations.py @@ -67,12 +67,13 @@ def get_schema(): FOREIGN KEY 
(field_id) REFERENCES cons_process_fields(id) ); + -- Indexes CREATE INDEX IF NOT EXISTS idx_cons_process_fields_process ON cons_process_fields(process_id, table_type); CREATE INDEX IF NOT EXISTS idx_cons_process_fields_active ON cons_process_fields(process_id, is_active); CREATE INDEX IF NOT EXISTS idx_cons_sessions_process ON cons_sessions(process_id, status); CREATE INDEX IF NOT EXISTS idx_cons_sessions_user ON cons_sessions(created_by, status); - """ + """ def get_migrations(): @@ -130,9 +131,27 @@ def get_migrations(): cursor.execute('ALTER TABLE cons_processes ADD COLUMN print_end_col TEXT') print(" Added print_end_col column to cons_processes") + def migration_005_create_router_table(conn): + """Create table for IFTTT routing rules""" + cursor = conn.cursor() + cursor.execute(''' + CREATE TABLE IF NOT EXISTS cons_process_router ( + id INTEGER PRIMARY KEY AUTOINCREMENT, + process_id INTEGER NOT NULL, + line_number INTEGER NOT NULL, + rule_name TEXT, + match_pattern TEXT NOT NULL, -- The Regex/Format to match + actions_json TEXT NOT NULL, -- The sequence of THEN steps + is_active INTEGER DEFAULT 1, + FOREIGN KEY (process_id) REFERENCES cons_processes(id) + ) + ''') + print(" Created cons_process_router table") + return [ (1, 'add_is_duplicate_key', migration_001_add_is_duplicate_key), (2, 'add_detail_end_row', migration_002_add_detail_end_row), (3, 'add_page_height', migration_003_add_page_height), (4, 'add_print_columns', migration_004_add_print_columns), + (5, 'create_router_table', migration_005_create_router_table), ] \ No newline at end of file diff --git a/modules/conssheets/routes.py b/modules/conssheets/routes.py index b7baf04..b1fd477 100644 --- a/modules/conssheets/routes.py +++ b/modules/conssheets/routes.py @@ -6,11 +6,13 @@ from flask import render_template, request, redirect, url_for, flash, jsonify, s from db import query_db, execute_db from utils import login_required, role_required from datetime import datetime +from global_actions import 
execute_pipeline import sqlite3 import io import os + def register_routes(bp): """Register all conssheets routes on the blueprint""" @@ -219,7 +221,103 @@ def register_routes(bp): header_fields=header_fields, detail_fields=detail_fields) + @bp.route('/admin/consumption-sheets//router') + @role_required('owner', 'admin') + def process_router(process_id): + """Configure IFTTT routing rules for a process""" + process = query_db('SELECT * FROM cons_processes WHERE id = ?', [process_id], one=True) + + if not process: + flash('Process not found', 'danger') + return redirect(url_for('conssheets.admin_processes')) + + # Get existing rules sorted by line number (10, 20...) + rules = query_db(''' + SELECT * FROM cons_process_router + WHERE process_id = ? + ORDER BY line_number ASC + ''', [process_id]) + + return render_template('conssheets/process_router.html', + process=process, + rules=rules) + @bp.route('/admin/consumption-sheets//router/add', methods=['POST']) + @role_required('owner', 'admin') + def add_router_rule(process_id): + """Add a new routing rule""" + process = query_db('SELECT * FROM cons_processes WHERE id = ?', [process_id], one=True) + if not process: + flash('Process not found', 'danger') + return redirect(url_for('conssheets.admin_processes')) + + line_number = request.form.get('line_number') + rule_name = request.form.get('rule_name') + match_pattern = request.form.get('match_pattern') + + # Basic validation + if not line_number or not rule_name or not match_pattern: + flash('All fields are required', 'danger') + return redirect(url_for('conssheets.process_router', process_id=process_id)) + + try: + execute_db(''' + INSERT INTO cons_process_router + (process_id, line_number, rule_name, match_pattern, actions_json, is_active) + VALUES (?, ?, ?, ?, '[]', 1) + ''', [process_id, line_number, rule_name, match_pattern]) + + flash(f'Rule {line_number} created successfully!', 'success') + except Exception as e: + flash(f'Error creating rule: {str(e)}', 'danger') 
+ + return redirect(url_for('conssheets.process_router', process_id=process_id)) + + + @bp.route('/admin/consumption-sheets//router//edit', methods=['GET', 'POST']) + @role_required('owner', 'admin') + def edit_router_rule(process_id, rule_id): + """Edit a specific routing rule and its logic actions""" + process = query_db('SELECT * FROM cons_processes WHERE id = ?', [process_id], one=True) + rule = query_db('SELECT * FROM cons_process_router WHERE id = ?', [rule_id], one=True) + + if not process or not rule: + flash('Rule not found', 'danger') + return redirect(url_for('conssheets.process_router', process_id=process_id)) + + # NEW: Fetch all active fields so we can use them in the Logic Editor dropdowns + fields = query_db(''' + SELECT * FROM cons_process_fields + WHERE process_id = ? AND is_active = 1 + ORDER BY table_type, sort_order + ''', [process_id]) + + if request.method == 'POST': + # 1. Update Basic Info + line_number = request.form.get('line_number') + rule_name = request.form.get('rule_name') + match_pattern = request.form.get('match_pattern') + + # 2. Update the Logic Chain (JSON) + # We get the raw JSON string from a hidden input we'll build next + actions_json = request.form.get('actions_json', '[]') + + try: + execute_db(''' + UPDATE cons_process_router + SET line_number = ?, rule_name = ?, match_pattern = ?, actions_json = ? + WHERE id = ? 
+ ''', [line_number, rule_name, match_pattern, actions_json, rule_id]) + + flash('Rule configuration saved!', 'success') + except Exception as e: + flash(f'Error saving rule: {str(e)}', 'danger') + + return redirect(url_for('conssheets.edit_router_rule', process_id=process_id, rule_id=rule_id)) + + return render_template('conssheets/edit_rule.html', process=process, rule=rule, fields=fields) + + @bp.route('/admin/consumption-sheets//fields') @role_required('owner', 'admin') def process_fields(process_id): @@ -693,136 +791,62 @@ def register_routes(bp): @bp.route('/session//scan', methods=['POST']) @login_required def scan_lot(session_id): - """Process a scan with duplicate detection using dynamic tables""" + from global_actions import execute_pipeline + import re + import json + + # 1. Setup Context & Get Session + # We need the process_key to know which table to save to sess = query_db(''' - SELECT cs.*, cp.process_key, cp.id as process_id - FROM cons_sessions cs - JOIN cons_processes cp ON cs.process_id = cp.id - WHERE cs.id = ? AND cs.status = 'active' + SELECT cs.*, cp.process_key, cp.id as process_id + FROM cons_sessions cs + JOIN cons_processes cp ON cs.process_id = cp.id + WHERE cs.id = ? ''', [session_id], one=True) - if not sess: - return jsonify({'success': False, 'message': 'Session not found or archived'}) + if not sess: + return jsonify({'success': False, 'message': 'Session invalid'}) + # 2. Get Data from Frontend data = request.get_json() - field_values = data.get('field_values', {}) # Dict of field_name: value - confirm_duplicate = data.get('confirm_duplicate', False) - check_only = data.get('check_only', False) + barcode = data.get('barcode', '').strip() - # Get the duplicate key field - dup_key_field = get_duplicate_key_field(sess['process_id']) + # 3. Find Matching Rule (The Routing) + matched_rule = None + if barcode: + rules = query_db('SELECT * FROM cons_process_router WHERE process_id = ? 
AND is_active = 1 ORDER BY line_number ASC', [sess['process_id']]) + for rule in rules: + try: + if re.search(rule['match_pattern'], barcode): + matched_rule = rule + break + except: continue - if not dup_key_field: - return jsonify({'success': False, 'message': 'No duplicate key field configured for this process'}) + if not matched_rule: + return jsonify({'success': False, 'message': f"❌ No rule matched: {barcode}"}) + + # 4. Execute Pipeline (The Processing) + context = { + 'table_name': f"cons_proc_{sess['process_key']}_details", + 'session_id': session_id, + 'user_id': session.get('user_id'), + + # CRITICAL FIXES: + # Pass the "Yes" flag so it doesn't ask about duplicates again + 'confirm_duplicate': data.get('confirm_duplicate', False), + + # Pass the "Weight" (or other inputs) so it doesn't open the form again + 'extra_data': data.get('field_values') or data.get('extra_data') + } - dup_key_value = field_values.get(dup_key_field['field_name'], '').strip() - - if not dup_key_value: - return jsonify({'success': False, 'message': f'{dup_key_field["field_label"]} is required'}) - - table_name = get_detail_table_name(sess['process_key']) - - # Check for duplicates in SAME session - same_session_dup = query_db(f''' - SELECT * FROM {table_name} - WHERE session_id = ? AND {dup_key_field['field_name']} = ? 
AND is_deleted = 0 - ''', [session_id, dup_key_value], one=True) - - # Check for duplicates in OTHER sessions (need to check all sessions of same process type) - other_session_dup = query_db(f''' - SELECT t.*, cs.id as other_session_id, cs.created_at as other_session_date, - u.full_name as other_user, - (SELECT field_value FROM cons_session_header_values - WHERE session_id = cs.id AND field_id = ( - SELECT id FROM cons_process_fields - WHERE process_id = cs.process_id AND field_name LIKE '%wo%' AND is_active = 1 LIMIT 1 - )) as other_wo - FROM {table_name} t - JOIN cons_sessions cs ON t.session_id = cs.id - JOIN Users u ON t.scanned_by = u.user_id - WHERE t.{dup_key_field['field_name']} = ? AND t.session_id != ? AND t.is_deleted = 0 - ORDER BY t.scanned_at DESC - LIMIT 1 - ''', [dup_key_value, session_id], one=True) - - duplicate_status = 'normal' - duplicate_info = None - needs_confirmation = False - - if same_session_dup: - duplicate_status = 'dup_same_session' - duplicate_info = 'Already scanned in this session' - needs_confirmation = True - elif other_session_dup: - duplicate_status = 'dup_other_session' - dup_date = other_session_dup['other_session_date'][:10] if other_session_dup['other_session_date'] else 'Unknown' - dup_user = other_session_dup['other_user'] or 'Unknown' - dup_wo = other_session_dup['other_wo'] or 'N/A' - duplicate_info = f"Previously scanned on {dup_date} by {dup_user} on WO {dup_wo}" - needs_confirmation = True - - # If just checking, return early - if check_only: - if needs_confirmation: - return jsonify({ - 'success': False, - 'needs_confirmation': True, - 'duplicate_status': duplicate_status, - 'duplicate_info': duplicate_info, - 'message': duplicate_info - }) - return jsonify({'success': True, 'needs_confirmation': False}) - - # If needs confirmation and not confirmed, ask user - if needs_confirmation and not confirm_duplicate: - return jsonify({ - 'success': False, - 'needs_confirmation': True, - 'duplicate_status': duplicate_status, 
- 'duplicate_info': duplicate_info, - 'message': duplicate_info - }) - - # Get all active detail fields for this process - detail_fields = query_db(''' - SELECT * FROM cons_process_fields - WHERE process_id = ? AND table_type = 'detail' AND is_active = 1 - ORDER BY sort_order, id - ''', [sess['process_id']]) - - # Build dynamic INSERT statement - field_names = ['session_id', 'scanned_by', 'duplicate_status', 'duplicate_info'] - field_placeholders = ['?', '?', '?', '?'] - values = [session_id, session['user_id'], duplicate_status, duplicate_info] - - for field in detail_fields: - field_names.append(field['field_name']) - field_placeholders.append('?') - values.append(field_values.get(field['field_name'], '')) - - insert_sql = f''' - INSERT INTO {table_name} ({', '.join(field_names)}) - VALUES ({', '.join(field_placeholders)}) - ''' - - detail_id = execute_db(insert_sql, values) - - # If this is a same-session duplicate, update the original scan too - updated_entry_ids = [] - if duplicate_status == 'dup_same_session' and same_session_dup: - execute_db(f''' - UPDATE {table_name} - SET duplicate_status = 'dup_same_session', duplicate_info = 'Duplicate' - WHERE id = ? - ''', [same_session_dup['id']]) - updated_entry_ids.append(same_session_dup['id']) - - return jsonify({ - 'success': True, - 'detail_id': detail_id, - 'duplicate_status': duplicate_status, - 'updated_entry_ids': updated_entry_ids - }) + try: + # The global engine handles Map, Clean, Duplicate, Input, and Save! 
+ actions = json.loads(matched_rule['actions_json']) + result = execute_pipeline(actions, barcode, context) + return jsonify(result) + + except Exception as e: + return jsonify({'success': False, 'message': f"System Error: {str(e)}"}) @bp.route('/session//detail/') diff --git a/modules/conssheets/templates/conssheets/edit_rule.html b/modules/conssheets/templates/conssheets/edit_rule.html new file mode 100644 index 0000000..ed21468 --- /dev/null +++ b/modules/conssheets/templates/conssheets/edit_rule.html @@ -0,0 +1,408 @@ +{% extends 'base.html' %} + +{% block content %} +
+[edit_rule.html markup was stripped in this excerpt. Recoverable content: a "Rule Configuration" page with two panels —
+ Trigger (IF): Line Number ("Execution order (e.g. 10)"), Rule Name, and Match Pattern inputs, with regex hints `.*` = Match All and `^\d{8}-.*` = Starts with 8 digits;
+ Action Pipeline (THEN): a visual chain running SCAN INPUT → configured action steps → END RULE, serialized into a hidden actions_json input on save.]
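As a rough illustration of what this editor's actions_json produces and how the engine in global_actions.py interprets it, here is a simplified standalone sketch of just the `map` and `clean` steps (the DB-backed `duplicate`/`input`/`save` stages are omitted):

```python
import json

def run_map_clean(actions, barcode):
    """Minimal re-implementation of the 'map' and 'clean' pipeline steps."""
    fields = {}
    for action in actions:
        if action["type"] == "map":
            # Positions are 1-based and inclusive, clamped to the barcode length
            start = int(action.get("start", 1)) - 1
            end = min(int(action.get("end", len(barcode))), len(barcode))
            if start < len(barcode):
                fields[action["field"]] = barcode[start:end]
        elif action["type"] == "clean":
            val = str(fields.get(action["field"], ""))
            if action["func"] == "TRIM":
                fields[action["field"]] = val.strip()
            elif action["func"] == "REMOVE_LEADING_ZEROS":
                fields[action["field"]] = val.lstrip("0")
    return fields

# A rule's actions_json might look like this (field name is a made-up example):
actions = json.loads(
    '[{"type": "map", "start": 1, "end": 8, "field": "lot_number"},'
    ' {"type": "clean", "field": "lot_number", "func": "REMOVE_LEADING_ZEROS"}]'
)
print(run_map_clean(actions, "00123456-XYZ"))  # {'lot_number': '123456'}
```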
+ + + + + + +{% endblock %} \ No newline at end of file diff --git a/modules/conssheets/templates/conssheets/process_detail.html b/modules/conssheets/templates/conssheets/process_detail.html index 2d61f4b..5c4e9a4 100644 --- a/modules/conssheets/templates/conssheets/process_detail.html +++ b/modules/conssheets/templates/conssheets/process_detail.html @@ -71,6 +71,25 @@ Configure Template + +
+[process_detail.html addition (markup stripped): a "Routing Rules" card — 🔀 icon, caption "Configure IFTTT logic and barcode parsing", badges "IFTTT" and "Logic Engine", and a "Configure Rules" link to the router page.]
diff --git a/modules/conssheets/templates/conssheets/process_router.html b/modules/conssheets/templates/conssheets/process_router.html new file mode 100644 index 0000000..b543a98 --- /dev/null +++ b/modules/conssheets/templates/conssheets/process_router.html @@ -0,0 +1,146 @@ +{% extends 'base.html' %} + +{% block content %} +
+[process_router.html markup was stripped in this excerpt. Recoverable content: a "Routing Rules — IFTTT Logic Engine" page with an "Active Rules" table (columns: Line, Rule Name, Match Pattern (Regex), Actions ("JSON Logic"), Status (Active/Archived), Options (Edit link)) and an empty state: "No routing rules defined yet. Rules allow you to auto-parse barcodes into multiple fields."]
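The first-match-wins routing this page configures — rules tried in ascending line_number, first regex hit wins — can be sketched as:

```python
import re

def find_matching_rule(rules, barcode):
    """Return the first rule whose match_pattern hits the barcode, or None.

    `rules` is assumed pre-sorted by line_number ASC, mirroring the
    ORDER BY in scan_lot(); an invalid pattern skips that rule rather
    than breaking the scan.
    """
    for rule in rules:
        try:
            if re.search(rule["match_pattern"], barcode):
                return rule
        except re.error:
            continue  # bad regex in one rule should not stop routing
    return None

# Hypothetical rules for illustration
rules = [
    {"line_number": 10, "rule_name": "Lot label", "match_pattern": r"^\d{8}-.*"},
    {"line_number": 20, "rule_name": "Catch-all", "match_pattern": r".*"},
]
find_matching_rule(rules, "12345678-A")["rule_name"]  # → 'Lot label'
find_matching_rule(rules, "MISC")["rule_name"]        # → 'Catch-all'
```

Catching `re.error` specifically, rather than a bare `except:` as in the route, keeps unrelated failures visible while still tolerating a typo'd pattern.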
+ + +{% endblock %} \ No newline at end of file diff --git a/modules/conssheets/templates/conssheets/scan_session.html b/modules/conssheets/templates/conssheets/scan_session.html index f3fb4cf..eeefa24 100644 --- a/modules/conssheets/templates/conssheets/scan_session.html +++ b/modules/conssheets/templates/conssheets/scan_session.html @@ -26,20 +26,23 @@ {% endif %} -
-[old scan card (markup stripped): heading "Scan {{ dup_key_field.field_label if dup_key_field else 'Item' }}" and its form]
+[new scan card (markup stripped): heading "🚀 Smart Router Scan" with a single barcode input posting to the router-driven scan endpoint]
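The scan endpoint now returns three non-success shapes the frontend has to handle: a hard block, a duplicate-confirmation request, and a request for manual input. A plain-Python stand-in for that client-side branching (the flag names follow execute_pipeline's return values; the action labels are ours):

```python
def next_ui_action(response):
    """Map a /session/<id>/scan JSON response to a UI action."""
    if response.get("success"):
        return "show_saved"
    if response.get("needs_confirmation"):
        return "ask_confirm_duplicate"   # re-POST with confirm_duplicate=True
    if response.get("needs_input"):
        return "open_details_form"       # re-POST with extra_data={...}
    return "show_error"                  # e.g. BLOCK mode, or no rule matched

next_ui_action({"success": False, "needs_input": True})  # → 'open_details_form'
```

The two re-POST paths matter: the route passes `confirm_duplicate` and `extra_data` back into the pipeline context so the duplicate prompt and the details form are each shown only once per scan.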
@@ -149,7 +149,7 @@ function archiveSession(sessionId, processName) { return; } - fetch(`/cons-sheets/session/${sessionId}/archive`, { + fetch(`/conssheets/session/${sessionId}/archive`, { method: 'POST', headers: {'Content-Type': 'application/json'} })