# BI-CCC Implementation Guide — Developer Reference

> **Audience**: Engineering Team
> **Date**: February 2026
> **Status**: Approved for implementation
> **Current Maturity**: 45/100 → **Target**: 85/100

---

## Table of Contents

1. [Overview & Current State](#1-overview--current-state)
2. [Phase 1 — Quick Wins (Weeks 1-4)](#2-phase-1--quick-wins-weeks-1-4)
3. [Phase 2 — Data Architecture (Weeks 5-10)](#3-phase-2--data-architecture-weeks-5-10)
4. [Phase 3 — Compliance & Predictive Analytics (Weeks 11-16)](#4-phase-3--compliance--predictive-analytics-weeks-11-16)
5. [Phase 4 — Platform Maturity (Weeks 17-24)](#5-phase-4--platform-maturity-weeks-17-24)
6. [New KPIs to Implement](#6-new-kpis-to-implement)
7. [Architecture Reference](#7-architecture-reference)
8. [Code Refactoring Guide](#8-code-refactoring-guide)
9. [Infrastructure & DevOps](#9-infrastructure--devops)
10. [Appendix — SQL Templates](#10-appendix--sql-templates)

---

## 1. Overview & Current State

### What We Have Today

```
Stack:  Node.js 20 + Express 4 + MySQL RDS (read-only) + SQLite + Chart.js
Deploy: Docker Compose + Netbird VPN
Auth:   bcrypt + express-session (3 roles: admin/corporate/agente)
```

### Critical Problems to Solve

| Problem | Risk Level | Impact |
|---------|-----------|--------|
| Direct queries against production RDS | **CRITICAL** | Performance degradation, no historical data |
| `queries.js` = 1,786-line monolith | **HIGH** | Unmaintainable, hard to test |
| `admin-bi.js` = 2,133 lines of inline HTML/CSS/JS | **HIGH** | No component reuse, can't unit test |
| No alerting system | **HIGH** | Issues discovered manually, delayed response |
| No compliance/AML monitoring | **CRITICAL** | Regulatory exposure for a forex platform |
| No ETL pipeline | **HIGH** | No snapshots, no transformation layer |
| No export/scheduled reports | **MEDIUM** | Manual report generation for stakeholders |

---

## 2. Phase 1 — Quick Wins (Weeks 1-4)

### 2.1 Alerting System

**Goal**: Automated notifications when KPIs breach thresholds.

**Stack**: `node-cron` + Slack Webhooks + Nodemailer

**File**: `src/alerts/alert-engine.js`

```js
// src/alerts/alert-engine.js
const cron = require('node-cron');
const { getPool } = require('../db-rds');
const { sendSlack, sendEmail } = require('./channels');

const ALERTS = [
  {
    name: 'volume-spike',
    cron: '*/30 * * * *', // every 30 min
    query: `
      SELECT COUNT(*) as current_count,
             -- 7 days = 336 thirty-minute windows, so divide by 336 (not 7)
             -- to get a baseline comparable to the 30-minute current count
             (SELECT COUNT(*) / 336 FROM br_transaction_to_usa
              WHERE created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)
                AND created_at < DATE_SUB(NOW(), INTERVAL 30 MINUTE)) as avg_count
      FROM br_transaction_to_usa
      WHERE created_at >= DATE_SUB(NOW(), INTERVAL 30 MINUTE)
    `,
    condition: (row) => row.current_count > row.avg_count * 1.5 || row.current_count < row.avg_count * 0.5,
    severity: 'P0',
    channels: ['slack', 'email'],
    message: (row) => `Volume Alert: ${row.current_count} txs in last 30min (avg: ${Math.round(row.avg_count)})`
  },
  {
    name: 'spread-anomaly',
    cron: '*/15 * * * *',
    query: `
      SELECT AVG(exchange_rate - ptax) as current_spread,
             (SELECT AVG(exchange_rate - ptax) FROM br_transaction_to_usa
              WHERE status = 'finalizado' AND created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)) as avg_spread,
             (SELECT STDDEV(exchange_rate - ptax) FROM br_transaction_to_usa
              WHERE status = 'finalizado' AND created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)) as stddev_spread
      FROM br_transaction_to_usa
      WHERE status = 'finalizado'
        AND created_at >= DATE_SUB(NOW(), INTERVAL 15 MINUTE)
      HAVING COUNT(*) > 0
    `,
    condition: (row) => Math.abs(row.current_spread - row.avg_spread) > 2 * row.stddev_spread,
    severity: 'P0',
    channels: ['slack'],
    message: (row) => `Spread Anomaly: Current ${row.current_spread.toFixed(4)} vs avg ${row.avg_spread.toFixed(4)} (±2σ = ${(row.stddev_spread * 2).toFixed(4)})`
  },
  {
    name: 'provider-down',
    cron: '0 */4 * * *', // every 4 hours
    query: `
      SELECT pm.provider,
             MAX(t.created_at) as last_tx,
             TIMESTAMPDIFF(HOUR, MAX(t.created_at), NOW()) as hours_since
      FROM br_payment_methods pm
      LEFT JOIN br_transaction_to_usa t
        ON t.payment_method_id = pm.id
       AND t.created_at >= DATE_SUB(NOW(), INTERVAL 24 HOUR)
      GROUP BY pm.provider
      HAVING hours_since > 4 OR last_tx IS NULL
    `,
    condition: (rows) => rows.length > 0,
    severity: 'P0',
    channels: ['slack', 'email'],
    message: (rows) => `Provider Down: ${rows.map(r => `${r.provider} (${r.hours_since}h)`).join(', ')}`
  },
  {
    name: 'large-transaction',
    cron: '*/5 * * * *',
    query: `
      SELECT t.id_transaction_to_usa, t.amount_usd, c.nome, t.created_at
      FROM br_transaction_to_usa t
      JOIN conta c ON c.id_conta = t.id_conta
      WHERE t.amount_usd > 50000
        AND t.created_at >= DATE_SUB(NOW(), INTERVAL 5 MINUTE)
        AND t.status = 'finalizado'
    `,
    condition: (rows) => rows.length > 0,
    severity: 'P1',
    channels: ['slack'],
    message: (rows) => `Large Tx: ${rows.map(r => `$${r.amount_usd.toLocaleString()} from ${r.nome}`).join('; ')}`
  },
  {
    name: 'failed-tx-spike',
    cron: '*/15 * * * *',
    query: `
      SELECT SUM(CASE WHEN status IN ('refunded','on-hold') THEN 1 ELSE 0 END) as failed,
             COUNT(*) as total,
             SUM(CASE WHEN status IN ('refunded','on-hold') THEN 1 ELSE 0 END) / COUNT(*) * 100 as fail_rate
      FROM br_transaction_to_usa
      WHERE created_at >= DATE_SUB(NOW(), INTERVAL 1 HOUR)
      HAVING total > 0
    `,
    condition: (row) => row.fail_rate > 5,
    severity: 'P0',
    channels: ['slack', 'email'],
    message: (row) => `Failed Tx Spike: ${row.fail_rate.toFixed(1)}% failure rate (${row.failed}/${row.total}) in last hour`
  }
];

// Scheduler — `initAlerts` is exported below but was never defined; minimal
// version: run each alert's query on its cron schedule and notify on breach.
// List-style alerts get the full row set; aggregate alerts get the first row.
const LIST_ALERTS = new Set(['provider-down', 'large-transaction']);

function initAlerts() {
  for (const alert of ALERTS) {
    cron.schedule(alert.cron, async () => {
      const [rows] = await getPool().query(alert.query);
      if (rows.length === 0) return;
      const input = LIST_ALERTS.has(alert.name) ? rows : rows[0];
      if (!alert.condition(input)) return;
      const text = alert.message(input);
      if (alert.channels.includes('slack')) await sendSlack(process.env.SLACK_WEBHOOK_URL, text, alert.severity);
      if (alert.channels.includes('email')) await sendEmail(process.env.ALERT_EMAIL_TO, text, alert.severity); // sendEmail(to, text, severity) assumed from ./channels
    });
  }
}

module.exports = { ALERTS, initAlerts };
```

The Slack sender belongs in `src/alerts/channels.js` (it was declared inline above, which clashes with the `sendSlack` import):

```js
// src/alerts/channels.js — Slack channel sender
async function sendSlack(webhook, message, severity) {
  const colors = { P0: '#ef4444', P1: '#f59e0b', P2: '#3b82f6' };
  await fetch(webhook, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      attachments: [{
        color: colors[severity] || '#94a3b8',
        title: `BI-CCC Alert [${severity}]`,
        text: message,
        ts: Math.floor(Date.now() / 1000)
      }]
    })
  });
}
```

**Env vars to add**:
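One behavior worth guarding against: the cron-driven alerts above re-evaluate on every tick, so a condition that stays true (e.g. a provider down all day) fires on every run. A small cooldown guard can suppress repeats; `shouldFire` and the cooldown values here are a hypothetical sketch, not existing code:

```javascript
// Hypothetical cooldown guard for the alert engine: returns false when an
// alert with the same name already fired within the last `cooldownMs`.
const lastFired = new Map();

function shouldFire(alertName, cooldownMs, now = Date.now()) {
  const last = lastFired.get(alertName);
  if (last !== undefined && now - last < cooldownMs) return false; // still cooling down
  lastFired.set(alertName, now); // record this firing
  return true;
}

module.exports = { shouldFire };
```

The guard only gates whether a notification is sent; the env vars listed next configure the Slack/email delivery channels themselves.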
```env
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/xxx/yyy/zzz
ALERT_EMAIL_TO=ops@cambioreal.com
SMTP_HOST=smtp.gmail.com
SMTP_USER=alerts@cambioreal.com
SMTP_PASS=app-password
```

### 2.2 PDF/Excel Export

**Goal**: Add export buttons to all dashboards.

**Stack**: Already have `puppeteer` and `exceljs` in querycambio — move to a shared dependency.

**File**: `src/export/report-generator.js`

```js
// src/export/report-generator.js
const puppeteer = require('puppeteer');
const ExcelJS = require('exceljs');

async function exportDashboardToPDF(url, sessionCookie) {
  const browser = await puppeteer.launch({ headless: true, args: ['--no-sandbox'] });
  const page = await browser.newPage();
  await page.setCookie({ name: 'connect.sid', value: sessionCookie, domain: 'localhost' });
  await page.goto(url, { waitUntil: 'networkidle0' });

  // Hide nav and header before printing
  await page.evaluate(() => {
    document.querySelector('.console-nav')?.remove();
    document.querySelector('header')?.remove();
  });

  const pdf = await page.pdf({
    format: 'A4',
    landscape: true,
    printBackground: true,
    margin: { top: '1cm', bottom: '1cm', left: '1cm', right: '1cm' }
  });
  await browser.close();
  return pdf;
}

async function exportToExcel(data, columns, sheetName = 'Report') {
  const workbook = new ExcelJS.Workbook();
  const sheet = workbook.addWorksheet(sheetName);

  sheet.columns = columns.map(col => ({
    header: col.label,
    key: col.key,
    width: col.width || 15,
    style: col.style || {}
  }));

  // Header styling — single assignment (the original set .font twice, so the
  // bold/size from the first write was silently overwritten); ExcelJS expects
  // 8-digit ARGB values
  sheet.getRow(1).font = { bold: true, size: 11, color: { argb: 'FFFFFFFF' } };
  sheet.getRow(1).fill = { type: 'pattern', pattern: 'solid', fgColor: { argb: 'FF7600BE' } };

  data.forEach(row => sheet.addRow(row));
  return workbook;
}

module.exports = { exportDashboardToPDF, exportToExcel };
```

**API endpoints to add in `server.js`**:

```js
// Export endpoints
app.get('/admin/api/export/bi-pdf', requireRole('admin'), async (req, res) => {
  const { start, end } = req.query;
  const
  pdf = await exportDashboardToPDF(
    `http://localhost:${PORT}/admin/bi?start=${start}&end=${end}`,
    req.cookies['connect.sid']
  );
  res.setHeader('Content-Type', 'application/pdf');
  res.setHeader('Content-Disposition', `attachment; filename=BI-Report-${start}-${end}.pdf`);
  res.send(pdf);
});

app.get('/admin/api/export/clients-excel', requireRole('admin'), async (req, res) => {
  const data = await queries.getTopClients();
  const workbook = await exportToExcel(data, [
    { key: 'nome', label: 'Client', width: 30 },
    { key: 'volume_usd', label: 'Volume (USD)', width: 18 },
    { key: 'transactions', label: 'Transactions', width: 15 },
    { key: 'revenue', label: 'Revenue (USD)', width: 18 },
    { key: 'health_score', label: 'Health Score', width: 15 }
  ], 'Top Clients');
  res.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
  res.setHeader('Content-Disposition', 'attachment; filename=top-clients.xlsx');
  await workbook.xlsx.write(res);
  res.end(); // close the response stream after the workbook is written
});
```

### 2.3 Provider Performance Dashboard

**Goal**: Compare all payin/payout providers in one view.

**File**: `src/admin-providers.js`
**Query**: `src/queries/provider-queries.js`

```sql
-- Provider performance (payin)
SELECT
  pm.provider,
  COUNT(*) as total_transactions,
  SUM(CASE WHEN t.status = 'finalizado' THEN 1 ELSE 0 END) as completed,
  SUM(CASE WHEN t.status = 'refunded' THEN 1 ELSE 0 END) as refunded,
  ROUND(SUM(CASE WHEN t.status = 'finalizado' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as success_rate,
  ROUND(SUM(t.amount_usd), 2) as total_volume_usd,
  ROUND(AVG(t.amount_usd), 2) as avg_ticket,
  ROUND(AVG(TIMESTAMPDIFF(HOUR, t.created_at, t.date_received_br)), 1) as avg_settlement_hours,
  ROUND(AVG(t.exchange_rate - t.ptax), 4) as avg_spread
FROM br_transaction_to_usa t
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.created_at BETWEEN ? AND ?
GROUP BY pm.provider
ORDER BY total_volume_usd DESC;

-- Provider performance (payout)
SELECT
  ex.provider,
  COUNT(et.id) as total_transactions,
  SUM(CASE WHEN et.status = 'completed' THEN 1 ELSE 0 END) as completed,
  ROUND(SUM(CASE WHEN et.status = 'completed' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as success_rate,
  ROUND(SUM(et.amount_brl), 2) as total_volume_brl
FROM pagamento_br_export_transactions et
JOIN pagamento_br_exports ex ON ex.id = et.export_id
WHERE et.created_at BETWEEN ? AND ?
GROUP BY ex.provider
ORDER BY total_volume_brl DESC;
```

### 2.4 Failed Transaction Analysis

**File**: `src/admin-failed-transactions.js`

```sql
-- Failed transactions breakdown
SELECT
  t.status,
  pm.provider,
  COUNT(*) as count,
  ROUND(SUM(t.amount_usd), 2) as volume_lost,
  DATE(t.created_at) as date
FROM br_transaction_to_usa t
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.status IN ('refunded', 'on-hold')
  AND t.created_at BETWEEN ? AND ?
GROUP BY t.status, pm.provider, DATE(t.created_at)
ORDER BY date DESC, count DESC;
```

### 2.5 New KPIs for Corporate Dashboard

Add these to `GET /corporate/api/kpis`:

```sql
-- Transaction Success Rate
SELECT ROUND(SUM(CASE WHEN status = 'finalizado' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as success_rate
FROM br_transaction_to_usa
WHERE created_at BETWEEN ? AND ?;

-- Refund Rate
SELECT ROUND(SUM(CASE WHEN status = 'refunded' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as refund_rate
FROM br_transaction_to_usa
WHERE created_at BETWEEN ? AND ?;

-- Gross Take Rate = Total Revenue / Total Volume
-- Use the existing revenue calculation and divide by total volume
```

---

## 3. Phase 2 — Data Architecture (Weeks 5-10)

### 3.1 Analytics Database Setup

**Recommended**: ClickHouse (best price/performance for analytical queries)
**Alternative**: TimescaleDB on PostgreSQL (easier migration path)

#### Option A: ClickHouse

```yaml
# docker-compose.yml addition
services:
  clickhouse:
    image: clickhouse/clickhouse-server:24.1
    container_name: bi-clickhouse
    ports:
      - "8123:8123"  # HTTP
      - "9000:9000"  # Native
    volumes:
      - clickhouse-data:/var/lib/clickhouse
    environment:
      CLICKHOUSE_DB: bi_analytics
      CLICKHOUSE_USER: bi_user
      CLICKHOUSE_PASSWORD: ${CLICKHOUSE_PASSWORD}
    ulimits:
      nofile:
        soft: 262144
        hard: 262144

volumes:
  clickhouse-data:
```

#### ClickHouse Schema

```sql
-- Transactions fact table (payin)
CREATE TABLE bi_analytics.fact_payin (
    id UInt64,
    created_at DateTime,
    date_received DateTime,
    id_conta UInt64,
    client_name String,
    agente_id Nullable(UInt32),
    amount_brl Decimal(18,2),
    amount_usd Decimal(18,2),
    exchange_rate Decimal(12,6),
    ptax Decimal(12,6),
    spread Decimal(12,6),
    fee Decimal(12,2),
    iof Decimal(12,2),
    pfee Decimal(12,2),
    bonus_valor Decimal(12,2),
    taxa_cr Decimal(12,2),
    revenue Decimal(12,2),
    provider String,
    status String,
    tipo_envio String,
    tipo_pagamento String,
    is_checkout UInt8,
    merchant_id Nullable(UInt64),
    merchant_name Nullable(String),
    -- Calculated fields
    spread_pct Decimal(8,4),
    settlement_hours Nullable(Float32)
) ENGINE = MergeTree()
ORDER BY (created_at, id_conta)
PARTITION BY toYYYYMM(created_at);

-- Transactions fact table (payout)
CREATE TABLE bi_analytics.fact_payout (
    id UInt64,
    created_at DateTime,
    id_conta UInt64,
    sender_conta_id Nullable(UInt64),
    client_name String,
    valor_usd Decimal(18,2),
    valor_brl Decimal(18,2),
    ptax Decimal(12,6),
    cotacao Decimal(12,6),
    revenue Decimal(12,2),
    tipo_envio String,
    provider String,
    status String
) ENGINE = MergeTree()
ORDER BY (created_at, id_conta)
PARTITION BY toYYYYMM(created_at);

-- Daily snapshots (materialized)
CREATE TABLE bi_analytics.daily_metrics (
    date Date,
    product String,
    -- 'cambioPay', 'cambioPayReverso', 'cambioCheckout'
    total_transactions UInt32,
    completed_transactions UInt32,
    failed_transactions UInt32,
    total_volume_usd Decimal(18,2),
    total_volume_brl Decimal(18,2),
    total_revenue Decimal(18,2),
    avg_spread Decimal(8,4),
    avg_ticket_usd Decimal(12,2),
    unique_clients UInt32,
    new_clients UInt32,
    avg_settlement_hours Nullable(Float32)
) ENGINE = MergeTree()
ORDER BY (date, product);

-- Client health snapshots
CREATE TABLE bi_analytics.client_health_daily (
    date Date,
    id_conta UInt64,
    client_name String,
    health_score UInt8,
    churn_risk String,  -- 'low','medium','high','critical'
    lifetime_volume Decimal(18,2),
    lifetime_revenue Decimal(18,2),
    lifetime_transactions UInt32,
    last_transaction_date Nullable(Date),
    days_since_last_tx UInt16,
    monthly_velocity Decimal(12,2)
) ENGINE = MergeTree()
ORDER BY (date, id_conta)
PARTITION BY toYYYYMM(date);
```

### 3.2 ETL Pipeline

**File**: `src/etl/pipeline.js`

```js
// src/etl/pipeline.js
const cron = require('node-cron');
const mysql = require('../db-rds');
const clickhouse = require('../db-clickhouse');
const logger = require('../etl/logger');

const BATCH_SIZE = 5000;

class ETLPipeline {
  // Run every hour — sync new transactions
  async syncPayinTransactions() {
    const jobId = logger.startJob('sync-payin');
    try {
      // Get last synced ID
      const [lastRow] = await clickhouse.query(
        'SELECT max(id) as last_id FROM bi_analytics.fact_payin'
      );
      const lastId = lastRow?.last_id || 0;

      // Fetch new records from MySQL
      const [rows] = await mysql.query(`
        SELECT
          t.id_transaction_to_usa as id,
          t.created_at,
          t.date_received_br as date_received,
          t.id_conta,
          c.nome as client_name,
          ac.agente_id,
          t.amount_brl,
          t.amount_usd,
          t.exchange_rate,
          t.ptax,
          (t.exchange_rate - t.ptax) as spread,
          t.fee,
          t.iof,
          COALESCE(t.pfee, 0) as pfee,
          COALESCE(t.bonus_valor, 0) as bonus_valor,
          COALESCE(t.taxa_cr, 0) as taxa_cr,
          pm.provider,
          t.status,
          t.tipo_envio,
          t.tipo_pagamento,
          CASE WHEN t.cobranca_id IS NOT NULL THEN 1 ELSE 0 END as
            is_checkout,
          e.id as merchant_id,
          e.nome_empresa as merchant_name
        FROM br_transaction_to_usa t
        JOIN conta c ON c.id_conta = t.id_conta
        LEFT JOIN ag_contas ac ON ac.conta_id = t.id_conta
        LEFT JOIN br_payment_methods pm ON pm.id = t.payment_method_id
        LEFT JOIN br_cb_cobranca cb ON cb.id = t.cobranca_id
        LEFT JOIN br_cb_empresas e ON e.id = cb.empresa_id
        WHERE t.id_transaction_to_usa > ?
        ORDER BY t.id_transaction_to_usa
        LIMIT ?
      `, [lastId, BATCH_SIZE]);

      if (rows.length === 0) {
        logger.endJob(jobId, 'success', 0);
        return;
      }

      // Calculate revenue for each row
      const enriched = rows.map(row => ({
        ...row,
        revenue: calculatePayinRevenue(row),
        spread_pct: row.ptax > 0
          ? ((row.exchange_rate - row.ptax) / row.exchange_rate * 100)
          : 0,
        settlement_hours: row.date_received
          ? (new Date(row.date_received) - new Date(row.created_at)) / 3600000
          : null
      }));

      // Insert into ClickHouse
      await clickhouse.insert('bi_analytics.fact_payin', enriched);
      logger.endJob(jobId, 'success', rows.length);
    } catch (err) {
      logger.endJob(jobId, 'error', 0, err.message);
      throw err;
    }
  }

  // Run daily at 1 AM — build daily aggregates
  async buildDailyMetrics(date) {
    const jobId = logger.startJob('daily-metrics');
    const targetDate = date || new Date(Date.now() - 86400000).toISOString().split('T')[0];

    await clickhouse.query(`
      INSERT INTO bi_analytics.daily_metrics
      SELECT
        toDate(created_at) as date,
        CASE WHEN is_checkout = 1 THEN 'cambioCheckout' ELSE 'cambioPay' END as product,
        count() as total_transactions,
        countIf(status = 'finalizado') as completed_transactions,
        countIf(status IN ('refunded', 'on-hold')) as failed_transactions,
        sum(amount_usd) as total_volume_usd,
        sum(amount_brl) as total_volume_brl,
        sum(revenue) as total_revenue,
        avg(spread_pct) as avg_spread,
        avg(amount_usd) as avg_ticket_usd,
        uniq(id_conta) as unique_clients,
        uniqIf(id_conta, /* first tx logic */) as new_clients,
        avg(settlement_hours) as avg_settlement_hours
      FROM bi_analytics.fact_payin
      WHERE toDate(created_at) = '${targetDate}'
      GROUP BY
        date, product
    `);
    logger.endJob(jobId, 'success');
  }
}

// Revenue calculation (mirrors existing logic)
function calculatePayinRevenue(row) {
  const noFeeProviders = ['ouribank', 'bs2'];
  const feeAjustado = noFeeProviders.includes(row.provider) ? 0 : (row.fee || 0);
  if (row.ptax <= 0) return 0;
  const revenue = (row.amount_brl - feeAjustado) / row.ptax
    - (row.pfee || 0)
    - (row.amount_usd + (row.bonus_valor || 0) - (row.taxa_cr || 0));
  return Math.round(revenue * 100) / 100;
}

module.exports = ETLPipeline;
```

**Cron schedule** (add to `server.js` or a separate worker):

```js
const ETLPipeline = require('./src/etl/pipeline');
const etl = new ETLPipeline();

// Hourly: sync new transactions
cron.schedule('0 * * * *', () => etl.syncPayinTransactions());
cron.schedule('5 * * * *', () => etl.syncPayoutTransactions());

// Daily at 1 AM: build aggregates
cron.schedule('0 1 * * *', () => etl.buildDailyMetrics());

// Daily at 2 AM: build client health snapshots
cron.schedule('0 2 * * *', () => etl.buildClientHealthSnapshots());
```

### 3.3 Data Quality Monitoring

**File**: `src/etl/data-quality.js`

```js
// Run after every ETL job
// expectedRevenue: yesterday's total revenue computed from the source system
// (it was referenced below without being defined anywhere)
async function validateDataQuality(expectedRevenue) {
  const checks = [];

  // 1. Row count comparison
  const [mysqlCount] = await mysql.query(
    'SELECT COUNT(*) as cnt FROM br_transaction_to_usa WHERE status = "finalizado"'
  );
  const [chCount] = await clickhouse.query(
    "SELECT count() as cnt FROM bi_analytics.fact_payin WHERE status = 'finalizado'"
  );
  checks.push({
    name: 'row-count-payin',
    expected: mysqlCount.cnt,
    actual: chCount.cnt,
    pass: Math.abs(mysqlCount.cnt - chCount.cnt) < 100 // tolerance for in-flight rows
  });

  // 2.
  // Null checks on critical fields
  const [nullCheck] = await clickhouse.query(`
    SELECT
      countIf(ptax = 0 OR ptax IS NULL) as null_ptax,
      countIf(amount_brl = 0) as zero_amount,
      countIf(provider = '') as empty_provider
    FROM bi_analytics.fact_payin
    WHERE created_at >= today() - 1
  `);
  checks.push({
    name: 'null-critical-fields',
    pass: nullCheck.null_ptax === 0 && nullCheck.zero_amount === 0 && nullCheck.empty_provider === 0
  });

  // 3. Revenue reconciliation (expectedRevenue supplied by the caller)
  const [revCheck] = await clickhouse.query(`
    SELECT abs(sum(revenue) - ${expectedRevenue}) / ${expectedRevenue} * 100 as deviation_pct
    FROM bi_analytics.fact_payin
    WHERE toDate(created_at) = yesterday()
  `);
  checks.push({
    name: 'revenue-reconciliation',
    pass: revCheck.deviation_pct < 1 // less than 1% deviation
  });

  return checks;
}
```

---

## 4. Phase 3 — Compliance & Predictive Analytics (Weeks 11-16)

### 4.1 AML Transaction Monitoring

**File**: `src/compliance/aml-monitor.js`
**New route**: `GET /admin/compliance`

```sql
-- Threshold monitoring: transactions > $3,000
SELECT c.nome, t.id_conta, t.amount_usd, t.created_at, pm.provider
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.amount_usd > 3000
  AND t.status = 'finalizado'
  AND t.created_at BETWEEN ? AND ?
ORDER BY t.amount_usd DESC;

-- Structuring detection: multiple transactions just below the threshold, per client per day
SELECT
  c.nome,
  t.id_conta,
  DATE(t.created_at) as tx_date,
  COUNT(*) as tx_count,
  SUM(t.amount_usd) as daily_total,
  AVG(t.amount_usd) as avg_amount,
  MIN(t.amount_usd) as min_amount,
  MAX(t.amount_usd) as max_amount
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
WHERE t.status = 'finalizado'
  AND t.amount_usd BETWEEN 2500 AND 3000
  AND t.created_at BETWEEN ? AND ?
GROUP BY t.id_conta, c.nome, DATE(t.created_at)
HAVING tx_count >= 2
ORDER BY daily_total DESC;

-- Unusual velocity: clients with a transaction-frequency spike
SELECT
  c.nome,
  t.id_conta,
  COUNT(*) as recent_count,
  (SELECT COUNT(*) FROM br_transaction_to_usa t2
   WHERE t2.id_conta = t.id_conta
     AND t2.created_at BETWEEN DATE_SUB(?, INTERVAL 30 DAY) AND ?) / 30 as daily_avg,
  SUM(t.amount_usd) as recent_volume
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
WHERE t.status = 'finalizado'
  AND t.created_at BETWEEN ? AND ?
GROUP BY t.id_conta, c.nome
HAVING recent_count > daily_avg * 3
ORDER BY recent_count DESC;
```

### 4.2 IOF Reconciliation Dashboard

```sql
-- IOF verification: compare charged vs expected
SELECT
  DATE(t.created_at) as date,
  pm.provider,
  COUNT(*) as transactions,
  SUM(t.iof) as total_iof_charged,
  SUM(t.amount_usd * t.exchange_rate * 0.0038) as expected_iof_standard,
  SUM(t.iof) - SUM(t.amount_usd * t.exchange_rate * 0.0038) as iof_delta,
  ROUND(
    ABS(SUM(t.iof) - SUM(t.amount_usd * t.exchange_rate * 0.0038))
      / NULLIF(SUM(t.amount_usd * t.exchange_rate * 0.0038), 0) * 100,
    2
  ) as delta_pct
FROM br_transaction_to_usa t
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.status = 'finalizado'
  AND t.created_at BETWEEN ? AND ?
GROUP BY DATE(t.created_at), pm.provider
HAVING delta_pct > 0.5
ORDER BY date DESC;
```

### 4.3 Volume Forecasting

**Stack**: Python microservice with Prophet, or Node.js with `timeseries-analysis`

**File**: `src/analytics/forecast.js` (Node approach)

```js
// Simple exponential smoothing for Node.js
// For production, consider a Python sidecar with Prophet
function exponentialSmoothing(data, alpha = 0.3) {
  const result = [data[0]];
  for (let i = 1; i < data.length; i++) {
    result.push(alpha * data[i] + (1 - alpha) * result[i - 1]);
  }
  return result;
}

function forecast(historicalData, periodsAhead = 30) {
  // historicalData = [{ date, value }, ...]
  const values = historicalData.map(d => d.value);
  const smoothed = exponentialSmoothing(values);

  // Simple trend projection: per-period slope over the last 7 points
  // (7 points span 6 intervals, hence the length - 1 divisor)
  const recent = smoothed.slice(-7);
  const trend = (recent[recent.length - 1] - recent[0]) / (recent.length - 1);
  const lastValue = smoothed[smoothed.length - 1];

  const predictions = [];
  for (let i = 1; i <= periodsAhead; i++) {
    const lastDate = new Date(historicalData[historicalData.length - 1].date);
    lastDate.setDate(lastDate.getDate() + i);
    predictions.push({
      date: lastDate.toISOString().split('T')[0],
      predicted: Math.max(0, lastValue + trend * i),
      lower: Math.max(0, lastValue + trend * i - lastValue * 0.15),
      upper: lastValue + trend * i + lastValue * 0.15
    });
  }
  return predictions;
}

module.exports = { forecast };
```

**API endpoint**:

```js
app.get('/admin/api/bi/forecast', requireRole('admin'), async (req, res) => {
  const { metric = 'volume', days = 30 } = req.query;

  // Get 90 days of historical data from the analytics DB
  const historical = await clickhouse.query(`
    SELECT date, sum(total_volume_usd) as value
    FROM bi_analytics.daily_metrics
    WHERE date >= today() - 90
    GROUP BY date
    ORDER BY date
  `);

  const predictions = forecast(historical, parseInt(days, 10));
  res.json({ historical, predictions });
});
```

### 4.4 Churn Prediction Model

Evolve the static Health Score into a weighted model:

```js
function predictChurnRisk(client) {
  const weights = {
    recency: 0.30,    // days since last transaction
    frequency: 0.25,  // transactions-per-month trend
    monetary: 0.20,   // volume trend (growing/declining)
    velocity: 0.15,   // rate of change in transaction frequency
    engagement: 0.10  // product diversity, ticket variation
  };

  // Recency score (0-100, higher = less risk)
  const recencyScore = Math.max(0, 100 - (client.days_since_last_tx * 2));

  // Frequency score: compare last 30d vs previous 30d
  const freqRatio = client.tx_count_30d / Math.max(client.tx_count_prev_30d, 1);
  const frequencyScore = Math.min(100, freqRatio * 50);

  // Monetary score: volume growth
  const volRatio =
    client.volume_30d / Math.max(client.volume_prev_30d, 1);
  const monetaryScore = Math.min(100, volRatio * 50);

  // Velocity: acceleration of transaction frequency
  const velocityScore = client.tx_acceleration > 0
    ? 70 + client.tx_acceleration * 10
    : 30;

  // Engagement: uses multiple products?
  const engagementScore = client.products_used * 33;

  const totalScore =
    recencyScore * weights.recency +
    frequencyScore * weights.frequency +
    monetaryScore * weights.monetary +
    Math.min(100, velocityScore) * weights.velocity +
    Math.min(100, engagementScore) * weights.engagement;

  return {
    score: Math.round(totalScore),
    risk: totalScore < 25 ? 'critical'
        : totalScore < 45 ? 'high'
        : totalScore < 65 ? 'medium'
        : 'low',
    factors: {
      recency: { score: recencyScore, weight: weights.recency },
      frequency: { score: frequencyScore, weight: weights.frequency },
      monetary: { score: monetaryScore, weight: weights.monetary },
      velocity: { score: Math.min(100, velocityScore), weight: weights.velocity },
      engagement: { score: Math.min(100, engagementScore), weight: weights.engagement }
    }
  };
}
```

---

## 5. Phase 4 — Platform Maturity (Weeks 17-24)

### 5.1 Scheduled Report Generation

**File**: `src/reports/scheduler.js`

```js
const cron = require('node-cron');
const { exportDashboardToPDF, exportToExcel } = require('../export/report-generator');
const { sendEmail } = require('../alerts/channels');

const REPORT_SCHEDULE = [
  {
    name: 'Daily Executive Digest',
    cron: '0 8 * * 1-5', // 8 AM weekdays
    recipients: ['exec@cambioreal.com'],
    generator: async () => {
      const data = await queries.getDailyDigest();
      return {
        subject: `BI Daily Digest — ${new Date().toLocaleDateString()}`,
        html: buildDigestHTML(data),
        attachments: [] // inline, no PDF
      };
    }
  },
  {
    name: 'Weekly Board Summary',
    cron: '0 9 * * 1', // Monday 9 AM
    recipients: ['board@cambioreal.com'],
    generator: async () => {
      const end = new Date().toISOString().split('T')[0];
      const start = new Date(Date.now() - 7 * 86400000).toISOString().split('T')[0];
      const pdf = await exportDashboardToPDF(
        `http://localhost:3080/admin/bi?start=${start}&end=${end}`,
        // exportDashboardToPDF needs a session cookie; a machine-account
        // cookie via env var (not defined elsewhere in this guide) is one option
        process.env.SERVICE_SESSION_COOKIE
      );
      return {
        subject: `BI Weekly Report — Week of ${start}`,
        html: 'Weekly BI report attached.',
        attachments: [{ filename: `BI-Weekly-${start}.pdf`, content: pdf }]
      };
    }
  },
  {
    name: 'Monthly Agent Commissions',
    cron: '0 10 1 * *', // 1st of the month, 10 AM
    recipients: ['finance@cambioreal.com'],
    generator: async () => {
      const data = await queries.getMonthlyCommissions();
      const workbook = await exportToExcel(data, commissionColumns, 'Commissions');
      const buffer = await workbook.xlsx.writeBuffer();
      return {
        subject: `Agent Commissions — ${new Date().toLocaleDateString('en', { month: 'long', year: 'numeric' })}`,
        html: 'Monthly commission report attached.',
        attachments: [{ filename: 'commissions.xlsx', content: buffer }]
      };
    }
  }
];
```

### 5.2 Agent Commission Integration

Move commission calculation from the separate scripts (`comissao-valorfx/`, `comissao-d4u/`) into BI-CCC:

**New route**: `GET /admin/commissions`

**Features**:
- Auto-calculate commissions per agent per period
- Approval workflow (admin reviews → approves → generates Excel)
- Commission dispute tracking
- Agent ROI: commission paid vs. revenue generated by the agent's clients

```js
// src/queries/commission-queries.js

// ValorFx (Agent 76) — 50/50 split (rateio)
async function calculateValorFxCommission(start, end) {
  const [rows] = await pool.query(`
    SELECT
      t.id_transaction_to_usa,
      t.created_at,
      c.nome as client_name,
      t.amount_brl,
      t.amount_usd,
      t.exchange_rate,
      t.ptax,
      t.fee,
      t.iof,
      t.pfee,
      t.bonus_valor,
      t.taxa_cr,
      pm.provider
    FROM br_transaction_to_usa t
    JOIN conta c ON c.id_conta = t.id_conta
    JOIN ag_contas ac ON ac.conta_id = t.id_conta
    JOIN br_payment_methods pm ON pm.id = t.payment_method_id
    WHERE ac.agente_id = 76
      AND t.status = 'finalizado'
      AND t.created_at BETWEEN ? AND ?
    ORDER BY t.created_at
  `, [start, end]);

  return rows.map(row => {
    const noFeeProviders = ['ouribank', 'bs2'];
    const feeAdj = noFeeProviders.includes(row.provider) ? 0 : (row.fee || 0);
    const spreadLiqUsd = (row.amount_brl - feeAdj) / row.ptax
      - row.pfee - row.amount_usd + row.bonus_valor - row.taxa_cr;
    const iofEficiencia = row.provider === 'RIPPLE' ? row.iof : row.iof * 0.425;
    const rateio = (iofEficiencia + spreadLiqUsd) / 2;
    return { ...row, spreadLiqUsd, iofEficiencia, rateio };
  });
}

// D4U — 20% of the spread
async function calculateD4UCommission(start, end) {
  const d4uContas = [257174, 333262, 349175, 352427];
  // ...
  // similar structure, 20% of revenue
}
```

### 5.3 Frontend Migration Path

**Recommended approach**: Gradual migration using htmx + Alpine.js

#### Step 1: Extract UI Template Components

```
src/
├── components/
│   ├── kpi-card.js       // buildKPICard(title, value, change, color)
│   ├── chart-wrapper.js  // buildChartWrapper(id, type, config)
│   ├── data-table.js     // buildDataTable(columns, data, options)
│   ├── filter-bar.js     // buildFilterBar(filters)
│   └── layout.js         // buildPageLayout(title, nav, content)
```

#### Step 2: Add htmx for Progressive Enhancement

```html
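<!-- Sketch only (assumed markup): the original example was truncated here.
     The fragment endpoint (/admin/api/bi/kpis returning rendered HTML) and
     element ids are illustrative, not existing code. Changing a date input
     re-fetches just the KPI cards — no full-page reload, no inline JS. -->
<div class="filter-bar">
  <input type="date" name="start"
         hx-get="/admin/api/bi/kpis"
         hx-include="[name='end']"
         hx-target="#kpi-cards"
         hx-trigger="change">
  <input type="date" name="end"
         hx-get="/admin/api/bi/kpis"
         hx-include="[name='start']"
         hx-target="#kpi-cards"
         hx-trigger="change">
</div>

<div id="kpi-cards">
  <!-- the server renders KPI cards (e.g. via buildKPICard) into this target -->
</div>
```

Under this pattern, one option is for the route to return a server-rendered fragment instead of JSON, so the existing Chart.js pages can be migrated one panel at a time.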