BI-CCC Implementation Guide — Developer Reference
Audience: Engineering Team
Date: February 2026
Status: Approved for implementation
Current Maturity: 45/100 → Target: 85/100
Table of Contents
- Overview & Current State
- Phase 1 — Quick Wins (Weeks 1-4)
- Phase 2 — Data Architecture (Weeks 5-10)
- Phase 3 — Compliance & Predictive Analytics (Weeks 11-16)
- Phase 4 — Platform Maturity (Weeks 17-24)
- New KPIs to Implement
- Architecture Reference
- Code Refactoring Guide
- Infrastructure & DevOps
- Appendix — SQL Templates
1. Overview & Current State
What We Have Today
Stack: Node.js 20 + Express 4 + MySQL RDS (read-only) + SQLite + Chart.js
Deploy: Docker Compose + Netbird VPN
Auth: bcrypt + express-session (3 roles: admin/corporate/agente)
Critical Problems to Solve
| Problem | Risk Level | Impact |
|---|---|---|
| Direct queries against production RDS | CRITICAL | Performance degradation, no historical data |
| queries.js = 1,786-line monolith | HIGH | Unmaintainable, hard to test |
| admin-bi.js = 2,133 lines of inline HTML/CSS/JS | HIGH | No component reuse, can't unit test |
| No alerting system | HIGH | Issues discovered manually, delayed response |
| No compliance/AML monitoring | CRITICAL | Regulatory exposure for forex platform |
| No ETL pipeline | HIGH | No snapshots, no transformation layer |
| No export/scheduled reports | MEDIUM | Manual report generation for stakeholders |
2. Phase 1 — Quick Wins (Weeks 1-4)
2.1 Alerting System
Goal: Automated notifications when KPIs breach thresholds.
Stack: node-cron + Slack Webhooks + Nodemailer
File: src/alerts/alert-engine.js
// src/alerts/alert-engine.js
const cron = require('node-cron');
const { getPool } = require('../db-rds');
const { sendEmail } = require('./channels'); // sendSlack is defined locally below
const ALERTS = [
{
name: 'volume-spike',
cron: '*/30 * * * *', // every 30 min
query: `
SELECT
COUNT(*) as current_count,
(SELECT COUNT(*) / 336 FROM br_transaction_to_usa -- 336 = number of 30-min windows in 7 days
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)
AND created_at < DATE_SUB(NOW(), INTERVAL 30 MINUTE)) as avg_count
FROM br_transaction_to_usa
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 30 MINUTE)
`,
condition: (row) => row.current_count > row.avg_count * 1.5 || row.current_count < row.avg_count * 0.5,
severity: 'P0',
channels: ['slack', 'email'],
message: (row) => `Volume Alert: ${row.current_count} txs in last 30min (avg: ${Math.round(row.avg_count)})`
},
{
name: 'spread-anomaly',
cron: '*/15 * * * *',
query: `
SELECT
AVG(exchange_rate - ptax) as current_spread,
(SELECT AVG(exchange_rate - ptax) FROM br_transaction_to_usa
WHERE status = 'finalizado'
AND created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)) as avg_spread,
(SELECT STDDEV(exchange_rate - ptax) FROM br_transaction_to_usa
WHERE status = 'finalizado'
AND created_at >= DATE_SUB(NOW(), INTERVAL 7 DAY)) as stddev_spread
FROM br_transaction_to_usa
WHERE status = 'finalizado'
AND created_at >= DATE_SUB(NOW(), INTERVAL 15 MINUTE)
HAVING COUNT(*) > 0
`,
condition: (row) => Math.abs(row.current_spread - row.avg_spread) > 2 * row.stddev_spread,
severity: 'P0',
channels: ['slack'],
message: (row) => `Spread Anomaly: Current ${row.current_spread.toFixed(4)} vs avg ${row.avg_spread.toFixed(4)} (±2σ = ${(row.stddev_spread * 2).toFixed(4)})`
},
{
name: 'provider-down',
cron: '0 */4 * * *', // every 4 hours
query: `
SELECT pm.provider,
MAX(t.created_at) as last_tx,
TIMESTAMPDIFF(HOUR, MAX(t.created_at), NOW()) as hours_since
FROM br_payment_methods pm
LEFT JOIN br_transaction_to_usa t ON t.payment_method_id = pm.id
AND t.created_at >= DATE_SUB(NOW(), INTERVAL 24 HOUR)
GROUP BY pm.provider
HAVING hours_since > 4 OR last_tx IS NULL
`,
multiRow: true, // condition and message receive the full result set, not a single row
condition: (rows) => rows.length > 0,
severity: 'P0',
channels: ['slack', 'email'],
message: (rows) => `Provider Down: ${rows.map(r => `${r.provider} (${r.hours_since}h)`).join(', ')}`
},
{
name: 'large-transaction',
cron: '*/5 * * * *',
query: `
SELECT t.id_transaction_to_usa, t.amount_usd, c.nome, t.created_at
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
WHERE t.amount_usd > 50000
AND t.created_at >= DATE_SUB(NOW(), INTERVAL 5 MINUTE)
AND t.status = 'finalizado'
`,
multiRow: true, // condition and message receive the full result set, not a single row
condition: (rows) => rows.length > 0,
severity: 'P1',
channels: ['slack'],
message: (rows) => `Large Tx: ${rows.map(r => `$${r.amount_usd.toLocaleString()} from ${r.nome}`).join('; ')}`
},
{
name: 'failed-tx-spike',
cron: '*/15 * * * *',
query: `
SELECT
SUM(CASE WHEN status IN ('refunded','on-hold') THEN 1 ELSE 0 END) as failed,
COUNT(*) as total,
SUM(CASE WHEN status IN ('refunded','on-hold') THEN 1 ELSE 0 END) / COUNT(*) * 100 as fail_rate
FROM br_transaction_to_usa
WHERE created_at >= DATE_SUB(NOW(), INTERVAL 1 HOUR)
HAVING total > 0
`,
condition: (row) => row.fail_rate > 5,
severity: 'P0',
channels: ['slack', 'email'],
message: (row) => `Failed Tx Spike: ${row.fail_rate.toFixed(1)}% failure rate (${row.failed}/${row.total}) in last hour`
}
];
// Slack channel sender
async function sendSlack(webhook, message, severity) {
const colors = { P0: '#ef4444', P1: '#f59e0b', P2: '#3b82f6' };
await fetch(webhook, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
attachments: [{
color: colors[severity] || '#94a3b8',
title: `BI-CCC Alert [${severity}]`,
text: message,
ts: Math.floor(Date.now() / 1000)
}]
})
});
}
// Register each alert as a cron job. Aggregate alerts receive the single summary row;
// alerts flagged multiRow: true receive the full result set.
function initAlerts() {
for (const alert of ALERTS) {
cron.schedule(alert.cron, async () => {
try {
const [rows] = await getPool().query(alert.query);
if (rows.length === 0) return;
const payload = alert.multiRow ? rows : rows[0];
if (!alert.condition(payload)) return;
const text = alert.message(payload);
if (alert.channels.includes('slack')) await sendSlack(process.env.SLACK_WEBHOOK_URL, text, alert.severity);
if (alert.channels.includes('email')) await sendEmail(`BI-CCC Alert [${alert.severity}]`, text);
} catch (err) {
console.error(`[alerts] ${alert.name} failed:`, err.message);
}
});
}
}
module.exports = { ALERTS, initAlerts };
Env vars to add:
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/xxx/yyy/zzz
ALERT_EMAIL_TO=ops@cambioreal.com
SMTP_HOST=smtp.gmail.com
SMTP_USER=alerts@cambioreal.com
SMTP_PASS=app-password
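The alert engine imports `sendEmail` from src/alerts/channels.js, which is not shown above. A minimal sketch wired to the env vars listed here; SMTP_PORT (defaulting to 587) and the exact function shapes are assumptions, not existing code:

```javascript
// src/alerts/channels.js — sketch. Reads the env vars documented above;
// SMTP_PORT is an assumed addition (587 default, implicit TLS only on 465).
function smtpConfigFromEnv(env = process.env) {
  return {
    host: env.SMTP_HOST,
    port: Number(env.SMTP_PORT || 587),
    secure: env.SMTP_PORT === '465',
    auth: { user: env.SMTP_USER, pass: env.SMTP_PASS }
  };
}

async function sendEmail(subject, text) {
  // Lazy require keeps the module loadable where nodemailer isn't installed yet
  const nodemailer = require('nodemailer');
  const transporter = nodemailer.createTransport(smtpConfigFromEnv());
  await transporter.sendMail({
    from: process.env.SMTP_USER,
    to: process.env.ALERT_EMAIL_TO,
    subject,
    text
  });
}

module.exports = { smtpConfigFromEnv, sendEmail };
```

Keeping the config builder pure makes it unit-testable without a live SMTP server.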
2.2 PDF/Excel Export
Goal: Add export buttons to all dashboards.
Stack: Already have puppeteer and exceljs in querycambio — move to shared dependency.
File: src/export/report-generator.js
// src/export/report-generator.js
const puppeteer = require('puppeteer');
const ExcelJS = require('exceljs');
async function exportDashboardToPDF(url, sessionCookie) {
const browser = await puppeteer.launch({
headless: true,
args: ['--no-sandbox']
});
const page = await browser.newPage();
if (sessionCookie) { // optional: scheduled jobs may run without a user session
await page.setCookie({ name: 'connect.sid', value: sessionCookie, domain: 'localhost' });
}
await page.goto(url, { waitUntil: 'networkidle0' });
// Hide nav, expand all sections
await page.evaluate(() => {
document.querySelector('.console-nav')?.remove();
document.querySelector('header')?.remove();
});
const pdf = await page.pdf({
format: 'A4',
landscape: true,
printBackground: true,
margin: { top: '1cm', bottom: '1cm', left: '1cm', right: '1cm' }
});
await browser.close();
return pdf;
}
async function exportToExcel(data, columns, sheetName = 'Report') {
const workbook = new ExcelJS.Workbook();
const sheet = workbook.addWorksheet(sheetName);
sheet.columns = columns.map(col => ({
header: col.label,
key: col.key,
width: col.width || 15,
style: col.style || {}
}));
// Header styling (ExcelJS argb values need the alpha byte: 'FFxxxxxx')
sheet.getRow(1).font = { bold: true, size: 11, color: { argb: 'FFFFFFFF' } };
sheet.getRow(1).fill = { type: 'pattern', pattern: 'solid', fgColor: { argb: 'FF7600BE' } };
data.forEach(row => sheet.addRow(row));
return workbook;
}
module.exports = { exportDashboardToPDF, exportToExcel };
API endpoints to add in server.js:
// Export endpoints
app.get('/admin/api/export/bi-pdf', requireRole('admin'), async (req, res) => {
const { start, end } = req.query;
const pdf = await exportDashboardToPDF(
`http://localhost:${PORT}/admin/bi?start=${start}&end=${end}`,
req.cookies['connect.sid'] // requires cookie-parser middleware
);
res.setHeader('Content-Type', 'application/pdf');
res.setHeader('Content-Disposition', `attachment; filename=BI-Report-${start}-${end}.pdf`);
res.send(pdf);
});
app.get('/admin/api/export/clients-excel', requireRole('admin'), async (req, res) => {
const data = await queries.getTopClients();
const workbook = await exportToExcel(data, [
{ key: 'nome', label: 'Client', width: 30 },
{ key: 'volume_usd', label: 'Volume (USD)', width: 18 },
{ key: 'transactions', label: 'Transactions', width: 15 },
{ key: 'revenue', label: 'Revenue (USD)', width: 18 },
{ key: 'health_score', label: 'Health Score', width: 15 }
], 'Top Clients');
res.setHeader('Content-Type', 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
res.setHeader('Content-Disposition', 'attachment; filename=top-clients.xlsx');
await workbook.xlsx.write(res);
});
2.3 Provider Performance Dashboard
Goal: Compare all payin/payout providers in one view.
File: src/admin-providers.js
Query: src/queries/provider-queries.js
-- Provider performance (payin)
SELECT
pm.provider,
COUNT(*) as total_transactions,
SUM(CASE WHEN t.status = 'finalizado' THEN 1 ELSE 0 END) as completed,
SUM(CASE WHEN t.status = 'refunded' THEN 1 ELSE 0 END) as refunded,
ROUND(SUM(CASE WHEN t.status = 'finalizado' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as success_rate,
ROUND(SUM(t.amount_usd), 2) as total_volume_usd,
ROUND(AVG(t.amount_usd), 2) as avg_ticket,
ROUND(AVG(TIMESTAMPDIFF(HOUR, t.created_at, t.date_received_br)), 1) as avg_settlement_hours,
ROUND(AVG(t.exchange_rate - t.ptax), 4) as avg_spread
FROM br_transaction_to_usa t
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.created_at BETWEEN ? AND ?
GROUP BY pm.provider
ORDER BY total_volume_usd DESC;
-- Provider performance (payout)
SELECT
ex.provider,
COUNT(et.id) as total_transactions,
SUM(CASE WHEN et.status = 'completed' THEN 1 ELSE 0 END) as completed,
ROUND(SUM(CASE WHEN et.status = 'completed' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as success_rate,
ROUND(SUM(et.amount_brl), 2) as total_volume_brl
FROM pagamento_br_export_transactions et
JOIN pagamento_br_exports ex ON ex.id = et.export_id
WHERE et.created_at BETWEEN ? AND ?
GROUP BY ex.provider
ORDER BY total_volume_brl DESC;
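To render both result sets in one comparison view, the dashboard route needs to join them per provider. A small pure helper could do it (the function name is illustrative; field names follow the two queries above):

```javascript
// Merge payin and payout provider rows into one comparison row per provider.
// Providers present on only one side keep null on the other.
function mergeProviderStats(payinRows, payoutRows) {
  const byProvider = new Map();
  for (const r of payinRows) {
    byProvider.set(r.provider, { provider: r.provider, payin: r, payout: null });
  }
  for (const r of payoutRows) {
    const entry = byProvider.get(r.provider) || { provider: r.provider, payin: null, payout: null };
    entry.payout = r;
    byProvider.set(r.provider, entry);
  }
  // Busiest payin providers first
  return [...byProvider.values()].sort(
    (a, b) => ((b.payin?.total_volume_usd || 0) - (a.payin?.total_volume_usd || 0))
  );
}
```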
2.4 Failed Transaction Analysis
File: src/admin-failed-transactions.js
-- Failed transactions breakdown
SELECT
t.status,
pm.provider,
COUNT(*) as count,
ROUND(SUM(t.amount_usd), 2) as volume_lost,
DATE(t.created_at) as date
FROM br_transaction_to_usa t
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.status IN ('refunded', 'on-hold')
AND t.created_at BETWEEN ? AND ?
GROUP BY t.status, pm.provider, DATE(t.created_at)
ORDER BY date DESC, count DESC;
2.5 New KPIs for Corporate Dashboard
Add these to GET /corporate/api/kpis:
-- Transaction Success Rate
SELECT
ROUND(SUM(CASE WHEN status = 'finalizado' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as success_rate
FROM br_transaction_to_usa
WHERE created_at BETWEEN ? AND ?;
-- Refund Rate
SELECT
ROUND(SUM(CASE WHEN status = 'refunded' THEN 1 ELSE 0 END) / COUNT(*) * 100, 2) as refund_rate
FROM br_transaction_to_usa
WHERE created_at BETWEEN ? AND ?;
-- Gross Take Rate
-- (Total Revenue / Total Volume)
-- Use existing revenue calculation, divide by total volume
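The gross take rate comment above reduces to a one-liner in the KPI layer (helper name is illustrative):

```javascript
// Gross take rate: revenue as a share of volume, expressed in basis points
function grossTakeRateBps(totalRevenueUsd, totalVolumeUsd) {
  if (!totalVolumeUsd) return 0; // avoid division by zero on empty periods
  return (totalRevenueUsd / totalVolumeUsd) * 10000;
}
// Example: $52,300 revenue on $4.1M volume → ~127.6 bps (≈ 1.28%)
```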
3. Phase 2 — Data Architecture (Weeks 5-10)
3.1 Analytics Database Setup
Recommended: ClickHouse (best price/performance for analytical queries)
Alternative: TimescaleDB on PostgreSQL (easier migration path)
Option A: ClickHouse
# docker-compose.yml addition
services:
clickhouse:
image: clickhouse/clickhouse-server:24.1
container_name: bi-clickhouse
ports:
- "8123:8123" # HTTP
- "9000:9000" # Native
volumes:
- clickhouse-data:/var/lib/clickhouse
environment:
CLICKHOUSE_DB: bi_analytics
CLICKHOUSE_USER: bi_user
CLICKHOUSE_PASSWORD: ${CLICKHOUSE_PASSWORD}
ulimits:
nofile:
soft: 262144
hard: 262144
volumes:
clickhouse-data:
ClickHouse Schema
-- Transactions fact table (payin)
CREATE TABLE bi_analytics.fact_payin (
id UInt64,
created_at DateTime,
date_received DateTime,
id_conta UInt64,
client_name String,
agente_id Nullable(UInt32),
amount_brl Decimal(18,2),
amount_usd Decimal(18,2),
exchange_rate Decimal(12,6),
ptax Decimal(12,6),
spread Decimal(12,6),
fee Decimal(12,2),
iof Decimal(12,2),
pfee Decimal(12,2),
bonus_valor Decimal(12,2),
taxa_cr Decimal(12,2),
revenue Decimal(12,2),
provider String,
status String,
tipo_envio String,
tipo_pagamento String,
is_checkout UInt8,
merchant_id Nullable(UInt64),
merchant_name Nullable(String),
-- Calculated fields
spread_pct Decimal(8,4),
settlement_hours Nullable(Float32)
) ENGINE = MergeTree()
ORDER BY (created_at, id_conta)
PARTITION BY toYYYYMM(created_at);
-- Transactions fact table (payout)
CREATE TABLE bi_analytics.fact_payout (
id UInt64,
created_at DateTime,
id_conta UInt64,
sender_conta_id Nullable(UInt64),
client_name String,
valor_usd Decimal(18,2),
valor_brl Decimal(18,2),
ptax Decimal(12,6),
cotacao Decimal(12,6),
revenue Decimal(12,2),
tipo_envio String,
provider String,
status String
) ENGINE = MergeTree()
ORDER BY (created_at, id_conta)
PARTITION BY toYYYYMM(created_at);
-- Daily snapshots (materialized)
CREATE TABLE bi_analytics.daily_metrics (
date Date,
product String, -- 'cambioPay', 'cambioPayReverso', 'cambioCheckout'
total_transactions UInt32,
completed_transactions UInt32,
failed_transactions UInt32,
total_volume_usd Decimal(18,2),
total_volume_brl Decimal(18,2),
total_revenue Decimal(18,2),
avg_spread Decimal(8,4),
avg_ticket_usd Decimal(12,2),
unique_clients UInt32,
new_clients UInt32,
avg_settlement_hours Nullable(Float32)
) ENGINE = MergeTree()
ORDER BY (date, product);
-- Client health snapshots
CREATE TABLE bi_analytics.client_health_daily (
date Date,
id_conta UInt64,
client_name String,
health_score UInt8,
churn_risk String, -- 'low','medium','high','critical'
lifetime_volume Decimal(18,2),
lifetime_revenue Decimal(18,2),
lifetime_transactions UInt32,
last_transaction_date Nullable(Date),
days_since_last_tx UInt16,
monthly_velocity Decimal(12,2)
) ENGINE = MergeTree()
ORDER BY (date, id_conta)
PARTITION BY toYYYYMM(date);
3.2 ETL Pipeline
File: src/etl/pipeline.js
// src/etl/pipeline.js
const cron = require('node-cron');
const mysql = require('../db-rds');
const clickhouse = require('../db-clickhouse');
const logger = require('./logger'); // same directory as pipeline.js
const BATCH_SIZE = 5000;
class ETLPipeline {
// Run every hour — sync new transactions
async syncPayinTransactions() {
const jobId = logger.startJob('sync-payin');
try {
// Get last synced ID
const [lastRow] = await clickhouse.query(
'SELECT max(id) as last_id FROM bi_analytics.fact_payin'
);
const lastId = lastRow?.last_id || 0;
// Fetch new records from MySQL
const [rows] = await mysql.query(`
SELECT
t.id_transaction_to_usa as id,
t.created_at,
t.date_received_br as date_received,
t.id_conta,
c.nome as client_name,
ac.agente_id,
t.amount_brl,
t.amount_usd,
t.exchange_rate,
t.ptax,
(t.exchange_rate - t.ptax) as spread,
t.fee,
t.iof,
COALESCE(t.pfee, 0) as pfee,
COALESCE(t.bonus_valor, 0) as bonus_valor,
COALESCE(t.taxa_cr, 0) as taxa_cr,
pm.provider,
t.status,
t.tipo_envio,
t.tipo_pagamento,
CASE WHEN t.cobranca_id IS NOT NULL THEN 1 ELSE 0 END as is_checkout,
e.id as merchant_id,
e.nome_empresa as merchant_name
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
LEFT JOIN ag_contas ac ON ac.conta_id = t.id_conta
LEFT JOIN br_payment_methods pm ON pm.id = t.payment_method_id
LEFT JOIN br_cb_cobranca cb ON cb.id = t.cobranca_id
LEFT JOIN br_cb_empresas e ON e.id = cb.empresa_id
WHERE t.id_transaction_to_usa > ?
ORDER BY t.id_transaction_to_usa
LIMIT ?
`, [lastId, BATCH_SIZE]);
if (rows.length === 0) {
logger.endJob(jobId, 'success', 0);
return;
}
// Calculate revenue for each row
const enriched = rows.map(row => ({
...row,
revenue: calculatePayinRevenue(row),
spread_pct: row.ptax > 0
? ((row.exchange_rate - row.ptax) / row.exchange_rate * 100)
: 0,
settlement_hours: row.date_received
? (new Date(row.date_received) - new Date(row.created_at)) / 3600000
: null
}));
// Insert into ClickHouse
await clickhouse.insert('bi_analytics.fact_payin', enriched);
logger.endJob(jobId, 'success', rows.length);
} catch (err) {
logger.endJob(jobId, 'error', 0, err.message);
throw err;
}
}
// Run daily at 1 AM — build daily aggregates
async buildDailyMetrics(date) {
const jobId = logger.startJob('daily-metrics');
const targetDate = date || new Date(Date.now() - 86400000).toISOString().split('T')[0];
await clickhouse.query(`
INSERT INTO bi_analytics.daily_metrics
SELECT
toDate(created_at) as date,
CASE
WHEN is_checkout = 1 THEN 'cambioCheckout'
ELSE 'cambioPay'
END as product,
count() as total_transactions,
countIf(status = 'finalizado') as completed_transactions,
countIf(status IN ('refunded', 'on-hold')) as failed_transactions,
sum(amount_usd) as total_volume_usd,
sum(amount_brl) as total_volume_brl,
sum(revenue) as total_revenue,
avg(spread_pct) as avg_spread,
avg(amount_usd) as avg_ticket_usd,
uniq(id_conta) as unique_clients,
uniqIf(id_conta, /* first tx logic */) as new_clients,
avg(settlement_hours) as avg_settlement_hours
FROM bi_analytics.fact_payin
WHERE toDate(created_at) = '${targetDate}' -- targetDate is generated internally (YYYY-MM-DD), never user input
GROUP BY date, product
`);
logger.endJob(jobId, 'success');
}
}
// Revenue calculation (mirrors existing logic)
function calculatePayinRevenue(row) {
const noFeeProviders = ['ouribank', 'bs2'];
const feeAjustado = noFeeProviders.includes(row.provider) ? 0 : (row.fee || 0);
if (row.ptax <= 0) return 0;
const revenue = (row.amount_brl - feeAjustado) / row.ptax
- (row.pfee || 0)
- (row.amount_usd + (row.bonus_valor || 0) - (row.taxa_cr || 0));
return Math.round(revenue * 100) / 100;
}
module.exports = ETLPipeline;
Cron schedule (add to server.js or separate worker):
const ETLPipeline = require('./src/etl/pipeline');
const etl = new ETLPipeline();
// Hourly: sync new transactions
cron.schedule('0 * * * *', () => etl.syncPayinTransactions());
cron.schedule('5 * * * *', () => etl.syncPayoutTransactions());
// Daily at 1 AM: build aggregates
cron.schedule('0 1 * * *', () => etl.buildDailyMetrics());
// Daily at 2 AM: build client health snapshots
cron.schedule('0 2 * * *', () => etl.buildClientHealthSnapshots());
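The pipeline requires a src/db-clickhouse.js module that is not shown above. A thin wrapper sketch, assuming the official @clickhouse/client package (check the exact client API against its docs before relying on it):

```javascript
// src/db-clickhouse.js — sketch matching how pipeline.js calls it:
//   const [row] = await clickhouse.query(sql)  → query() returns an array of row objects
//   await clickhouse.insert(table, rows)
let client;
function getClient() {
  // Lazy require + singleton so the module loads even before the dependency is installed
  const { createClient } = require('@clickhouse/client');
  if (!client) {
    client = createClient({
      url: process.env.CLICKHOUSE_URL || 'http://localhost:8123',
      database: 'bi_analytics',
      username: process.env.CLICKHOUSE_USER,
      password: process.env.CLICKHOUSE_PASSWORD
    });
  }
  return client;
}

async function query(sql) {
  const rs = await getClient().query({ query: sql, format: 'JSONEachRow' });
  return rs.json();
}

async function insert(table, rows) {
  if (rows.length === 0) return;
  // Insert in chunks so a large backfill doesn't build one huge request body
  for (const batch of chunk(rows, 5000)) {
    await getClient().insert({ table, values: batch, format: 'JSONEachRow' });
  }
}

function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) out.push(rows.slice(i, i + size));
  return out;
}

module.exports = { query, insert, chunk };
```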
3.3 Data Quality Monitoring
File: src/etl/data-quality.js
// Run after every ETL job
async function validateDataQuality(expectedRevenue) { // expectedRevenue: yesterday's revenue from the MySQL source of truth
const checks = [];
// 1. Row count comparison
// mysql2 returns [rows, fields], so double-destructure to reach the first row
const [[mysqlCount]] = await mysql.query(
'SELECT COUNT(*) as cnt FROM br_transaction_to_usa WHERE status = "finalizado"'
);
const [chCount] = await clickhouse.query(
"SELECT count() as cnt FROM bi_analytics.fact_payin WHERE status = 'finalizado'"
);
checks.push({
name: 'row-count-payin',
expected: mysqlCount.cnt,
actual: chCount.cnt,
pass: Math.abs(mysqlCount.cnt - chCount.cnt) < 100 // tolerance for in-flight
});
// 2. Null checks on critical fields
const [nullCheck] = await clickhouse.query(`
SELECT
countIf(ptax = 0 OR ptax IS NULL) as null_ptax,
countIf(amount_brl = 0) as zero_amount,
countIf(provider = '') as empty_provider
FROM bi_analytics.fact_payin
WHERE created_at >= today() - 1
`);
checks.push({
name: 'null-critical-fields',
pass: nullCheck.null_ptax === 0 && nullCheck.zero_amount === 0
});
// 3. Revenue reconciliation
const [revCheck] = await clickhouse.query(`
SELECT
abs(sum(revenue) - ${expectedRevenue}) / ${expectedRevenue} * 100 as deviation_pct
FROM bi_analytics.fact_payin
WHERE toDate(created_at) = yesterday()
`);
checks.push({
name: 'revenue-reconciliation',
pass: revCheck.deviation_pct < 1 // less than 1% deviation
});
return checks;
}
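The checks array still needs to be turned into a notification; a small pure reducer (name is illustrative) can feed the Phase 1 Slack channel after each ETL run:

```javascript
// Collapse data-quality check results into a single pass/fail summary message
function summarizeChecks(checks) {
  const failed = checks.filter(c => !c.pass);
  return {
    ok: failed.length === 0,
    summary: failed.length === 0
      ? `All ${checks.length} data-quality checks passed`
      : `Data-quality failures: ${failed.map(c => c.name).join(', ')}`
  };
}
```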
4. Phase 3 — Compliance & Predictive Analytics (Weeks 11-16)
4.1 AML Transaction Monitoring
File: src/compliance/aml-monitor.js
New route: GET /admin/compliance
-- Threshold monitoring: transactions > $3,000
SELECT
c.nome, t.id_conta, t.amount_usd, t.created_at, pm.provider
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.amount_usd > 3000
AND t.status = 'finalizado'
AND t.created_at BETWEEN ? AND ?
ORDER BY t.amount_usd DESC;
-- Structuring detection: multiple transactions just below threshold per client per day
SELECT
c.nome,
t.id_conta,
DATE(t.created_at) as tx_date,
COUNT(*) as tx_count,
SUM(t.amount_usd) as daily_total,
AVG(t.amount_usd) as avg_amount,
MIN(t.amount_usd) as min_amount,
MAX(t.amount_usd) as max_amount
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
WHERE t.status = 'finalizado'
AND t.amount_usd BETWEEN 2500 AND 3000
AND t.created_at BETWEEN ? AND ?
GROUP BY t.id_conta, c.nome, DATE(t.created_at)
HAVING tx_count >= 2
ORDER BY daily_total DESC;
-- Unusual velocity: clients with transaction frequency spike
SELECT
c.nome,
t.id_conta,
COUNT(*) as recent_count,
(SELECT COUNT(*) FROM br_transaction_to_usa t2
WHERE t2.id_conta = t.id_conta
AND t2.created_at BETWEEN DATE_SUB(?, INTERVAL 30 DAY) AND ?) / 30 as daily_avg,
SUM(t.amount_usd) as recent_volume
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
WHERE t.status = 'finalizado'
AND t.created_at BETWEEN ? AND ?
GROUP BY t.id_conta, c.nome
HAVING recent_count > daily_avg * 3 -- note: assumes the outer BETWEEN window spans one day
ORDER BY recent_count DESC;
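Once the structuring query returns candidates, a post-filter in the monitor can grade severity before the rows reach the compliance dashboard. A sketch (thresholds mirror the SQL above; the helper name and severity bands are assumptions):

```javascript
// Grade structuring candidates returned by the SQL above.
// Rows carry tx_count, min_amount and max_amount per client per day.
function flagStructuring(rows, { minTxPerDay = 2, bandLow = 2500, bandHigh = 3000 } = {}) {
  return rows
    .filter(r => r.tx_count >= minTxPerDay && r.min_amount >= bandLow && r.max_amount <= bandHigh)
    .map(r => ({
      ...r,
      flag: 'possible-structuring',
      severity: r.tx_count >= 4 ? 'high' : 'medium' // 4+ sub-threshold txs in a day treated as high
    }));
}
```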
4.2 IOF Reconciliation Dashboard
-- IOF verification: compare calculated vs expected
SELECT
DATE(t.created_at) as date,
pm.provider,
COUNT(*) as transactions,
SUM(t.iof) as total_iof_charged,
SUM(t.amount_usd * t.exchange_rate * 0.0038) as expected_iof_standard,
SUM(t.iof) - SUM(t.amount_usd * t.exchange_rate * 0.0038) as iof_delta,
ROUND(
ABS(SUM(t.iof) - SUM(t.amount_usd * t.exchange_rate * 0.0038))
/ NULLIF(SUM(t.amount_usd * t.exchange_rate * 0.0038), 0) * 100, 2
) as delta_pct
FROM br_transaction_to_usa t
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE t.status = 'finalizado'
AND t.created_at BETWEEN ? AND ?
GROUP BY DATE(t.created_at), pm.provider
HAVING delta_pct > 0.5
ORDER BY date DESC;
4.3 Volume Forecasting
Stack: Python microservice with Prophet or Node.js with timeseries-analysis
File: src/analytics/forecast.js (Node approach)
// Simple exponential smoothing for Node.js
// For production, consider a Python sidecar with Prophet
function exponentialSmoothing(data, alpha = 0.3) {
const result = [data[0]];
for (let i = 1; i < data.length; i++) {
result.push(alpha * data[i] + (1 - alpha) * result[i - 1]);
}
return result;
}
function forecast(historicalData, periodsAhead = 30) {
// historicalData = [{ date, value }, ...]
const values = historicalData.map(d => d.value);
const smoothed = exponentialSmoothing(values);
// Simple trend projection
const recent = smoothed.slice(-7);
const trend = (recent[recent.length - 1] - recent[0]) / recent.length;
const lastValue = smoothed[smoothed.length - 1];
const predictions = [];
for (let i = 1; i <= periodsAhead; i++) {
const lastDate = new Date(historicalData[historicalData.length - 1].date);
lastDate.setDate(lastDate.getDate() + i);
predictions.push({
date: lastDate.toISOString().split('T')[0],
predicted: Math.max(0, lastValue + trend * i),
lower: Math.max(0, lastValue + trend * i - lastValue * 0.15),
upper: lastValue + trend * i + lastValue * 0.15
});
}
return predictions;
}
module.exports = { forecast };
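Before trusting the smoother, backtest it on a holdout window; mean absolute percentage error is enough for a first pass:

```javascript
// Mean absolute percentage error between actual and predicted series.
// Zero actuals are skipped to avoid division by zero.
function mape(actual, predicted) {
  let sum = 0, n = 0;
  for (let i = 0; i < actual.length; i++) {
    if (actual[i] !== 0) {
      sum += Math.abs((actual[i] - predicted[i]) / actual[i]);
      n++;
    }
  }
  return n ? (sum / n) * 100 : 0;
}
// e.g. hold out the last 30 days, forecast them from the prior 60, and flag the model if MAPE > 20%
```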
API endpoint:
app.get('/admin/api/bi/forecast', requireRole('admin'), async (req, res) => {
const { metric = 'volume', days = 30 } = req.query;
// Get 90 days of historical data from analytics DB
const historical = await clickhouse.query(`
SELECT date, sum(total_volume_usd) as value
FROM bi_analytics.daily_metrics
WHERE date >= today() - 90
GROUP BY date
ORDER BY date
`);
const predictions = forecast(historical, parseInt(days));
res.json({ historical, predictions });
});
4.4 Churn Prediction Model
Evolve the static Health Score into a weighted model:
function predictChurnRisk(client) {
const weights = {
recency: 0.30, // days since last transaction
frequency: 0.25, // transactions per month trend
monetary: 0.20, // volume trend (growing/declining)
velocity: 0.15, // rate of change in transaction frequency
engagement: 0.10 // product diversity, ticket variation
};
// Recency score (0-100, higher = less risk)
const recencyScore = Math.max(0, 100 - (client.days_since_last_tx * 2));
// Frequency score: compare last 30d vs previous 30d
const freqRatio = client.tx_count_30d / Math.max(client.tx_count_prev_30d, 1);
const frequencyScore = Math.min(100, freqRatio * 50);
// Monetary score: volume growth
const volRatio = client.volume_30d / Math.max(client.volume_prev_30d, 1);
const monetaryScore = Math.min(100, volRatio * 50);
// Velocity: acceleration of transaction frequency
const velocityScore = client.tx_acceleration > 0 ? 70 + client.tx_acceleration * 10 : 30;
// Engagement: uses multiple products?
const engagementScore = client.products_used * 33;
const totalScore =
recencyScore * weights.recency +
frequencyScore * weights.frequency +
monetaryScore * weights.monetary +
Math.min(100, velocityScore) * weights.velocity +
Math.min(100, engagementScore) * weights.engagement;
return {
score: Math.round(totalScore),
risk: totalScore < 25 ? 'critical' : totalScore < 45 ? 'high' : totalScore < 65 ? 'medium' : 'low',
factors: {
recency: { score: recencyScore, weight: weights.recency },
frequency: { score: frequencyScore, weight: weights.frequency },
monetary: { score: monetaryScore, weight: weights.monetary },
velocity: { score: Math.min(100, velocityScore), weight: weights.velocity },
engagement: { score: Math.min(100, engagementScore), weight: weights.engagement }
}
};
}
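A worked example of the weighted blend helps sanity-check the bands. The factor scores below are hand-computed for a hypothetical declining client (40 days inactive, frequency and volume both down 75%, negative acceleration, single product), exactly as predictChurnRisk would derive them:

```javascript
// Factor scores for that client profile, blended with the weights from the model above
const factors = { recency: 20, frequency: 12.5, monetary: 12.5, velocity: 30, engagement: 33 };
const weights = { recency: 0.30, frequency: 0.25, monetary: 0.20, velocity: 0.15, engagement: 0.10 };
const score = Math.round(
  Object.keys(weights).reduce((sum, k) => sum + factors[k] * weights[k], 0)
);
// 6 + 3.125 + 2.5 + 4.5 + 3.3 = 19.425 → score 19, which lands in the 'critical' band (< 25)
```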
5. Phase 4 — Platform Maturity (Weeks 17-24)
5.1 Scheduled Report Generation
File: src/reports/scheduler.js
const cron = require('node-cron');
const { exportDashboardToPDF, exportToExcel } = require('../export/report-generator');
const { sendEmail } = require('../alerts/channels');
const REPORT_SCHEDULE = [
{
name: 'Daily Executive Digest',
cron: '0 8 * * 1-5', // 8 AM weekdays
recipients: ['exec@cambioreal.com'],
generator: async () => {
const data = await queries.getDailyDigest();
return {
subject: `BI Daily Digest — ${new Date().toLocaleDateString()}`,
html: buildDigestHTML(data),
attachments: [] // inline, no PDF
};
}
},
{
name: 'Weekly Board Summary',
cron: '0 9 * * 1', // Monday 9 AM
recipients: ['board@cambioreal.com'],
generator: async () => {
const end = new Date().toISOString().split('T')[0];
const start = new Date(Date.now() - 7 * 86400000).toISOString().split('T')[0];
const pdf = await exportDashboardToPDF(
`http://localhost:3080/admin/bi?start=${start}&end=${end}`
);
return {
subject: `BI Weekly Report — Week of ${start}`,
html: '<p>Weekly BI report attached.</p>',
attachments: [{ filename: `BI-Weekly-${start}.pdf`, content: pdf }]
};
}
},
{
name: 'Monthly Agent Commissions',
cron: '0 10 1 * *', // 1st of month, 10 AM
recipients: ['finance@cambioreal.com'],
generator: async () => {
const data = await queries.getMonthlyCommissions();
const workbook = await exportToExcel(data, commissionColumns, 'Commissions');
const buffer = await workbook.xlsx.writeBuffer();
return {
subject: `Agent Commissions — ${new Date().toLocaleDateString('en', { month: 'long', year: 'numeric' })}`,
html: '<p>Monthly commission report attached.</p>',
attachments: [{ filename: 'commissions.xlsx', content: buffer }]
};
}
}
];
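REPORT_SCHEDULE is declared above but nothing registers it. A dependency-injected registrar keeps the wiring testable without node-cron or a real mailer (function and parameter names are assumptions):

```javascript
// Register every report in the schedule; scheduleFn and deliver are injected
// so the wiring can be unit-tested with fakes.
function registerReports(schedule, scheduleFn, deliver) {
  return schedule.map(report =>
    scheduleFn(report.cron, async () => {
      try {
        const { subject, html, attachments } = await report.generator();
        await deliver(report.recipients.join(','), subject, html, attachments);
      } catch (err) {
        console.error(`[reports] ${report.name} failed:`, err.message);
      }
    })
  );
}
// Production wiring: registerReports(REPORT_SCHEDULE, cron.schedule, deliverReport),
// where deliverReport(to, subject, html, attachments) wraps the Nodemailer transport.
```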
5.2 Agent Commission Integration
Move commission calculation from separate scripts (comissao-valorfx/, comissao-d4u/) into BI-CCC:
New route: GET /admin/commissions
Features:
- Auto-calculate commissions per agent per period
- Approval workflow (admin reviews → approves → generates Excel)
- Commission dispute tracking
- Agent ROI: commission paid vs. revenue generated by agent's clients
// src/queries/commission-queries.js
// ValorFx (Agent 76) — 50/50 rateio (revenue split)
async function calculateValorFxCommission(start, end) {
const [rows] = await pool.query(`
SELECT
t.id_transaction_to_usa,
t.created_at,
c.nome as client_name,
t.amount_brl,
t.amount_usd,
t.exchange_rate,
t.ptax,
t.fee,
t.iof,
t.pfee,
t.bonus_valor,
t.taxa_cr,
pm.provider
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
JOIN ag_contas ac ON ac.conta_id = t.id_conta
JOIN br_payment_methods pm ON pm.id = t.payment_method_id
WHERE ac.agente_id = 76
AND t.status = 'finalizado'
AND t.created_at BETWEEN ? AND ?
ORDER BY t.created_at
`, [start, end]);
return rows.map(row => {
const noFeeProviders = ['ouribank', 'bs2'];
const feeAdj = noFeeProviders.includes(row.provider) ? 0 : row.fee;
const spreadLiqUsd = (row.amount_brl - feeAdj) / row.ptax
- row.pfee - row.amount_usd + row.bonus_valor - row.taxa_cr;
const iofEficiencia = row.provider === 'RIPPLE'
? row.iof
: row.iof * 0.425;
const rateio = (iofEficiencia + spreadLiqUsd) / 2;
return { ...row, spreadLiqUsd, iofEficiencia, rateio };
});
}
// D4U — 20% of the spread
async function calculateD4UCommission(start, end) {
const d4uContas = [257174, 333262, 349175, 352427];
// ... similar structure, 20% of revenue
}
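The D4U stub above only states the rule; the per-transaction math reduces to a 20% cut of attributed revenue (rounding to cents matches the revenue helper earlier; function names are illustrative):

```javascript
// D4U commission: 20% of the revenue attributed to the transaction, rounded to cents
function d4uCommission(revenueUsd, rate = 0.20) {
  return Math.round(revenueUsd * rate * 100) / 100;
}

// A period total is the sum over the agent's transactions
function d4uCommissionTotal(rows) {
  return rows.reduce((sum, r) => sum + d4uCommission(r.revenue), 0);
}
```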
5.3 Frontend Migration Path
Recommended approach: Gradual migration using htmx + Alpine.js
Step 1: Extract UI Template Components
src/
├── components/
│ ├── kpi-card.js // buildKPICard(title, value, change, color)
│ ├── chart-wrapper.js // buildChartWrapper(id, type, config)
│ ├── data-table.js // buildDataTable(columns, data, options)
│ ├── filter-bar.js // buildFilterBar(filters)
│ └── layout.js // buildPageLayout(title, nav, content)
Step 2: Add htmx for Progressive Enhancement
<!-- Instead of full-page reload, use htmx for partial updates -->
<div hx-get="/admin/api/bi/kpis?start=${start}&end=${end}"
hx-trigger="load, change from:#date-filter"
hx-target="#kpi-grid"
hx-swap="innerHTML">
Loading...
</div>
Step 3: Introduce Alpine.js for Client Interactivity
<div x-data="{ tab: 'overview' }">
<button @click="tab = 'overview'" :class="{ active: tab === 'overview' }">Overview</button>
<button @click="tab = 'revenue'" :class="{ active: tab === 'revenue' }">Revenue</button>
<div x-show="tab === 'overview'">...</div>
<div x-show="tab === 'revenue'">...</div>
</div>
5.4 Multi-Language Support
File: src/i18n/index.js
const translations = {
en: {
'nav.dashboard': 'Dashboard',
'nav.bi': 'BI Executive',
'nav.clients': 'Clients',
'kpi.volume': 'Volume',
'kpi.transactions': 'Transactions',
'kpi.revenue': 'Revenue',
'kpi.avgTicket': 'Avg Ticket',
'kpi.spread': 'Spread',
'kpi.healthScore': 'Health Score',
'filter.period': 'Period',
'filter.from': 'From',
'filter.to': 'To',
'status.completed': 'Completed',
'status.refunded': 'Refunded',
'status.pending': 'Pending'
},
pt: {
'nav.dashboard': 'Painel',
'nav.bi': 'BI Executivo',
'nav.clients': 'Clientes',
'kpi.volume': 'Volume',
'kpi.transactions': 'Transações',
'kpi.revenue': 'Receita',
'kpi.avgTicket': 'Ticket Médio',
'kpi.spread': 'Spread',
'kpi.healthScore': 'Score de Saúde',
'filter.period': 'Período',
'filter.from': 'De',
'filter.to': 'Até',
'status.completed': 'Finalizado',
'status.refunded': 'Estornado',
'status.pending': 'Pendente'
}
};
function t(key, locale = 'en') {
return translations[locale]?.[key] || translations.en[key] || key;
}
module.exports = { t, translations };
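To select the locale per request, a tiny Express middleware can sit in front of the SSR pages (the session field name is an assumption; register it with app.use before the page routes):

```javascript
// Pick the request locale: explicit session preference wins, then Accept-Language, else 'en'.
// Only 'en' and 'pt' exist in the translations table above.
function localeMiddleware(req, res, next) {
  const header = (req.headers['accept-language'] || '').toLowerCase();
  req.locale = (req.session && req.session.locale) || (header.startsWith('pt') ? 'pt' : 'en');
  next();
}
```

Page handlers then call `t(key, req.locale)` instead of hard-coding a language.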
6. New KPIs to Implement
Financial KPIs
| KPI | SQL/Formula | Priority | Add To |
|---|---|---|---|
| Net Revenue Retention | (Start MRR + Expansion - Contraction - Churn) / Start MRR | P0 | BI Executive |
| Revenue per Transaction | SUM(revenue) / COUNT(*) | P0 | Corporate |
| Gross Take Rate | SUM(revenue) / SUM(volume_usd) | P0 | BI Executive |
| Provider Cost per $1K | SUM(provider_fees) / (SUM(volume) / 1000) | P1 | Provider Dashboard |
| LTV | AVG(monthly_revenue) × AVG(lifespan_months) | P1 | Client 360 |
| Revenue Concentration (HHI) | SUM(client_share^2) | P1 | BI Executive |
| MRR | SUM(revenue) WHERE client active in consecutive months | P2 | BI Executive |
Operational KPIs
| KPI | SQL | Priority | Add To |
|---|---|---|---|
| Transaction Success Rate | `SUM(IF(status = 'finalizado', 1, 0)) / COUNT(*)` | P0 | Corporate |
| Avg Settlement Time | `AVG(TIMESTAMPDIFF(HOUR, created_at, date_received_br))` | P0 | Provider Dashboard |
| Provider Uptime | `COUNT(DISTINCT HOUR(created_at)) / 24` per provider | P1 | Provider Dashboard |
| Refund Rate | `SUM(IF(status = 'refunded', 1, 0)) / COUNT(*)` | P1 | Corporate |
| PTAX Deviation | `AVG(ABS(exchange_rate - ptax) / ptax)` | P1 | BI Executive |
| Peak Hour Analysis | `GROUP BY HOUR(created_at)` | P2 | Corporate |
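Success Rate and Refund Rate are the same status-counting operation with a different status value; a minimal shared helper (names are illustrative):

```javascript
// Status-ratio helper shared by Success Rate and Refund Rate.
// `txs` is assumed to be [{ status: 'finalizado' | 'refunded' | ... }, ...].
function statusRate(txs, status) {
  if (txs.length === 0) return 0;
  const hits = txs.filter((t) => t.status === status).length;
  return hits / txs.length;
}

const sample = [
  { status: 'finalizado' },
  { status: 'finalizado' },
  { status: 'refunded' },
  { status: 'pending' },
];
console.log(statusRate(sample, 'finalizado')); // 0.5
console.log(statusRate(sample, 'refunded'));   // 0.25
```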
Client KPIs
| KPI | Logic | Priority | Add To |
|---|---|---|---|
| Client Activation Rate | First-tx clients / New signups | P0 | BI Executive |
| Cross-Product Adoption | Clients using 2+ products / Total active | P1 | BI Executive |
| Time to First Transaction | `AVG(first_tx - signup)` | P1 | BI Executive |
| Reactivation Rate | Previously churned → active again | P2 | Client 360 |
Compliance KPIs
| KPI | Logic | Priority | Add To |
|---|---|---|---|
| Threshold Tx Count | Transactions over $3K / $10K per client per day | P0 | Compliance |
| Structuring Detection | Multiple txs between $2.5K and $3K on the same day | P0 | Compliance |
| SAR Filing Rate | SARs filed per period | P0 | Compliance |
| IOF Reconciliation Delta | Calculated vs expected IOF | P1 | Compliance |
| Cross-Border Volume per Corridor | Volume by origin/destination | P1 | Compliance |
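The structuring rule can also be checked in-process, e.g. against the last ETL batch; a sketch assuming transactions as plain objects (field names and the helper itself are assumptions):

```javascript
// compliance.service.js (sketch) — flag possible structuring:
// 2+ transactions inside the [min, max] band by the same client on the
// same day, totalling more than the daily cap.
function detectStructuring(txs, { min = 2000, max = 3000, dailyCap = 3000 } = {}) {
  const byClientDay = new Map();
  for (const tx of txs) {
    if (tx.amountUsd < min || tx.amountUsd > max) continue;
    const key = `${tx.idConta}|${tx.createdAt.slice(0, 10)}`; // client + day
    if (!byClientDay.has(key)) byClientDay.set(key, []);
    byClientDay.get(key).push(tx.amountUsd);
  }
  const flagged = [];
  for (const [key, amounts] of byClientDay) {
    const total = amounts.reduce((s, a) => s + a, 0);
    if (amounts.length >= 2 && total > dailyCap) {
      const [idConta, day] = key.split('|');
      flagged.push({ idConta: Number(idConta), day, txCount: amounts.length, total });
    }
  }
  return flagged;
}

module.exports = { detectStructuring };
```

Flagged rows would feed the Compliance dashboard and the SAR workflow rather than blocking the transactions themselves.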
7. Architecture Reference
Current Architecture
```
Browser → Express (SSR HTML + inline JS/CSS) → MySQL RDS (direct)
                                             → SQLite (auth)
```
Target Architecture
```
Browser (htmx + Alpine.js + Chart.js)
        │
        ▼
Express API Gateway
        │
        ├── REST API layer (JSON)
        │     ├── /api/bi/*         → BI Service
        │     ├── /api/clients/*    → Client Service
        │     ├── /api/providers/*  → Provider Service
        │     ├── /api/compliance/* → Compliance Service
        │     ├── /api/agents/*     → Agent Service
        │     └── /api/export/*     → Export Service
        │
        ├── SSR HTML layer (pages)
        │     └── Uses components/ for reusable UI blocks
        │
        ├── Analytics DB (ClickHouse)
        │     └── fact tables + daily aggregates + client snapshots
        │
        ├── ETL Worker (node-cron)
        │     ├── Hourly: sync transactions
        │     ├── Daily: build aggregates + snapshots
        │     └── Data quality checks
        │
        ├── Alert Engine (node-cron)
        │     ├── Volume/spread monitors
        │     ├── Provider health checks
        │     └── Channels: Slack + Email
        │
        ├── Report Generator (Puppeteer + ExcelJS)
        │     ├── Scheduled: daily/weekly/monthly
        │     └── On-demand: PDF/Excel export
        │
        ├── MySQL RDS (source — read-only via VPN)
        │
        └── SQLite (auth + user preferences)
```
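Each REST endpoint returns plain JSON that the htmx/Alpine front end renders; a hypothetical response shape for a BI KPI endpoint (the `/api/bi/kpis` path and all field names are illustrative, not an existing contract):

```javascript
// Hypothetical shape for GET /api/bi/kpis?from=...&to=...
// The SSR pages and the JSON API share the same service output,
// so this object is defined once in the service layer.
const kpiResponse = {
  period: { from: '2026-01-01', to: '2026-01-31' },
  kpis: {
    volumeUsd: 12500000,
    transactions: 8421,
    revenue: 187500,
    avgTicket: 1484.38,    // volumeUsd / transactions
    grossTakeRate: 0.015,  // revenue / volumeUsd
  },
  source: 'clickhouse',    // or 'rds-fallback' while Phase 2 is in flight
  generatedAt: new Date().toISOString(),
};
```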
New File Structure
```
bi-agentes/
├── server.js                      # Route registration + middleware only
├── src/
│   ├── routes/
│   │   ├── bi.routes.js           # /admin/bi, /admin/api/bi
│   │   ├── client.routes.js       # /admin/cliente, /admin/api/clientes
│   │   ├── corporate.routes.js    # /corporate, /corporate/api
│   │   ├── provider.routes.js     # /admin/providers (NEW)
│   │   ├── compliance.routes.js   # /admin/compliance (NEW)
│   │   ├── commission.routes.js   # /admin/commissions (NEW)
│   │   ├── export.routes.js       # /admin/api/export (NEW)
│   │   ├── agent.routes.js        # /dashboard
│   │   └── auth.routes.js         # /login, /logout
│   │
│   ├── services/
│   │   ├── revenue.service.js     # Revenue calculation logic
│   │   ├── client.service.js      # Client health, churn, LTV
│   │   ├── provider.service.js    # Provider analytics
│   │   ├── compliance.service.js  # AML, threshold, structuring
│   │   ├── commission.service.js  # Agent commissions
│   │   ├── forecast.service.js    # Volume/revenue forecasting
│   │   └── metrics.service.js     # KPI calculations
│   │
│   ├── queries/
│   │   ├── payin.queries.js       # br_transaction_to_usa queries
│   │   ├── payout.queries.js      # pagamento_br queries
│   │   ├── client.queries.js      # Client-related queries
│   │   ├── provider.queries.js    # Provider-related queries
│   │   ├── compliance.queries.js  # AML/compliance queries
│   │   └── commission.queries.js  # Commission queries
│   │
│   ├── components/
│   │   ├── kpi-card.js
│   │   ├── chart-wrapper.js
│   │   ├── data-table.js
│   │   ├── filter-bar.js
│   │   ├── alert-badge.js
│   │   └── layout.js
│   │
│   ├── etl/
│   │   ├── pipeline.js            # Main ETL orchestrator
│   │   ├── sync-payin.js          # Payin transaction sync
│   │   ├── sync-payout.js         # Payout transaction sync
│   │   ├── build-aggregates.js    # Daily/weekly/monthly rollups
│   │   ├── build-snapshots.js     # Client health snapshots
│   │   ├── data-quality.js        # Validation checks
│   │   └── logger.js              # ETL job logging
│   │
│   ├── alerts/
│   │   ├── alert-engine.js        # Alert definitions + scheduler
│   │   ├── channels.js            # Slack, Email, SMS senders
│   │   └── alert-rules.js         # Alert configurations
│   │
│   ├── export/
│   │   ├── report-generator.js    # PDF + Excel generation
│   │   └── scheduler.js           # Scheduled report delivery
│   │
│   ├── i18n/
│   │   ├── index.js               # Translation helper
│   │   ├── en.json
│   │   └── pt.json
│   │
│   ├── db-rds.js                  # MySQL connection (unchanged)
│   ├── db-clickhouse.js           # ClickHouse connection (NEW)
│   ├── db-local.js                # SQLite (unchanged)
│   ├── auth.js                    # Auth middleware (unchanged)
│   ├── cache.js                   # Cache layer (unchanged)
│   └── ui-template.js             # Base template (refactored)
│
├── public/
│   ├── chart.umd.min.js
│   ├── htmx.min.js                # NEW
│   ├── alpine.min.js              # NEW
│   ├── login.html
│   └── assets/
│
├── data/
│   └── agentes.db
│
├── scripts/
│   ├── seed-admin.js
│   ├── seed-agente.js
│   ├── migrate-clickhouse.js      # NEW: schema setup
│   └── backfill-analytics.js      # NEW: historical data load
│
├── Dockerfile
├── docker-compose.yml             # + clickhouse service
├── docker-entrypoint.sh
└── .env
```
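To make the `src/alerts/` layout above concrete, here is a sketch of how declarative rules and the evaluation step could fit together (rule names, thresholds, and metric field names are illustrative assumptions):

```javascript
// alert-rules.js (sketch) — declarative rules the engine evaluates on a schedule.
const rules = [
  {
    id: 'volume-drop',
    description: 'Hourly volume fell more than 50% vs the 7-day average',
    severity: 'critical',
    check: (m) => m.hourlyVolume < m.avg7dHourlyVolume * 0.5,
  },
  {
    id: 'refund-spike',
    description: 'Refund rate above 5% in the last 24h',
    severity: 'warning',
    check: (m) => m.refundRate24h > 0.05,
  },
];

// alert-engine.js (sketch) — returns the rules that fired for a metrics snapshot.
// The real engine would run this from node-cron and pass results to channels.js.
function evaluate(metrics) {
  return rules.filter((r) => r.check(metrics));
}

const fired = evaluate({ hourlyVolume: 100, avg7dHourlyVolume: 500, refundRate24h: 0.02 });
console.log(fired.map((r) => r.id)); // [ 'volume-drop' ]
```

Keeping rules as data rather than code makes the "5+ rules configured" Phase 1 deliverable a configuration task instead of a development task.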
8. Code Refactoring Guide
Priority 1: Break up queries.js (1,786 lines)
Split by domain:
```
queries.js →
├── queries/payin.queries.js       (~500 lines)
├── queries/payout.queries.js      (~300 lines)
├── queries/client.queries.js      (~400 lines)
├── queries/provider.queries.js    (~200 lines — NEW)
├── queries/commission.queries.js  (~200 lines — from external scripts)
└── queries/compliance.queries.js  (~200 lines — NEW)
```
Migration strategy: Keep queries.js as a facade that re-exports from sub-modules. This avoids breaking existing imports.
```javascript
// queries.js (facade — no breaking changes)
const payin = require('./queries/payin.queries');
const payout = require('./queries/payout.queries');
const client = require('./queries/client.queries');

module.exports = {
  ...payin,
  ...payout,
  ...client,
};
```
Priority 2: Extract UI Components from admin-bi.js (2,133 lines)
Identify repeating patterns and extract to src/components/:
```javascript
// BEFORE (admin-bi.js, inline)
html += `<div class="hero-card">
  <div class="hero-label">${label}</div>
  <div class="hero-value">${value}</div>
  <div class="hero-change ${change > 0 ? 'up' : 'down'}">${change}%</div>
</div>`;
```

```javascript
// AFTER (components/kpi-card.js)
function buildKPICard({ label, value, change, color = 'green' }) {
  const direction = change > 0 ? 'up' : 'down';
  return `<div class="hero-card hero-${color}">
    <div class="hero-label">${label}</div>
    <div class="hero-value">${value}</div>
    <div class="hero-change ${direction}">${change > 0 ? '+' : ''}${change}%</div>
  </div>`;
}

module.exports = { buildKPICard };
```
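The same pattern extends to the other files under `components/`; for instance a hypothetical `alert-badge.js` (the filename comes from the file tree, the implementation is assumed):

```javascript
// components/alert-badge.js (sketch) — severity badge for nav items and tables.
function buildAlertBadge({ severity = 'info', count = 0 }) {
  if (count === 0) return ''; // nothing to show when there are no alerts
  return `<span class="alert-badge alert-${severity}">${count}</span>`;
}

module.exports = { buildAlertBadge };
```

Because these components are pure functions of their inputs, they can be unit tested without a browser, which is impossible with the current inline HTML.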
Priority 3: Separate CSS into Files
Extract inline styles from ui-template.js into CSS files:
```
public/css/
├── variables.css   # CSS custom properties (dark/light theme)
├── layout.css      # Grid, nav, page structure
├── components.css  # Cards, tables, buttons
├── charts.css      # Chart containers, legends
└── responsive.css  # Media queries
```
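A sketch of what `variables.css` might contain so both themes live in one place (token names and color values are assumptions):

```css
/* variables.css (sketch) — theme tokens consumed by the other sheets. */
:root {
  --bg: #0f1419;
  --surface: #1a2129;
  --text: #e6e8ea;
  --accent: #22c55e;
  --danger: #ef4444;
}

/* Light theme overrides, toggled via a data attribute on <html>. */
[data-theme='light'] {
  --bg: #ffffff;
  --surface: #f4f6f8;
  --text: #111827;
}
```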
9. Infrastructure & DevOps
Docker Compose Update
```yaml
version: '3.8'

services:
  bi-ccc:
    build: .
    container_name: bi-ccc
    ports:
      - "3080:3080"
    environment:
      - MYSQL_URL=${MYSQL_URL}
      - USER_MYSQL=${USER_MYSQL}
      - PW_MYSQL=${PW_MYSQL}
      - SESSION_SECRET=${SESSION_SECRET}
      - CLICKHOUSE_URL=http://clickhouse:8123
      - CLICKHOUSE_DB=bi_analytics
      - SLACK_WEBHOOK_URL=${SLACK_WEBHOOK_URL}
      - ALERT_EMAIL_TO=${ALERT_EMAIL_TO}
    depends_on:
      - clickhouse
    volumes:
      - ./data:/app/data
    cap_add:
      - NET_ADMIN
      - SYS_RESOURCE

  clickhouse:
    image: clickhouse/clickhouse-server:24.1
    container_name: bi-clickhouse
    ports:
      - "8123:8123"
    volumes:
      - clickhouse-data:/var/lib/clickhouse
    environment:
      - CLICKHOUSE_DB=bi_analytics
      - CLICKHOUSE_USER=${CLICKHOUSE_USER}
      - CLICKHOUSE_PASSWORD=${CLICKHOUSE_PASSWORD}

volumes:
  clickhouse-data:
```
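The compose file reads its secrets from `.env`; an illustrative template of the expected keys (all values are placeholders; never commit real credentials):

```ini
# .env (placeholders only — keep this file out of version control)
MYSQL_URL=mysql-host.example.internal
USER_MYSQL=bi_readonly
PW_MYSQL=change-me
SESSION_SECRET=change-me-long-random
SLACK_WEBHOOK_URL=https://hooks.slack.com/services/XXX/YYY/ZZZ
ALERT_EMAIL_TO=ops@example.com
CLICKHOUSE_USER=bi_etl
CLICKHOUSE_PASSWORD=change-me
```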
New Dependencies
```json
{
  "dependencies": {
    "node-cron": "^3.0.3",
    "nodemailer": "^6.9.8",
    "puppeteer": "^22.0.0",
    "exceljs": "^4.4.0",
    "@clickhouse/client": "^0.2.10"
  }
}
```
Monitoring
Add basic health check endpoint:
```javascript
app.get('/health', (req, res) => {
  const checks = {
    mysql: pool ? 'connected' : 'disconnected',
    clickhouse: chPool ? 'connected' : 'disconnected',
    uptime: process.uptime(),
    memory: process.memoryUsage(),
    cache: cache.stats()
  };
  const healthy = checks.mysql === 'connected';
  res.status(healthy ? 200 : 503).json(checks);
});
```
10. Appendix — SQL Templates
A. Net Revenue Retention (NRR)
```sql
WITH monthly_revenue AS (
  SELECT
    id_conta,
    DATE_FORMAT(created_at, '%Y-%m') as month,
    SUM(revenue) as mrr
  FROM bi_analytics.fact_payin
  WHERE status = 'finalizado'
  GROUP BY id_conta, DATE_FORMAT(created_at, '%Y-%m')
),
retention AS (
  -- Anchor on the base month (b) so churned clients, who have no row in
  -- the following month, are still counted.
  SELECT
    b.month as base_month,
    SUM(b.mrr) as start_mrr,
    SUM(CASE WHEN a.mrr > b.mrr THEN a.mrr - b.mrr ELSE 0 END) as expansion,
    SUM(CASE WHEN a.mrr IS NOT NULL AND a.mrr < b.mrr THEN b.mrr - a.mrr ELSE 0 END) as contraction,
    SUM(CASE WHEN a.mrr IS NULL THEN b.mrr ELSE 0 END) as churned
  FROM monthly_revenue b
  LEFT JOIN monthly_revenue a ON a.id_conta = b.id_conta
    AND a.month = DATE_FORMAT(DATE_ADD(STR_TO_DATE(CONCAT(b.month, '-01'), '%Y-%m-%d'), INTERVAL 1 MONTH), '%Y-%m')
  GROUP BY b.month
)
SELECT
  base_month,
  ROUND((start_mrr + expansion - contraction - churned) / NULLIF(start_mrr, 0) * 100, 1) as nrr_pct
FROM retention
ORDER BY base_month;
```
B. Revenue Concentration (HHI)
```sql
SELECT
  ROUND(SUM(POW(client_share, 2)) * 10000) as hhi_index
FROM (
  SELECT
    id_conta,
    SUM(revenue) / (
      SELECT SUM(revenue) FROM bi_analytics.fact_payin
      WHERE created_at BETWEEN ? AND ?
        AND status = 'finalizado'
    ) as client_share
  FROM bi_analytics.fact_payin
  WHERE created_at BETWEEN ? AND ?
    AND status = 'finalizado'
  GROUP BY id_conta
) sub;

-- HHI < 1500 = competitive, 1500-2500 = moderate concentration, > 2500 = high concentration
```
C. Cross-Product Adoption
```sql
SELECT
  products_used,
  COUNT(*) as client_count,
  ROUND(COUNT(*) / SUM(COUNT(*)) OVER () * 100, 1) as pct
FROM (
  SELECT
    id_conta,
    COUNT(DISTINCT product) as products_used
  FROM (
    SELECT id_conta,
           CASE WHEN is_checkout = 1 THEN 'checkout' ELSE 'cambioPay' END as product
    FROM bi_analytics.fact_payin
    WHERE created_at >= DATE_SUB(NOW(), INTERVAL 90 DAY)
    UNION ALL
    SELECT id_conta, 'cambioPayReverso' as product
    FROM bi_analytics.fact_payout
    WHERE created_at >= DATE_SUB(NOW(), INTERVAL 90 DAY)
  ) all_products
  GROUP BY id_conta
) sub
GROUP BY products_used
ORDER BY products_used;
```
D. Structuring Detection
```sql
SELECT
  c.nome as client_name,
  t.id_conta,
  DATE(t.created_at) as tx_date,
  COUNT(*) as tx_count,
  ROUND(SUM(t.amount_usd), 2) as daily_total,
  ROUND(AVG(t.amount_usd), 2) as avg_amount,
  ROUND(STDDEV(t.amount_usd), 2) as stddev_amount,
  GROUP_CONCAT(ROUND(t.amount_usd, 2) ORDER BY t.created_at) as amounts
FROM br_transaction_to_usa t
JOIN conta c ON c.id_conta = t.id_conta
WHERE t.status = 'finalizado'
  AND t.amount_usd BETWEEN 2000 AND 3000
  AND t.created_at BETWEEN ? AND ?
GROUP BY t.id_conta, c.nome, DATE(t.created_at)
HAVING tx_count >= 2 AND daily_total > 3000
ORDER BY daily_total DESC;
```
Deliverables Checklist
Phase 1 (Weeks 1-4)
- [ ] Alert engine (`src/alerts/`) with 5+ rules configured
- [ ] Slack webhook integrated and tested
- [ ] PDF/Excel export on the existing dashboards
- [ ] Working Provider Performance dashboard
- [ ] Failed Transaction view
- [ ] Success Rate + Refund Rate on the Corporate Dashboard
- [ ] `queries.js` split into modules (facade kept)
Phase 2 (Weeks 5-10)
- [ ] ClickHouse running in Docker Compose
- [ ] ETL pipeline syncing payin + payout
- [ ] Daily aggregates generated automatically
- [ ] Daily client health snapshots
- [ ] Data quality checks passing
- [ ] BI-CCC reading from ClickHouse (no longer hitting RDS directly)
Phase 3 (Weeks 11-16)
- [ ] Compliance dashboard with threshold monitoring
- [ ] Working structuring detection
- [ ] IOF reconciliation view
- [ ] Volume forecast surfaced in BI Executive
- [ ] Churn prediction replacing the static Health Score
- [ ] NRR, LTV and Take Rate in BI Executive
Phase 4 (Weeks 17-24)
- [ ] Scheduled reports (daily/weekly/monthly) being delivered
- [ ] Commission module integrated into BI-CCC
- [ ] Revenue forecast for board reporting
- [ ] Front end migrated to components + htmx
- [ ] Working EN/PT-BR i18n
- [ ] API documented
- [ ] Audit logging of data access
Note: This guide assumes the team keeps the Node.js/Express stack. If a decision is made to move the data backend to Python (FastAPI) or Go, the queries and business logic carry over; only the service layer changes.