
# TASK: Add Audit Trigger Watcher to Event Bridge

**Priority:** HIGH (re-audit button creates trigger file but nothing watches for it)
**Agent:** engineer
**Filed:** 2026-02-16
**Filed by:** Doug (via Claude Code session)

---

## The Problem

The field audit web UI has a "Re-run Audit" button on the dashboard. When clicked, it creates:

```
farmiq.ai/field-audit/data/customers/{customer_id}/audit_trigger.json
```

Format:
```json
{
  "requested_at": "2026-02-16T14:30:00",
  "customer_id": "brett",
  "reason": "manual_dashboard"
}
```

**But nothing watches for this file.** The event bridge (`AgentPi/scripts/event_bridge.py`) needs a new check that:
1. Scans for `audit_trigger.json` files across all customer directories
2. Enqueues a data_pipeline job to run the audit engine for that customer
3. Deletes or renames the trigger file so it's not re-processed

---

## What Needs to Happen

### Add `check_audit_triggers()` to event_bridge.py

```python
from pathlib import Path  # JobQueue comes from event_bridge's existing imports

def check_audit_triggers(jq: JobQueue, dry: bool = False) -> int:
    """Scan customer directories for audit_trigger.json files."""
    customers_dir = Path("/data/agentpi/repo/farmiq.ai/field-audit/data/customers")
    # ... or wherever the repo is cloned on the Pi
```

**Logic:**
1. Walk `farmiq.ai/field-audit/data/customers/*/audit_trigger.json`
2. For each trigger file found:
   - Read the JSON (get customer_id, reason, requested_at)
   - Enqueue a `data_pipeline` job with description like: "Run field audit for customer brett (manual request)"
   - Include prompt_context with: customer_id, sources directory path, output directory path
   - Delete or rename trigger file to `audit_trigger.done.json` (prevent re-processing)
3. Register in the main checks list: `("audit_triggers", check_audit_triggers)`
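The steps above could be sketched roughly as follows. This is a sketch, not the final implementation: the `jq.enqueue(...)` call and its keyword arguments are assumptions (the real `JobQueue` interface lives in event_bridge.py), and the `customers_dir` keyword argument is added here only for testability.

```python
import json
import logging
from pathlib import Path

log = logging.getLogger("event_bridge")

# Assumed default; see "Edge Cases" for the repo-path question.
CUSTOMERS_DIR = Path("/data/agentpi/repo/farmiq.ai/field-audit/data/customers")

def check_audit_triggers(jq, dry: bool = False, customers_dir: Path = CUSTOMERS_DIR) -> int:
    """Scan customer directories for audit_trigger.json and enqueue one job each."""
    enqueued = 0
    for trigger in sorted(customers_dir.glob("*/audit_trigger.json")):
        try:
            data = json.loads(trigger.read_text())
            customer = data["customer_id"]
        except (json.JSONDecodeError, KeyError) as exc:
            # Malformed trigger: skip it and log a warning (see Edge Cases).
            log.warning("Skipping malformed trigger %s: %s", trigger, exc)
            continue
        reason = data.get("reason", "unknown")
        if not dry:
            # `jq.enqueue` is a placeholder for whatever the real JobQueue exposes.
            jq.enqueue(
                job_type="data_pipeline",
                description=f"Run field audit for customer {customer} ({reason})",
                prompt_context={
                    "customer_id": customer,
                    "sources_dir": str(trigger.parent / "sources"),
                    "output_dir": str(trigger.parent / "audit_results"),
                },
            )
            # Rename rather than delete so the request stays auditable on disk.
            trigger.rename(trigger.with_name("audit_trigger.done.json"))
        enqueued += 1
    return enqueued
```

Renaming to `audit_trigger.done.json` keeps a record of the request while guaranteeing the glob never matches it again.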

### The data_pipeline agent then runs:

```bash
python FieldNamingAudit/field_naming_audit.py \
    --scan farmiq.ai/field-audit/data/customers/brett/sources/ \
    --output farmiq.ai/field-audit/data/customers/brett/audit_results/audit.json \
    --batch
```

The agent should also check for linked API org directories and include them in the `--scan` paths. The linked orgs are stored in:
```
farmiq.ai/field-audit/data/customers/{customer}/linked_orgs.json
```

Format:
```json
[
  {"platform": "jd", "org_id": "594691", "org_name": "FarmTech, LLC"},
  {"platform": "cnh", "org_id": "ACC000011898", "org_name": "Weist Farms"}
]
```

The actual boundary files are in sibling directories:
- JD: `farmiq.ai/jd-import/data/imports/{org_id}/`
- CNH: `farmiq.ai/cnh-import/data/imports/{org_id}/`
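One way the agent could assemble the full scan list from `linked_orgs.json` is sketched below. The `REPO` constant, the `IMPORT_DIRS` mapping, and the function name are illustrative assumptions; only the directory layout comes from this task.

```python
import json
from pathlib import Path

# Assumed repo root; the directory layout below is from this task's File Locations.
REPO = Path("/data/agentpi/repo")

IMPORT_DIRS = {
    "jd": "farmiq.ai/jd-import/data/imports",
    "cnh": "farmiq.ai/cnh-import/data/imports",
}

def audit_scan_paths(customer: str, repo: Path = REPO) -> list:
    """Build the scan path list: the customer's sources dir plus linked org imports."""
    cust_dir = repo / "farmiq.ai/field-audit/data/customers" / customer
    paths = [cust_dir / "sources"]
    linked = cust_dir / "linked_orgs.json"
    if linked.exists():
        for org in json.loads(linked.read_text()):
            base = IMPORT_DIRS.get(org.get("platform"))
            if base:
                paths.append(repo / base / org["org_id"])
            # Unknown platforms are skipped rather than guessed at.
    return paths
```

Whether these paths go into one `--scan` argument or several depends on the audit engine's CLI, which this task leaves unchanged.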

---

## File Locations

- **Event bridge:** `AgentPi/scripts/event_bridge.py`
- **Trigger files:** `farmiq.ai/field-audit/data/customers/*/audit_trigger.json`
- **Linked orgs:** `farmiq.ai/field-audit/data/customers/*/linked_orgs.json`
- **Audit engine:** `FieldNamingAudit/field_naming_audit.py`

---

## Edge Cases

- Multiple customers might trigger audits simultaneously — each gets its own job
- If audit_trigger.json is malformed, skip it and log a warning
- If sources directory is empty, still run the audit (it might have linked API orgs)
- The repo path on the Pi might be `/data/agentpi/repo/` or `/home/farmtech/Sandbox/` — check both or use a config constant
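The last bullet's "check both" suggestion might look like the sketch below; `find_repo_root` and the candidate list ordering are hypothetical, and a config constant would make this unnecessary.

```python
from pathlib import Path

# Candidate clone locations named in this task; order is an assumption.
REPO_CANDIDATES = [Path("/data/agentpi/repo"), Path("/home/farmtech/Sandbox")]

def find_repo_root(candidates=REPO_CANDIDATES) -> Path:
    """Return the first candidate that actually contains the field-audit tree."""
    for root in candidates:
        if (root / "farmiq.ai/field-audit/data/customers").is_dir():
            return root
    raise FileNotFoundError("field-audit repo not found in any known location")
```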

---

## DO NOT

- Do NOT run the audit engine directly from event_bridge — just enqueue the job
- Do NOT modify the audit engine itself — that's a separate task
- Do NOT change the trigger file format — the web UI writes it as-is
