
docs and tooling: firewall syslog test, dedup command, README updates

- Add scripts/test-firewall-syslog.py: sends FortiGate-style syslog UDP to
  Wazuh port 514 with 10 scenarios; supports --via-docker to preserve
  source IP through Docker NAT
- run-combined-stack.sh: add dedup command (fix missing elif branch so it
  no longer falls through to run_all); add recreate command
- wazuh_manager.conf: add 7 firewall allowed-ips, enable logall/logall_json
- scripts/README.md: document test-firewall-syslog.py, seed-kpi-test-data.py,
  new dashboard NDJSON files
- README.md: full rewrite covering all commands, KPI dashboard, current
  endpoint list, macOS bind-mount note

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
tum committed 2 days ago
Commit 01cddaf9d8
5 files changed: 540 insertions, 49 deletions

1. README.md (+150 −47)
2. run-combined-stack.sh (+89 −0)
3. scripts/README.md (+53 −0)
4. scripts/test-firewall-syslog.py (+244 −0)
5. wazuh-docker/single-node/config/wazuh_cluster/wazuh_manager.conf (+4 −2)

README.md (+150 −47)

@@ ... @@
 This repository runs a combined SOC lab with:
 
-- `wazuh-docker` (single-node)
-- `iris-web`
-- `Shuffle`
-- `pagerduty-stub`
-- `soc-integrator` (FastAPI)
+- `wazuh-docker` (single-node) — SIEM, log ingestion, rule engine
+- `iris-web` — case and alert management (DFIR-IRIS)
+- `Shuffle` — SOAR workflow automation
+- `pagerduty-stub` — mock PagerDuty escalation endpoint
+- `soc-integrator` (FastAPI) — KPI enrichment, IOC analysis, orchestration
+- `flask-openapi-shuffle` — OpenAPI demo for Shuffle integration
 
 All services are connected through a shared Docker network (`soc_shared`).
 
@@ ... @@
 - Docker + Docker Compose plugin
 - Bash
-- `nc` (for test event script)
+- Python 3 (for test/seed scripts)
 
 ## Quick Start
 
@@ ... @@
 ./run-combined-stack.sh status
 ```
 
+## Stack Management
+
+```bash
+./run-combined-stack.sh <command> [target] [options]
+```
+
+| Command | Description |
+|---|---|
+| `up [target] [-d]` | Start services (default: all, detached) |
+| `down <target>` | Stop services (target required) |
+| `recreate [target]` | Force-recreate containers (picks up bind-mount changes) |
+| `logs [target] [-f]` | View logs |
+| `status` | Show container and endpoint status |
+| `dedup` | Remove duplicate OpenSearch index patterns in Wazuh dashboard |
+| `cleanup [--with-volumes]` | Prune stopped containers, unused images, builder cache |
+| `help` | Show full usage |
+
+Targets: `all` / `--all`, `wazuh`, `iris`, `shuffle`, `pagerduty`, `integrator`, `flask-openapi-shuffle`
+
+Examples:
+
+```bash
+./run-combined-stack.sh up iris -d
+./run-combined-stack.sh recreate wazuh
+./run-combined-stack.sh recreate --all
+./run-combined-stack.sh down shuffle
+./run-combined-stack.sh logs integrator -f
+./run-combined-stack.sh dedup
+./run-combined-stack.sh cleanup --with-volumes
+```
+
+> **macOS bind mount note**: After editing config files on the host, run `recreate` to ensure the
+> container picks up the new inode. The `Edit`/`Write` tools create new inodes on macOS, which Docker
+> does not automatically detect.
+
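The inode behaviour behind the macOS note can be reproduced with a short, platform-neutral Python sketch. This is an illustration only: it mimics what atomic-save editors and tools do (write a temp file, rename over the original), which is why a container holding a bind mount to the old file can keep reading stale content.

```python
import os
import tempfile

# Create a config file and record its inode.
workdir = tempfile.mkdtemp()
path = os.path.join(workdir, "config.yml")
with open(path, "w") as f:
    f.write("retention: 7\n")
inode_before = os.stat(path).st_ino

# Simulate an atomic save: write a temp file, then rename it over the original.
# The path now points at a brand-new inode; anything still holding the old
# inode (such as a single-file bind mount) keeps seeing the old content.
with open(path + ".tmp", "w") as f:
    f.write("retention: 30\n")
os.replace(path + ".tmp", path)
inode_after = os.stat(path).st_ino

print(inode_before != inode_after)  # → True
```

Force-recreating the container re-resolves the path, which is exactly what the `recreate` command does.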
@@ ... @@
 ## Service URLs
 
-- Wazuh Dashboard: `https://localhost`
-- Wazuh API: `https://localhost:55000`
-- IRIS-web: `https://localhost:8443`
-- Shuffle UI: `http://localhost:3001`
-- PagerDuty Stub: `http://localhost:18080`
-- SOC Integrator API: `http://localhost:8088`
-- SOC Integrator Swagger: `http://localhost:8088/docs`
+| Service | URL |
+|---|---|
+| Wazuh Dashboard | `https://localhost` |
+| Wazuh API | `https://localhost:55000` |
+| IRIS-web | `https://localhost:8443` |
+| IRIS KPI Dashboard | `https://localhost:8443/kpi-dashboard` |
+| Shuffle UI | `http://localhost:3001` |
+| PagerDuty Stub | `http://localhost:18080` |
+| SOC Integrator API | `http://localhost:8088` |
+| SOC Integrator Swagger | `http://localhost:8088/docs` |
 
 ## SOC Integrator
 
-Key env file:
-
-- `soc-integrator/.env`
-
-Main sections:
-
-- Legacy integration APIs (`/wazuh/*`, `/shuffle/*`, `/action/*`)
-- MVP orchestration APIs (`/mvp/*`)
-- Wazuh-to-MVP sync API (`/wazuh/sync-to-mvp`)
-- Wazuh auto-sync status API (`/wazuh/auto-sync/status`)
-
-### MVP endpoints
-
-- `POST /mvp/incidents/ingest`
-- `POST /mvp/ioc/evaluate`
-- `POST /mvp/vpn/evaluate`
-- `GET /mvp/config/policies`
-- `PUT /mvp/config/policies`
-- `GET /mvp/health/dependencies`
+Key env file: `soc-integrator/.env`
+
+### IRIS KPI endpoints
+
+These endpoints fetch IRIS data and enrich each record with live SLA/KPI metrics:
+
+```
+GET  /iris/alerts                 List alerts with KPI
+GET  /iris/alerts/{id}            Single alert with KPI
+POST /iris/alerts/{id}/assign     Assign alert
+GET  /iris/alerts/export-csv      Export alerts as CSV
+GET  /iris/cases                  List cases with KPI
+GET  /iris/cases/{id}             Single case with KPI
+GET  /iris/cases/export-csv       Export cases as CSV
+```
+
+KPI is computed per alert/case:
+
+```
+elapsed_pct = (now − created_at) / sla_seconds × 100
+kpi_pct     = 100 − elapsed_pct  (clamped 0–100)
+
+SLA by severity:   High → 4 h   Medium → 8 h   Low → 24 h
+
+Status thresholds: On Track ≥ 80 | Watch ≥ 60 | Warning ≥ 40 | Urgent ≥ 20 | Critical > 0 | Breached
+Resolved alerts:   elapsed frozen at resolution time, status = "Resolved"
+```
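The KPI arithmetic above can be sketched in a few lines of Python. This is an illustration of the documented formula only; the function name, parameters, and record shape are assumptions, not the integrator's actual code:

```python
from datetime import datetime, timedelta, timezone

# SLA budget per severity, from the table above
SLA_SECONDS = {"High": 4 * 3600, "Medium": 8 * 3600, "Low": 24 * 3600}

def kpi(created_at, severity, now=None, resolved_at=None):
    """Return (kpi_pct, status); names are illustrative, not the real schema."""
    # For resolved records, elapsed time is frozen at resolution time.
    now = resolved_at or now or datetime.now(timezone.utc)
    elapsed_pct = (now - created_at).total_seconds() / SLA_SECONDS[severity] * 100
    kpi_pct = max(0.0, min(100.0, 100.0 - elapsed_pct))
    if resolved_at is not None:
        return kpi_pct, "Resolved"
    # Thresholds checked from highest to lowest
    for floor, status in [(80, "On Track"), (60, "Watch"), (40, "Warning"), (20, "Urgent")]:
        if kpi_pct >= floor:
            return kpi_pct, status
    return kpi_pct, "Critical" if kpi_pct > 0 else "Breached"

# High-severity alert created 1 hour ago: 1 h of a 4 h SLA used,
# so roughly 75% of the budget remains and the status reads "Watch".
created = datetime.now(timezone.utc) - timedelta(hours=1)
print(kpi(created, "High"))
```

Note that only a fully exhausted budget (clamped to 0) reads as Breached; any positive remainder below 20 is Critical.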
@@ ... @@
-Protected endpoints require:
-
-- Header: `X-Internal-API-Key`
-- Key from: `SOC_INTEGRATOR_INTERNAL_KEY` in `soc-integrator/.env`
+### MVP orchestration endpoints
+
+```
+POST /mvp/incidents/ingest
+POST /mvp/ioc/evaluate
+POST /mvp/vpn/evaluate
+GET  /mvp/config/policies
+PUT  /mvp/config/policies
+GET  /mvp/health/dependencies
+```
+
+Protected endpoints require header: `X-Internal-API-Key`
+Key from: `SOC_INTEGRATOR_INTERNAL_KEY` in `soc-integrator/.env`
+
+### Other endpoints
+
+```
+GET  /health
+GET  /wazuh/alerts
+GET  /wazuh/agents
+POST /wazuh/sync-to-mvp
+GET  /wazuh/auto-sync/status
+POST /ingest/wazuh-alert
+GET  /ioc/enrich
+POST /ioc/evaluate
+GET  /geoip/{ip}
+POST /action/create-incident
+POST /action/create-iris-case
+POST /action/trigger-shuffle
+GET/POST /shuffle/workflows
+GET  /sim/logs/runs
+POST /sim/logs/start
+```
 
 ### Example: MVP ingest
 
@@ ... @@
   }'
 ```
 
-## Test Events to Wazuh
+## Sending Test Events to Wazuh
 
-Send synthetic events via syslog UDP 514:
+### Appendix A/B/C simulation logs
+
+Replay production-style sample logs via syslog UDP 514:
 
 ```bash
-scripts/send-wazuh-test-events.sh all
+scripts/send-wazuh-sim-logs.sh all 1 0.2
+scripts/send-wazuh-sim-logs.sh a2 1 0
+scripts/send-wazuh-sim-logs.sh B3-06 1 0
+scripts/send-wazuh-sim-logs.sh all 1 0 --dry-run
 ```
 
-Scenarios:
-
-- `ioc_dns`
-- `ioc_ips`
-- `vpn_outside_th`
-- `windows_auth_fail`
-- `all`
-
-See `scripts/README.md` for details.
+See `scripts/README.md` for full selector/flag reference.
+
+### FortiGate firewall syslog test
+
+Send FortiGate-style syslog messages directly to Wazuh port 514/UDP:
+
+```bash
+python3 scripts/test-firewall-syslog.py --via-docker
+python3 scripts/test-firewall-syslog.py --scenario rdp --via-docker
+```
+
+The `--via-docker` flag sends from inside the container to preserve the firewall source IP
+through Docker NAT. Source IP must be in the `allowed-ips` list in `wazuh_manager.conf`.
 
-Sync Wazuh alerts from indexer into MVP pipeline:
+### Sync Wazuh alerts into MVP pipeline
 
 ```bash
 curl -X POST "http://localhost:8088/wazuh/sync-to-mvp?limit=50&minutes=120&q=*" \
@@ ... @@
 
 Notes:
 
-- This sync reads from `wazuh-alerts-*` in Wazuh indexer.
-- Re-running sync is safe; dedupe is applied by `source + event_id`.
-- Your `send-wazuh-test-events.sh` traffic appears only after Wazuh rules generate alerts.
+- Reads from `wazuh-alerts-*` in Wazuh indexer.
+- Re-running is safe — dedupe applied by `source + event_id`.
+- Wazuh must fire rules before alerts appear (check `archives.log` first).
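The idempotency noted above (dedupe by `source + event_id`) can be sketched as a composite-key insert. This is a minimal illustration with made-up record fields, not the integrator's actual store:

```python
# Minimal sketch of idempotent ingestion keyed on (source, event_id).
# The in-memory dict stands in for whatever persistence the sync uses.
store: dict[tuple[str, str], dict] = {}

def ingest(record: dict) -> bool:
    """Insert a record once; re-running the same sync batch is a no-op."""
    key = (record["source"], record["event_id"])
    if key in store:
        return False  # duplicate: already synced, skip
    store[key] = record
    return True

batch = [
    {"source": "wazuh", "event_id": "evt-1", "rule": "5710"},
    {"source": "wazuh", "event_id": "evt-2", "rule": "87801"},
]
inserted = sum(ingest(r) for r in batch)   # first run inserts both
rerun = sum(ingest(r) for r in batch)      # second run inserts nothing
print(inserted, rerun)  # → 2 0
```

Because the key is derived from the alert itself, replaying the same time window with different `limit`/`minutes` values cannot create duplicates.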
 
-Enable automatic sync worker:
+### Enable automatic sync worker
 
 ```bash
 sed -i 's/^WAZUH_AUTO_SYNC_ENABLED=.*/WAZUH_AUTO_SYNC_ENABLED=true/' soc-integrator/.env
 ./run-combined-stack.sh up integrator --build -d
-./run-combined-stack.sh logs integrator -f
 ```
 
 Auto-sync settings in `soc-integrator/.env`:
@@ ... @@
 - `WAZUH_AUTO_SYNC_LIMIT` (default `50`)
 - `WAZUH_AUTO_SYNC_MINUTES` (default `120`)
 
+## KPI Dashboard
+
+The KPI dashboard is embedded inside IRIS at `/kpi-dashboard`.
+
+It shows alerts and cases with live SLA progress (colour-coded gauge), status, owner, and severity.
+The page auto-refreshes every 60 seconds.
+
+To seed test data covering every KPI state:
+
+```bash
+IRIS_API_KEY=<key> python3 scripts/seed-kpi-test-data.py
+```
+
+Find your API key in IRIS → My Profile.
+
 ## Logs
 
 All logs (non-follow):
@@ ... @@
 ## Notes
 
 - MVP escalation is wired to `pagerduty-stub` (not real PagerDuty).
-- IRIS-web is used as case management backend (replacing DFIRTrack).
+- IRIS-web is used as case management backend.
+- After `recreate wazuh`, run `./run-combined-stack.sh dedup` to remove duplicate index patterns
+  created by the Wazuh dashboard post-init script.
+- Wazuh archive logging requires `<logall>yes</logall>` in `wazuh_manager.conf`
+  (already enabled); archives appear in `/var/ossec/logs/archives/archives.log`.

run-combined-stack.sh (+89 −0)

@@ ... @@
 
 Commands:
   up           Start services (default: up -d when no args)
+  recreate     Force-recreate containers (picks up bind-mount inode changes)
+  dedup        Remove duplicate OpenSearch index patterns in Wazuh dashboard
   down         Stop services (requires explicit target)
   logs         View logs
   status       Show container and endpoint status
@@ ... @@
   ./run-combined-stack.sh up --all -d
   ./run-combined-stack.sh up iris -d
   ./run-combined-stack.sh up flask-openapi-shuffle -d
+  ./run-combined-stack.sh recreate wazuh
+  ./run-combined-stack.sh recreate --all
+  ./run-combined-stack.sh dedup
   ./run-combined-stack.sh down shuffle
   ./run-combined-stack.sh down --all
   ./run-combined-stack.sh logs integrator -f
@@ ... @@
   exec "${ROOT_DIR}/soc-status.sh"
 fi
 
+dedup_index_patterns() {
+  local dashboard_url="https://localhost:443"
+  local user="kibanaserver"
+  local pass="kibanaserver"
+  local max_wait=60
+  local waited=0
+
+  echo "Waiting for Wazuh dashboard to be ready..."
+  until curl -sk -u "${user}:${pass}" "${dashboard_url}/api/status" -o /dev/null 2>&1; do
+    sleep 3
+    waited=$((waited + 3))
+    if [[ ${waited} -ge ${max_wait} ]]; then
+      echo "Dashboard not ready after ${max_wait}s — skipping dedup."
+      return 1
+    fi
+  done
+
+  echo "Scanning for duplicate index patterns..."
+  python3 - <<PYEOF
+import json, sys, urllib.request, urllib.error, ssl
+
+BASE = "${dashboard_url}"
+AUTH = ("${user}", "${pass}")
+ctx  = ssl.create_default_context(); ctx.check_hostname = False; ctx.verify_mode = ssl.CERT_NONE
+
+def req(method, path, data=None):
+    import base64
+    token = base64.b64encode(f"{AUTH[0]}:{AUTH[1]}".encode()).decode()
+    headers = {"osd-xsrf": "true", "Authorization": f"Basic {token}"}
+    if data:
+        headers["Content-Type"] = "application/json"
+    r = urllib.request.Request(BASE + path, data=data, headers=headers, method=method)
+    with urllib.request.urlopen(r, context=ctx, timeout=15) as resp:
+        return json.loads(resp.read())
+
+# Fetch all index-pattern saved objects
+result = req("GET", "/api/saved_objects/_find?type=index-pattern&per_page=100")
+patterns = result.get("saved_objects", [])
+
+# Group by title
+from collections import defaultdict
+by_title = defaultdict(list)
+for p in patterns:
+    by_title[p["attributes"]["title"]].append(p)
+
+deleted = 0
+for title, objs in by_title.items():
+    if len(objs) <= 1:
+        continue
+    # Keep the one whose ID matches the title (canonical), or the oldest updated_at
+    canonical = next((o for o in objs if o["id"] == title), None)
+    if not canonical:
+        canonical = sorted(objs, key=lambda o: o.get("updated_at", ""))[0]
+    to_delete = [o for o in objs if o["id"] != canonical["id"]]
+    for obj in to_delete:
+        try:
+            req("DELETE", f"/api/saved_objects/index-pattern/{obj['id']}")
+            print(f"  deleted  [{obj['id']}]  title='{title}'")
+            deleted += 1
+        except urllib.error.HTTPError as e:
+            print(f"  error deleting [{obj['id']}]: {e}")
+
+if deleted == 0:
+    print("  no duplicates found.")
+else:
+    print(f"  removed {deleted} duplicate(s).")
+PYEOF
+}
+
 run_cleanup() {
   local with_volumes="${1:-false}"
 
@@ ... @@
       run_all "up"
       ;;
   esac
+elif [[ "${COMMAND}" == "recreate" ]]; then
+  TARGET="${1:-all}"
+  COMMAND="up"
+  case "${TARGET}" in
+    wazuh|iris|shuffle|pagerduty|integrator|flask-openapi-shuffle)
+      ARGS=("--force-recreate" "-d")
+      run_target "${TARGET}"
+      ;;
+    all|--all|*)
+      ARGS=("--force-recreate" "-d")
+      run_all "up"
+      ;;
+  esac
 elif [[ "${COMMAND}" == "cleanup" ]]; then
   WITH_VOLUMES="false"
   for arg in ${ARGS[@]+"${ARGS[@]}"}; do
@@ ... @@
     esac
   done
   run_cleanup "${WITH_VOLUMES}"
+elif [[ "${COMMAND}" == "dedup" ]]; then
+  dedup_index_patterns
 else
   run_all "up"
 fi

scripts/README.md (+53 −0)

@@ ... @@
 - `samples/appendix-b-production-samples.log`
 - `samples/appendix-c-production-samples.log`
 
+## Firewall syslog test
+
+Send FortiGate-style syslog messages to Wazuh manager port 514/UDP to test firewall log ingestion.
+
+```bash
+python3 scripts/test-firewall-syslog.py [--host HOST] [--port PORT] [--src-ip IP] [--scenario SCENARIO]
+python3 scripts/test-firewall-syslog.py --via-docker   # send from inside container (avoids NAT)
+```
+
+Examples:
+
+```bash
+python3 scripts/test-firewall-syslog.py                         # send all scenarios from localhost
+python3 scripts/test-firewall-syslog.py --via-docker            # recommended: avoids Docker NAT source-IP rewrite
+python3 scripts/test-firewall-syslog.py --scenario rdp
+python3 scripts/test-firewall-syslog.py --scenario all --delay 0.5 --repeat 3
+python3 scripts/test-firewall-syslog.py --host 192.168.1.10 --src-ip 172.16.22.253
+```
+
+Available scenarios: `rdp`, `password_change`, `create_admin`, `disable_alert`, `download_config`,
+`ips_critical`, `port_scan`, `ioc_ip`, `traffic_allow`, `traffic_deny`, `all`
+
+Arguments:
+
+- `--host` — Wazuh manager host (default `127.0.0.1`)
+- `--port` — Syslog UDP port (default `514`)
+- `--src-ip` — Simulated firewall source IP, must be in `allowed-ips` list (default `172.16.22.253`)
+- `--delay` — Delay between messages in seconds (default `0.2`)
+- `--repeat` — Number of times to repeat each scenario (default `1`)
+- `--via-docker` — Execute inside the Wazuh container to preserve source IP through Docker NAT
+
+Verify receipt:
+
+```bash
+docker exec wazuh-single-wazuh.manager-1 tail -f /var/ossec/logs/archives/archives.log | grep 172.16.22.253
+```
+
 ## Dashboard import
 
 Import Wazuh dashboards (NDJSON):
@@ ... @@
 scripts/import-wazuh-dashboard.sh scripts/events/wazuh-proposal-appendix-ab-dashboard.ndjson
 scripts/import-wazuh-dashboard.sh scripts/events/wazuh-proposal-appendix-c-dashboard.ndjson
 scripts/import-wazuh-dashboard.sh scripts/events/wazuh-client-agents-dashboard.ndjson
+scripts/import-wazuh-dashboard.sh scripts/events/wazuh-fortigate-sim-dashboard.ndjson
+scripts/import-wazuh-dashboard.sh scripts/events/wazuh-proposal-custom-rules-dashboard.ndjson
+```
+
+## KPI test data seeder
+
+Create IRIS alerts and cases covering every KPI state for UI testing.
+
+```bash
+python3 scripts/seed-kpi-test-data.py [--alerts-only] [--cases-only] [--dry-run]
 ```
 
+Environment variables:
+
+- `IRIS_BASE_URL` — default `https://localhost:8443`
+- `IRIS_API_KEY` — required (find in IRIS → My Profile → API key)
+
 ## Other helpers
 
 - `seed-iris-demo-data.sh`: seed IRIS demo cases/tasks via API.
@@ ... @@
 
 - Legacy `send-wazuh-*` simulator scripts were removed and replaced by `send-wazuh-sim-logs.sh`.
 - If you add new sample events, keep comments tagged with use-case IDs (for example `# A2-01 ...`) so selector filtering keeps working.
+- Wazuh must have `<logall>yes</logall>` set in `wazuh_manager.conf` for archives.log to be populated.

scripts/test-firewall-syslog.py (+244 −0, new file)

+#!/usr/bin/env python3
+"""
+test-firewall-syslog.py — Send FortiGate-style syslog test messages to Wazuh manager port 514/UDP.
+
+Usage:
+  python3 scripts/test-firewall-syslog.py [--host HOST] [--port PORT] [--src-ip IP] [--scenario SCENARIO]
+  python3 scripts/test-firewall-syslog.py --via-docker          # send from inside container (avoids NAT)
+
+Examples:
+  python3 scripts/test-firewall-syslog.py
+  python3 scripts/test-firewall-syslog.py --via-docker
+  python3 scripts/test-firewall-syslog.py --host 192.168.1.10 --src-ip 172.16.22.253
+  python3 scripts/test-firewall-syslog.py --scenario rdp
+  python3 scripts/test-firewall-syslog.py --scenario all --delay 0.5
+"""
+
+import argparse
+import socket
+import subprocess
+import time
+import datetime
+
+# ── Default target ────────────────────────────────────────────────────────────
+DEFAULT_HOST = "127.0.0.1"
+DEFAULT_PORT = 514
+DEFAULT_SRC_IP = "172.16.22.253"   # must be in allowed-ips list
+WAZUH_CONTAINER = "wazuh-single-wazuh.manager-1"
+
+# ── FortiGate syslog scenarios ────────────────────────────────────────────────
+def _ts():
+    now = datetime.datetime.now(datetime.timezone.utc)
+    return now.strftime("%Y-%m-%d"), now.strftime("%H:%M:%S")
+
+def make_fortigate_log(src_ip, **fields):
+    """Build a FortiGate v6 syslog line (key=value pairs)."""
+    date, time_ = _ts()
+    base = {
+        "date": date,
+        "time": time_,
+        "devname": "FG-TEST-FW",
+        "devid": "FGT60E0000000001",
+        "logid": "0000000013",
+        "type": "traffic",
+        "subtype": "forward",
+        "level": "notice",
+        "vd": "root",
+        "srcip": src_ip,
+        "srcport": "54321",
+        "srcintf": "port1",
+        "dstip": "10.0.0.1",
+        "dstport": "80",
+        "dstintf": "port2",
+        "policyid": "1",
+        "proto": "6",
+        "action": "accept",
+        "service": "HTTP",
+        "duration": "1",
+        "sentbyte": "1024",
+        "rcvdbyte": "2048",
+    }
+    base.update(fields)
+    parts = []
+    for k, v in base.items():
+        if " " in str(v) or "=" in str(v):
+            parts.append(f'{k}="{v}"')
+        else:
+            parts.append(f"{k}={v}")
+    return " ".join(parts)
+
+SCENARIOS = {
+    # A2-01: RDP traffic allowed
+    "rdp": {
+        "description": "A2-01 — RDP (3389) traffic allowed",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0000000013", type="traffic", subtype="forward",
+            dstport="3389", action="accept", service="RDP",
+        ),
+    },
+    # A2-02: Admin password change
+    "password_change": {
+        "description": "A2-02 — Admin password changed",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0100032001", type="event", subtype="system",
+            level="information", action="password-change",
+            user="admin", msg="Change admin password",
+        ),
+    },
+    # A2-03: New admin account created
+    "create_admin": {
+        "description": "A2-03 — New admin account created",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0100032002", type="event", subtype="system",
+            level="information", action="create-admin",
+            user="newadmin", msg="Create admin account",
+        ),
+    },
+    # A2-04: Alerting disabled via config change
+    "disable_alert": {
+        "description": "A2-04 — Alerting disabled (config-change)",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0100032003", type="event", subtype="system",
+            level="warning", action="config-change",
+            config_value="disable", msg="Email alerting disabled",
+        ),
+    },
+    # A2-05: Config file downloaded
+    "download_config": {
+        "description": "A2-05 — Firewall config downloaded",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0100032004", type="event", subtype="system",
+            level="warning", action="download-config",
+            user="admin", msg="Configuration backup downloaded",
+        ),
+    },
+    # A2-06: Multiple critical IPS signatures
+    "ips_critical": {
+        "description": "A2-06 — Multiple critical IPS signatures",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0419016384", type="utm", subtype="ips",
+            level="critical", action="dropped",
+            attack="Multiple.Critical.Signatures", severity="critical",
+            srcip="203.0.113.42", dstip="172.16.1.10",
+        ),
+    },
+    # A2-07: TCP port scan
+    "port_scan": {
+        "description": "A2-07 — TCP port scan detected",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0419016385", type="utm", subtype="anomaly",
+            level="critical", action="dropped",
+            attack="TCP.Port.Scan", severity="critical",
+            srcip="203.0.113.99", dstip="172.16.0.0",
+        ),
+    },
+    # A2-08: IPS IOC-based IP indicator
+    "ioc_ip": {
+        "description": "A2-08 — IPS IOC-based IP indicator",
+        "log": lambda src: make_fortigate_log(src,
+            logid="0419016386", type="utm", subtype="ips",
+            level="critical", action="blocked",
+            ioc_type="ip", ioc_value="198.51.100.42",
+            srcip="198.51.100.42", dstip="172.16.1.5",
+        ),
+    },
+    # Generic traffic allow
+    "traffic_allow": {
+        "description": "Generic — Traffic allowed (HTTP)",
+        "log": lambda src: make_fortigate_log(src,
+            dstport="80", action="accept", service="HTTP",
+        ),
+    },
+    # Generic traffic deny
+    "traffic_deny": {
+        "description": "Generic — Traffic denied",
+        "log": lambda src: make_fortigate_log(src,
+            dstport="22", action="deny", service="SSH",
+        ),
+    },
+}
+
+
+def _build_syslog_packet(message, src_ip, facility=16, severity=6):
+    pri = (facility * 8) + severity
+    _, time_ = _ts()
+    month_day = datetime.datetime.now(datetime.timezone.utc).strftime("%b %d").replace(" 0", "  ")
+    return f"<{pri}>{month_day} {time_} {src_ip} {message}"
+
+
+def send_syslog_udp(host, port, message, src_ip):
+    """Send directly via UDP socket (source IP will appear as Docker bridge on local test)."""
+    packet = _build_syslog_packet(message, src_ip)
+    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
+    try:
+        sock.sendto(packet.encode("utf-8"), (host, port))
+        return True
+    except OSError:
+        return False
+    finally:
+        sock.close()
+
+
+def send_syslog_via_docker(message, src_ip, port):
+    """Send UDP from inside the Wazuh container via docker exec + python3 — source IP is the container's own."""
+    packet = _build_syslog_packet(message, src_ip)
+    py_cmd = (
+        f"import socket; s=socket.socket(socket.AF_INET,socket.SOCK_DGRAM); "
+        f"s.sendto({repr(packet.encode())}, ('127.0.0.1',{port})); s.close()"
+    )
+    result = subprocess.run(
+        ["docker", "exec", WAZUH_CONTAINER, "python3", "-c", py_cmd],
+        capture_output=True, timeout=5,
+    )
+    return result.returncode == 0
+
+
+def run_scenario(name, info, host, port, src_ip, via_docker):
+    log = info["log"](src_ip)
+    if via_docker:
+        ok = send_syslog_via_docker(log, src_ip, port)
+    else:
+        ok = send_syslog_udp(host, port, log, src_ip)
+    status = "✓" if ok else "✗"
+    print(f"  {status}  {name:20s}  {info['description']}")
+    return ok
+
+
+def main():
+    parser = argparse.ArgumentParser(description="Send FortiGate syslog test messages to Wazuh")
+    parser.add_argument("--host",       default=DEFAULT_HOST, help=f"Wazuh manager host (default: {DEFAULT_HOST})")
+    parser.add_argument("--port",       default=DEFAULT_PORT, type=int, help=f"Syslog UDP port (default: {DEFAULT_PORT})")
+    parser.add_argument("--src-ip",     default=DEFAULT_SRC_IP, help=f"Simulated firewall source IP (default: {DEFAULT_SRC_IP})")
+    parser.add_argument("--scenario",   default="all",
+                        choices=list(SCENARIOS.keys()) + ["all"],
+                        help="Scenario to send (default: all)")
+    parser.add_argument("--delay",      default=0.2, type=float, help="Delay between messages in seconds (default: 0.2)")
+    parser.add_argument("--repeat",     default=1, type=int, help="Number of times to repeat each scenario (default: 1)")
+    parser.add_argument("--via-docker", action="store_true",
+                        help="Send from inside the Wazuh container (avoids Docker NAT source-IP rewrite)")
+    args = parser.parse_args()
+
+    mode = f"docker exec {WAZUH_CONTAINER}" if args.via_docker else f"{args.host}:{args.port}"
+    print(f"Wazuh syslog test — mode: {mode}  src-ip: {args.src_ip}")
+    print(f"{'─' * 65}")
+
+    to_run = list(SCENARIOS.items()) if args.scenario == "all" else [(args.scenario, SCENARIOS[args.scenario])]
+    total = ok_count = 0
+
+    for _ in range(args.repeat):
+        for name, info in to_run:
+            success = run_scenario(name, info, args.host, args.port, args.src_ip, args.via_docker)
+            total += 1
+            ok_count += int(success)
+            if args.delay:
+                time.sleep(args.delay)
+
+    print(f"{'─' * 65}")
+    print(f"Sent {ok_count}/{total} messages")
+    print()
+    print("Verify with:")
+    print(f"  docker exec {WAZUH_CONTAINER} tail -f /var/ossec/logs/archives/archives.log | grep {args.src_ip}")
+
+
+if __name__ == "__main__":
+    main()

wazuh-docker/single-node/config/wazuh_cluster/wazuh_manager.conf (+4 −2)

@@ ... @@
   <global>
     <jsonout_output>yes</jsonout_output>
     <alerts_log>yes</alerts_log>
-    <logall>no</logall>
-    <logall_json>no</logall_json>
+    <logall>yes</logall>
+    <logall_json>yes</logall_json>
     <email_notification>no</email_notification>
     <smtp_server>smtp.example.wazuh.com</smtp_server>
     <email_from>wazuh@example.wazuh.com</email_from>
@@ ... @@
     <allowed-ips>172.16.162.1/24</allowed-ips>
     <allowed-ips>172.16.160.253/24</allowed-ips>
     <allowed-ips>172.16.165.254/24</allowed-ips>
+    <allowed-ips>172.19.0.0/16</allowed-ips>   <!-- Docker bridge — local test only -->
+    <allowed-ips>127.0.0.1/32</allowed-ips>    <!-- loopback — local test only -->
   </remote>
 
   <!-- Policy monitoring -->