Moddux · Projects
Investigator Dashboard
Future Product Direction

Behavioral correlation for complex investigative evidence.

A premium case workspace that transforms raw telemetry, profiler outputs, packet captures, and analyst notes into dynamically stored profiles, evidence-linked associations, and reviewable investigative intelligence.

Evidence Intake

Case-scoped ingestion for logs, packet captures, wireless exports, SQLite artifacts, screenshots, notes, and structured CSV datasets with retained provenance and parser diagnostics.
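
The intake step above can be sketched as a hash-addressed record that keeps provenance and parser diagnostics alongside the raw blob. A minimal illustration only; `ingest`, `EvidenceRecord`, and the diagnostic message are hypothetical names, not the product's actual API:

```python
import hashlib
from dataclasses import dataclass, field

@dataclass
class EvidenceRecord:
    """Provenance retained for every ingested artifact."""
    case_id: str
    filename: str
    sha256: str
    parser: str
    diagnostics: list = field(default_factory=list)

def ingest(case_id: str, filename: str, payload: bytes, parser: str) -> EvidenceRecord:
    """Hash-address the raw blob so later findings can cite it immutably."""
    digest = hashlib.sha256(payload).hexdigest()
    record = EvidenceRecord(case_id, filename, digest, parser)
    if not payload:
        # Parser diagnostics ride along with the record instead of being lost.
        record.diagnostics.append("empty payload: parser skipped")
    return record
```

Because the record stores the content hash rather than a mutable path, downstream findings can reference evidence even if files are renamed or moved.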

Behavioral Analysis

Normalization, temporal segmentation, stale probe detection, recurring network patterns, and candidate-profile generation built for analyst review rather than opaque automation.
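
Stale probe detection of the kind described here could, for example, flag SSIDs a device still probes for but has not been observed on within some horizon. A sketch under assumed inputs; the `(ssid, last_seen)` pair shape and the 30-day horizon are illustrative, not the product's actual logic:

```python
from datetime import datetime, timedelta

def stale_probes(probe_log, now, horizon=timedelta(days=30)):
    """Return SSIDs whose last sighting predates `now - horizon`.

    probe_log: iterable of (ssid, last_seen_datetime) pairs.
    Stale entries often reveal networks from a device's past locations.
    """
    cutoff = now - horizon
    return sorted(ssid for ssid, seen in probe_log if seen < cutoff)
```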

Associated Correlation

Cross-signal linkage between MAC, IP, SSID, DNS, HTTP, probe sessions, co-presence windows, and profiler findings with explainable evidence paths.
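
One way to make evidence paths explainable is to carry the justifying signal on every link, so the path between two identifiers is its own audit trail. A hypothetical sketch; the identifiers and edge labels below are invented for illustration:

```python
from collections import defaultdict, deque

def evidence_path(edges, src, dst):
    """Breadth-first search over typed signal links.

    edges: iterable of (node_a, node_b, signal) triples.
    Returns the list of (from, to, signal) hops, or None if unlinked,
    so each returned hop names the evidence that justifies it.
    """
    graph = defaultdict(list)
    for a, b, signal in edges:
        graph[a].append((b, signal))
        graph[b].append((a, signal))
    queue, seen = deque([(src, [])]), {src}
    while queue:
        node, path = queue.popleft()
        if node == dst:
            return path
        for nxt, signal in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [(node, nxt, signal)]))
    return None
```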

Reviewable Outcomes

Confidence-scored candidate associations, profile timelines, evidence cards, and analyst actions for merge, split, suppress, and reporting workflows.
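
The merge, split, and suppress actions might be modeled as audited state transitions on a candidate association, so every analyst decision leaves a trail. A sketch with assumed field and method names:

```python
from dataclasses import dataclass, field

@dataclass
class CandidateAssociation:
    members: set               # e.g. {"mac:aa", "mac:bb"}
    confidence: float          # score from the correlation stage
    status: str = "pending"    # pending | confirmed | suppressed
    audit: list = field(default_factory=list)

    def merge(self, other: "CandidateAssociation"):
        """Absorb another candidate and record the action for review."""
        self.members |= other.members
        self.audit.append(f"merge:{sorted(other.members)}")

    def suppress(self, reason: str):
        """Mark as a non-finding while keeping the reason auditable."""
        self.status = "suppressed"
        self.audit.append(f"suppress:{reason}")
```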

Live concept
Case Correlation Console
Analyst mode · Case 25 · Dynamic Profile Store · Active

Profiles: 186
Candidate links: 412
Evidence files: 38
Parser warnings: 6
Stale probes: 74
Review actions: 12
Association lattice · Evidence chain
Raw file: Kismet + Zeek + profiler remarks
Signals: Probe overlap · DNS recurrence · session continuity
Candidate: MAC cluster → profile candidate
Review: Needs analyst confirmation
Platform Summary

The Investigator Dashboard is a case-focused behavioral analysis workspace for ingesting heterogeneous network and device evidence, normalizing it into canonical event structures, deriving profile candidates, and presenting associated correlations through analyst-reviewable evidence chains.
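
The canonical event structure mentioned above might look like a small immutable record that every parser maps into. The mapper below is an illustrative sketch, assuming a Zeek `dns.log` row with its standard `ts`, `id.orig_h`, and `query` fields; `CanonicalEvent` and `normalize_zeek_dns` are hypothetical names:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CanonicalEvent:
    ts: float        # epoch seconds
    source: str      # e.g. "zeek", "kismet"
    kind: str        # e.g. "dns", "probe", "http"
    subject: str     # the MAC/IP the event is attributed to
    detail: str      # event-specific payload

def normalize_zeek_dns(row: dict) -> CanonicalEvent:
    """Map one Zeek dns.log row into the canonical shape."""
    return CanonicalEvent(
        ts=float(row["ts"]),
        source="zeek",
        kind="dns",
        subject=row["id.orig_h"],
        detail=row["query"],
    )
```

With every source collapsed to the same shape, temporal segmentation and correlation can run over one stream instead of per-format code paths.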

Objective Build Direction

The future product should feel less like a generic admin panel and more like a premium investigative instrument: restrained, high contrast, cinematic, and precise. The interface should communicate evidentiary discipline, technical depth, and controlled workflow progression.

Workflow action

From ingestion to dynamic profile storage

This page should present the product as a disciplined evidence engine. The workflow must be visible, structured, and obviously built around auditable analyst operations.

01 · Evidence upload and blob retention
02 · Parser routing and staged normalization
03 · Feature extraction and temporal bucketing
04 · Candidate profile generation
05 · Associated correlation and score explanation
06 · Analyst review, reporting, and export
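
The six stages above could be sketched as a checkpointed fold over case state, so a rerun can resume at the stage that last completed. The stage names and trivial transforms below are placeholders, not the real pipeline:

```python
def run_pipeline(state, stages):
    """Apply each staged transform in order, recording a checkpoint
    per completed stage for auditability and resumption."""
    checkpoints = []
    for name, transform in stages:
        state = transform(state)
        checkpoints.append(name)
    return state, checkpoints

# Illustrative stand-ins for the six stages (names are assumptions).
STAGES = [
    ("upload", lambda s: {**s, "blobs": len(s["raw"])}),
    ("normalize", lambda s: {**s, "events": s["blobs"] * 10}),
    ("features", lambda s: {**s, "features": True}),
    ("candidates", lambda s: {**s, "candidates": 2}),
    ("correlate", lambda s: {**s, "scored": True}),
    ("review", lambda s: {**s, "exported": True}),
]
```
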
Planned visualizations

Visual language for the future interface

Case Timeline: Cross-source chronological replay
Ingest runs · Probe activity · DNS/HTTP events · Profiler findings

Identity Candidates: Confidence-scored associations
MAC/IP continuity · SSID overlap · Temporal recurrence · Conflict flags

Behavioral Signals: Explainable feature layer
Weekday ratio · Destination count · DNS recurrence · Average signal

Analyst Review: Human-in-the-loop controls
Merge/split · Suppress · Evidence drilldown · Export pack
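
The Behavioral Signals card names concrete features. A minimal extraction sketch, assuming each observation is a `(timestamp, dns_name)` pair; the feature names and definitions below are illustrative:

```python
from collections import Counter
from datetime import datetime

def behavioral_features(events):
    """events: list of (timestamp: datetime, dns_name: str) observations.

    Returns simple, explainable features an analyst can inspect directly.
    """
    if not events:
        return {}
    weekday = sum(1 for ts, _ in events if ts.weekday() < 5)
    names = Counter(name for _, name in events)
    return {
        "weekday_ratio": weekday / len(events),        # activity on Mon-Fri
        "destination_count": len(names),               # distinct DNS names
        "dns_recurrence": max(names.values()) / len(events),  # top name's share
    }
```
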
Design spec

Visual direction

Palette: midnight navy, graphite, cold silver, muted violet, restrained steel blue. Remove neon green and bright teal accents.

Tone: premium investigative instrument, cinematic contrast, precise geometry, minimal glow, stronger metallic depth.

Typography: sharper display face for hero and section titles, cleaner technical sans for body and data cards.

Imagery: crisp product renders, evidence-chain diagrams, graph lattice mockups, timeline overlays, profile cards — not placeholder folder icons.

Motion: slow parallax gradients, controlled panel reveals, subtle graph pulse, no arcade-style neon hover effects.

Tool logic on page

Content blocks to implement

Checkpointed evidence pipeline
Hash-addressed raw evidence retention
Zeek, Kismet, CSV, SQLite, and PCAP support
Profiler-backed stale probe and session analysis
Case-scoped dynamic profile storage
Timeline, findings, and graph-ready exports
Transparent evidence contributions per score
Designed for backend-first forensic workflows
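
"Transparent evidence contributions per score" suggests returning the per-signal breakdown alongside the total, so the UI can show why a candidate scored as it did. A sketch with invented signal names and weights:

```python
def score_with_breakdown(signals, weights):
    """Weighted sum plus the per-signal contributions shown to the analyst.

    signals: {signal_name: normalized value in [0, 1]}
    weights: {signal_name: weight}; unknown signals contribute zero.
    """
    contributions = {
        name: weights.get(name, 0.0) * value
        for name, value in signals.items()
    }
    return sum(contributions.values()), contributions
```
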
Implementation objective

Build the dashboard to make this page true.

Treat this concept page as the product contract: the backend should produce real candidate profiles, evidence chains, reviewable associations, and dynamic timeline storage that can support this exact front-end narrative without exaggeration.

Unify legacy ingestion and cycler provenance
Promote normalization and profiler logic into one case pipeline
Store evidence-linked candidate profiles dynamically
Expose explainable scoring and review actions in UI