- Progress bar shows X/255 (or calculated range size)
- Percentage display during scan
- Currently scanning IP shown in real-time
- Live host count as hosts are discovered
- Remaining hosts counter
- Faster polling (2s instead of 5s) for smoother updates
- Backend calculates total hosts from CIDR or range notation
- Parse nmap --stats-every output for progress info
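The progress logic above can be sketched roughly as follows, assuming a Python backend; `total_hosts` and `parse_progress` are hypothetical names, and the stats line shown is the shape nmap prints when invoked with `--stats-every`:

```python
import ipaddress
import re

def total_hosts(target: str) -> int:
    """Host count for CIDR ('192.168.1.0/24') or range ('192.168.1.1-254') notation."""
    if "/" in target:
        return ipaddress.ip_network(target, strict=False).num_addresses
    if "-" in target:
        # Last-octet range notation, e.g. '192.168.1.1-254'
        base, _, end = target.rpartition("-")
        start = int(base.rsplit(".", 1)[1])
        return int(end) - start + 1
    return 1  # single host

# With --stats-every, nmap periodically prints lines like:
#   "Ping Scan Timing: About 45.00% done; ETC: 14:54 (0:00:04 remaining)"
STATS_RE = re.compile(r"About ([\d.]+)% done")

def parse_progress(line: str):
    """Return the percent-done value from a stats line, or None for other output."""
    m = STATS_RE.search(line)
    return float(m.group(1)) if m else None
```

The percent value drives the progress bar, while the host total from the target notation sizes the X/255-style counter.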
- New Scan History tab shows all past scans
- Each scan shows: tool, target, status, timestamp, findings count
- Rescan button to quickly re-run any previous scan with same settings
- Copy Command button copies the scan command to the clipboard and the terminal
- Clear All button to purge scan history
- Relative timestamps (e.g. '5 min ago', '2 hours ago')
- Status badges: running (yellow), completed (green), failed (red)
- Backend endpoints for clearing scan history
- Add distinct 'Local Ollama' and 'Networked Ollama' options in dropdown
- Local uses localhost, Networked uses remote endpoints with load balancing
- Color-coded: Yellow for Local, Blue for Networked
- Icons: Local, Networked
- Backend supports OLLAMA_LOCAL_URL and OLLAMA_NETWORK_URLS env vars
- Update installer to generate the new env var format
- Legacy 'ollama' provider still works for backward compatibility
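A minimal sketch of how the backend might resolve endpoints from the new env vars, including the legacy fallback; the provider keys (`ollama-local`, `ollama-network`) and the default port are assumptions for illustration:

```python
import os

def resolve_ollama_endpoints(provider: str) -> list[str]:
    """Map a provider selection to its Ollama endpoint list.

    'ollama-local'   -> OLLAMA_LOCAL_URL (defaults to localhost)
    'ollama-network' -> OLLAMA_NETWORK_URLS (comma-separated, load-balanced)
    anything else    -> legacy single-URL 'ollama' behaviour (back-compat)
    """
    if provider == "ollama-local":
        return [os.environ.get("OLLAMA_LOCAL_URL", "http://localhost:11434")]
    if provider == "ollama-network":
        urls = os.environ.get("OLLAMA_NETWORK_URLS", "")
        return [u.strip() for u in urls.split(",") if u.strip()]
    # Legacy 'ollama' provider: single URL, same default as local
    return [os.environ.get("OLLAMA_URL", "http://localhost:11434")]
```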
- Add colored provider badge in header showing current AI (Ollama/OpenAI/Anthropic)
- Display model name under provider icon in header
- Add provider icon and model to each AI message response
- Color-coded badges: yellow for Ollama, green for OpenAI, orange for Anthropic
- Icons: Ollama, OpenAI, Anthropic
- Add Network Map tab with D3.js force-directed graph visualization
- Device icons: Windows, Linux, macOS, routers, switches, printers, servers
- Network range scan modal with ping/quick/OS/full scan options
- Parse nmap XML output to detect OS type and open ports
- Host details sidebar panel with deep scan option
- Draggable network nodes with gateway-centered layout
- Auto-refresh and polling for running scans
- Infer OS from open ports when nmap OS detection is unavailable
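The XML parsing and port-based OS fallback described above could look roughly like this, assuming a Python backend; the `PORT_HINTS` mapping is an illustrative heuristic, not the shipped table:

```python
import xml.etree.ElementTree as ET

# Assumed port-to-OS heuristics, used only when nmap's <osmatch> is absent.
PORT_HINTS = {3389: "windows", 445: "windows", 22: "linux", 548: "macos", 9100: "printer"}

def parse_host(host_xml: str) -> dict:
    """Extract open ports and an OS guess from a single nmap <host> element."""
    host = ET.fromstring(host_xml)
    ports = [
        int(p.get("portid"))
        for p in host.iter("port")
        if p.find("state") is not None and p.find("state").get("state") == "open"
    ]
    osmatch = host.find(".//osmatch")  # present when -O detection succeeded
    if osmatch is not None:
        os_name = osmatch.get("name")
    else:
        # Fallback: infer from the first open port with a known hint
        os_name = next((PORT_HINTS[p] for p in ports if p in PORT_HINTS), "unknown")
    return {"ports": ports, "os": os_name}
```

The resulting dict feeds the node icon choice (Windows, Linux, macOS, printer, …) and the host details sidebar.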
- Add install.ps1 (PowerShell) and install.sh (Bash) interactive installers
- Support local, networked, and cloud AI providers in installer
- Add multi-Ollama endpoint configuration with high-speed NIC support
- Implement load balancing strategies: round-robin, failover, random
- Update LLM router with endpoint health checking and automatic failover
- Add /endpoints API for monitoring all Ollama instances
- Update docker-compose with OLLAMA_ENDPOINTS and LOAD_BALANCE_STRATEGY
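The three balancing strategies with health-aware selection can be sketched as below; the class name is hypothetical, and health state (normally refreshed by the router's endpoint health checker) is injected directly for illustration:

```python
import itertools
import random

class OllamaBalancer:
    """Pick an Ollama endpoint per LOAD_BALANCE_STRATEGY: round-robin, failover, random."""

    def __init__(self, endpoints: list[str], strategy: str = "round-robin"):
        self.endpoints = endpoints
        self.strategy = strategy
        self.healthy = set(endpoints)          # updated externally by the health checker
        self._rr = itertools.cycle(endpoints)  # persistent round-robin cursor

    def pick(self) -> str:
        alive = [e for e in self.endpoints if e in self.healthy]
        if not alive:
            raise RuntimeError("no healthy Ollama endpoints")
        if self.strategy == "failover":
            return alive[0]                    # always the first healthy endpoint
        if self.strategy == "random":
            return random.choice(alive)
        while True:                            # round-robin, skipping unhealthy nodes
            e = next(self._rr)
            if e in self.healthy:
                return e
```

Failover degrades gracefully: when the primary is marked unhealthy, requests shift to the next endpoint automatically, which matches the automatic-failover behaviour described for the LLM router.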
- Rebrand to GooseStrike with custom icon and flag assets