
Designing network analysis and coordination detection

 

The quick version

I led the design of the network analysis and coordination detection experience, helping analysts identify and investigate coordinated behaviour across social media platforms. The core design challenge: how do you turn millions of data points into a clear, interpretable visualisation that analysts can build, explore, refine, and communicate to stakeholders?

My role: Product Design Lead. Led the design team through v1; sole designer for the platform redesign. Core team: Data science, OSINT analysts, engineering, and UX researchers (v1). Tools: Figma, Claude Code (prototyping).

 

 

Why this project exists

Detecting coordinated behaviour online requires mapping relationships between thousands of accounts across platforms. Analysts were doing this manually with disconnected external tools: slow, error-prone, and isolated from the rest of their investigation.

The goal: bring network analysis directly into the platform, so analysts could move from data collection to a finished, exportable network visualisation without leaving their investigation.

 

What research told us

UX research with eight OSINT analysts mapped out how network analysis actually works in practice — these findings became the foundation for everything I designed.

  • Success means clarity and confidence, not metrics. Analysts judge a completed investigation by a sense of completion and certainty, not by numerical outputs.

  • Network analysis falls into two types: manual construction and automated refinement. We focused on the automated approach — gather data, then refine until it tells a clear story that analysts can shape and interrogate.

  • Dense graphs, fragmented data, and confusing labels were the core friction. Pruning was cognitively taxing, cross-platform collection was noisy, and CIB (coordinated inauthentic behaviour) terminology confused analysts — influencing both the refinement tools and the naming decisions that came later.


From standalone tool to integrated experience

The first version of Networks was a separate feature — analysts could build and explore graphs, but it lived apart from the rest of their investigation. When the platform was rebuilt, I redesigned Networks from scratch as sole designer. The goal was to make network analysis part of the investigation flow rather than a detour from it — connected to narratives, content, geographic views, and coordination detection.

The user flow below previews what sits behind the interface — centrality metrics, clustering algorithms, and coordination signals, all abstracted away so analysts focus on the investigation, not the infrastructure.


Old platform Networks. Disconnected from investigation

New user flow. Complexity behind the interface
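To make "abstracted away" concrete, here is a minimal, hypothetical sketch of one metric that sits behind that interface: degree centrality over an invented mention graph. The account names and edges are made up; the real platform's data model is not shown in this case study.

```python
# Toy sketch of a metric the interface hides from analysts:
# degree centrality on an undirected mention graph.
# All account names and edges below are invented examples.
from collections import defaultdict

def degree_centrality(edges):
    """Fraction of the other nodes each node is connected to."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    return {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

mentions = [("acct_a", "acct_b"), ("acct_a", "acct_c"),
            ("acct_b", "acct_c"), ("acct_d", "acct_a")]
scores = degree_centrality(mentions)
# acct_a is connected to all three other accounts, so it ranks highest
top = max(scores, key=scores.get)
```

In the product, scores like these inform node sizing and cluster ranking without ever being exposed as raw numbers the analyst must interpret.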

Giving analysts control without the complexity

Analysts need fine-grained control over what they're looking at, but network graphs are inherently overwhelming. The solution: filter by keywords, time frames, languages, and source platforms to minimise noise before building.

The key design decision: give analysts control over what kind of network to build without requiring them to understand graph theory. They choose node types (accounts, posts, URLs, hashtags) and edge types (follows, mentions, forwards, co-occurrence). Different combinations reveal different patterns.


Redesigned Networks. Part of the investigation flow, not a detour from it
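The filter-then-build idea can be illustrated with a toy sketch. The data model, field names, and posts here are invented; it simply shows how pre-filtering and a chosen edge type (here, hashtag co-occurrence) combine to produce a smaller, more focused graph.

```python
# Illustrative sketch (invented data model): filter raw posts first,
# then build edges of one chosen type -- hashtag co-occurrence.
from itertools import combinations

posts = [
    {"platform": "x",  "lang": "en", "hashtags": ["vote", "truth"]},
    {"platform": "x",  "lang": "en", "hashtags": ["vote", "facts"]},
    {"platform": "tg", "lang": "de", "hashtags": ["wahl"]},
]

def build_cooccurrence(posts, platform=None, lang=None):
    """Edges between hashtags appearing together in the filtered post set."""
    edges = set()
    for p in posts:
        if platform and p["platform"] != platform:
            continue
        if lang and p["lang"] != lang:
            continue
        for a, b in combinations(sorted(p["hashtags"]), 2):
            edges.add((a, b))
    return edges

edges = build_cooccurrence(posts, platform="x", lang="en")
```

Swapping the edge type (follows, mentions, forwards) changes which function builds the edges, but the analyst only ever picks from a short list of labelled options.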

Three layers of depth

The previous version dropped users into a full network visualisation immediately — visually impressive but overwhelming, with no clear starting point. The key design decision: surface the most important clusters first, then let analysts drill in.

The top-level overview highlights key clusters rather than showing everything at once. Drilling into a cluster reveals its members. At node level, clicking any element reveals full metadata — account bios, post text, URL patterns. Right-clicking expands connections without losing broader context.

But reading a graph is only half the job — the other half is shaping it into something presentable. Analysts colour-code clusters, size nodes by influence, and export as raw data or annotated images. The graph is what analysts present to stakeholders, so refinement and export aren't secondary — they're the final deliverable.


Drill into clusters and narratives to uncover meaningful insights
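The refinement-to-export step can be sketched in a few lines. The field names, palette, and JSON layout below are assumptions for illustration, not the platform's actual export format; the point is that the analyst's visual choices (cluster colour, influence-based sizing) travel with the data.

```python
# Sketch of an annotated export (fields and format are assumptions):
# the analyst's visual refinements are serialised alongside the nodes.
import json

nodes = {"acct_a": {"cluster": 0, "degree": 3},
         "acct_b": {"cluster": 0, "degree": 2},
         "acct_d": {"cluster": 1, "degree": 1}}
palette = ["#d62728", "#1f77b4"]

export = [{"id": nid,
           "color": palette[meta["cluster"]],   # colour-coded by cluster
           "size": 10 + 5 * meta["degree"]}     # sized by influence proxy
          for nid, meta in nodes.items()]
payload = json.dumps(export, indent=2)
```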

From reactive to proactive: coordination detection

Analysts relied on a separate CIB tool that was unstable, not scalable, and disconnected from the platform. In the redesign, coordination detection became a first-class feature — the AI automatically surfaces clusters of accounts exhibiting coordinated behaviour based on interaction patterns (coordinated posts, reposts), content similarity (coordinated hashtags and hashtag sequences), and account similarity (similar handles, posting patterns).

Each cluster surfaces with an AI-generated summary explaining why the group is of interest. The key design decision: the AI identifies different types of coordination and provides explainability, supporting the analyst's judgment rather than replacing it. Analysts can interrogate any cluster through AI chat — asking about coordination patterns, actors, TTPs, harmful narratives, and content toxicity without leaving the investigation. This surfaced coordination networks that analysts hadn't identified manually before.


Coordination signals. Detected and flagged, summarised by AI, interrogated through chat
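One of the interaction-pattern signals described above can be sketched as a toy example: flag pairs of accounts that share the same URL within a tight time window. The posts, fields, and threshold are invented; the real detection combines several such signals.

```python
# Toy sketch of one coordination signal (data and threshold invented):
# accounts posting the same URL within a short window of each other.
from collections import defaultdict
from itertools import combinations

posts = [
    {"account": "a1", "url": "http://ex.am/1", "t": 100},
    {"account": "a2", "url": "http://ex.am/1", "t": 130},
    {"account": "a3", "url": "http://ex.am/1", "t": 9000},
    {"account": "a4", "url": "http://ex.am/2", "t": 200},
]

def coordinated_pairs(posts, window=60):
    """Account pairs that posted the same URL within `window` seconds."""
    by_url = defaultdict(list)
    for p in posts:
        by_url[p["url"]].append(p)
    pairs = set()
    for same_url in by_url.values():
        for p, q in combinations(same_url, 2):
            if p["account"] != q["account"] and abs(p["t"] - q["t"]) <= window:
                pairs.add(tuple(sorted((p["account"], q["account"]))))
    return pairs

flagged = coordinated_pairs(posts)
```

A signal like this only raises a cluster for review; as the section above stresses, the AI's explanation supports the analyst's judgment rather than replacing it.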

Learnings and outcomes

Learnings

  • "Here's what we found" beats "go investigate." Proactive coordination detection fundamentally changed how analysts started their work — but the AI had to explain why it flagged something, not just that it did.

  • Analysts don't work in straight lines. Every flow I designed had to support looping back — build, analyse, refine, collect more data. The moment you enforce a sequence, you lose the analysts.

  • Naming shapes perception. "Coordination" over "CIB detection" — CIB implies a level of proof the system can't provide. The label defined what analysts trusted the feature to do.

Outcomes

Network graphs became the most exported feature on the platform over the three months following the changes.

 

The connected navigation and dashboards cut the time analysts needed to produce a report by roughly 50%.

Moving coordination detection from the standalone tool to the integrated platform reduced service desk tickets by 80%. The old CIB tool was the most complained-about feature — unstable, disconnected, and a constant source of support requests. Integration solved the problem at the root.

The three-layer depth model and AI chat changed how analysts work — more time investigating, less time orienting, and coordination networks surfaced that weren't found manually before.

Data availability remains the key limitation. Getting enough data to create meaningful coordination signals was the most consistent piece of feedback — the detection is only as strong as the data behind it.


© 2026 by Sima Marciuskaite
