Niara

No matter how many layers of network defense you put in place, you will eventually be hacked. The challenge is how quickly you can discover a breach (not very quickly, it turns out), and how quickly you can respond. Niara was a startup focused on network security threat detection and incident response. It was acquired by Hewlett Packard Enterprise in 2017.

As the first UI guy...

My initial work was learning about the space, and how user tasks differed from my earlier experience with firewalls and VPNs. A major difference was that configuration took a backseat in terms of priority. This was an application, a tool in a security analyst’s toolbox intended for day-to-day use.

A user’s daily tasks were typically:

  • Alert-driven: Investigating and resolving security alerts
  • Entity-driven: Investigating and/or monitoring high-risk or high-value entities

Alerts and risk scores were generated by combining data from various network log sources and third-party threat feeds with behavioral analytics.

Here's a general sketch of the screen flow:

Overview of UI architecture

Essentially, a user moves around in a space of lists (or search results) and detail screens showing information about (and links between):

  • Entities (users, devices, IP addresses)
  • Alerts (security-related events)
  • eFlows (network traffic, files, and packet captures)
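As a rough sketch of that object space (the names and fields here are mine, not the product's), each object type carries references to related objects, and the UI renders those references as pivot links:

```python
from dataclasses import dataclass, field

# Hypothetical object model for the navigation space. Each object type
# holds references to related objects; a detail screen resolves those
# references into the hyperlinks an analyst uses to pivot.

@dataclass
class Entity:
    entity_id: str
    kind: str                    # "user", "device", or "ip"
    risk_score: int = 0
    alert_ids: list = field(default_factory=list)

@dataclass
class Alert:
    alert_id: str
    category: str
    entity_ids: list = field(default_factory=list)  # involved entities
    eflow_ids: list = field(default_factory=list)   # supporting traffic records

@dataclass
class EFlow:
    eflow_id: str
    src_ip: str
    dst_ip: str

def pivot_links(alert, entities, eflows):
    """Resolve an alert's references into the objects its detail screen links to."""
    return ([entities[i] for i in alert.entity_ids],
            [eflows[i] for i in alert.eflow_ids])
```

The point of the model is the cross-references: every screen is a view over one object plus links to its neighbors, which is what makes the list-to-detail-to-list exploration described below possible.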

Start Page

The first page an analyst would see when logging in was user-configurable, and went through many design revisions. This was the design I liked best. It’s meant to be a starting point, and not intended for display on a big monitor in a security operations center.

Dashboard screen

Overview: The card on the left gives the analyst an overview, with counts of high-risk entities (by type) or alerts (by category). Clicking any item here filters the data shown in the cards on the right, which makes the filtering relationship easy to understand.

Cards: The cards on the right show charts or lists, and can be configured and arranged to reflect a user's preferred workflow. Lists provide direct links to relevant detail screens and to the full result list.

List/Results (eFlows)

List pages provided three common panels:

Eflows list screen

Filters: The sidebar shows hit counts for various facets or fields, allowing a user to get deeper insights about the current results, and filter the list accordingly.

Visualization: The top panel provides space for various types of visualization of the results. Typically, this would be a time-series chart, but pie charts and maps could be useful as well.

List: The list panel shows a tabular view of the results. The values in each cell provide pop-up details with more info and relevant hyperlinks for pivoting.
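The mechanics behind the filter sidebar can be sketched simply: for each facet field, count how often each value occurs in the current result set, and treat a click on a facet value as narrowing the results. A minimal sketch (the field names are illustrative, not the product's schema):

```python
from collections import Counter

def facet_counts(results, fields):
    """For each facet field, count hits per value across the current results."""
    return {f: Counter(r[f] for r in results if f in r) for f in fields}

def apply_filter(results, field, value):
    """Narrow the result list to rows matching the selected facet value."""
    return [r for r in results if r.get(field) == value]
```

For example, given a handful of eFlow rows, `facet_counts(rows, ["protocol"])` yields the per-protocol hit counts shown in the sidebar, and `apply_filter(rows, "protocol", "dns")` is the narrowed list after clicking the "dns" facet. Recomputing the counts on the narrowed list keeps the sidebar in sync with the visible results.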

Details (Alert)

Details varied based on the type of object being viewed, but in general consisted of several types of information, each of which provides various hyperlinks for pivoting:

Alert Details screen

Overview: This top area shows the same details one typically sees when the object appears in lists: type, timestamps, source/destination, and so on.

Info Cards: These cards are primarily textual. They might present additional info about the object, or info useful in an investigation (for example, pulling up relevant WHOIS or Certificate information).

Visualization Cards: These cards provide some graph or chart. They can be expanded for a larger, more interactive view.

List Cards: These cards provide lists of related objects. They can be expanded for a larger view.

History

Investigation implies following leads, and the design provided rich cross-linking to support exploration and pivoting. False leads are not uncommon, so the ability to go back to a previous screen was important. It was also easy to get lost.

Navigation menu with MRU lists

Mega-menu Navigation: The Back button is always there, of course, but the main navigation menu also provided lists of most-recent views and actions.

Configuration

As mentioned above, configuration UI was not a primary concern for design, partly because it’s not the main product focus, and partly because security analysts are often a completely different team from the folks who do the configuration.

Nonetheless, auto-generated UIs invariably suffer from poor usability and user experience. This eventually caught up with us. When we started adding more complexity for configuring data sources and rules for analytics, I did a complete survey of the existing config UI, discarded screens that were no longer relevant, and re-organized related functions and settings.

Configuration Overview screen

Information Scent: Besides describing a new information architecture for config, I added an overview page to provide visibility. From this screen, the user could:

  • get a high-level sense of what was configurable,
  • see what the current settings were, and
  • use deep links to each screen.