
Technologies

PNNL’s analytics software offers unique capabilities in visualization, reasoning, and large-scale data analysis. Tools can be easily configured to operate in a variety of environments, and have been installed in several agencies for mission support. Most tools are available to federal agencies under a no-cost Government Use Agreement, and users may contract with PNNL for customizations, extensions, and technical support.

Canopy

Canopy is a suite of visual analytic tools designed to support deep investigation of large multimedia collections. It combines the understanding of data represented in multiple formats (video, image, and text) and presents that information to users through new visual representations.

Users can explore relationships between documents and among subcomponents of documents. Canopy incorporates cutting-edge extraction techniques, state-of-the-art content analysis algorithms, and novel interactive information visualizations so that analysts can comprehend and articulate the big picture. Canopy helps to reduce analysts’ workload and ultimately the effort of identifying critical intelligence for decision makers.

Canopy:

  • Aids and expedites triage of multimedia data. For example, Canopy can help with analytic problems such as, “I have data from a large collection of files; help me investigate this collection to determine the most relevant files without my having to watch every movie, view every image, and read all the text.”
  • Bootstraps the analysis process by providing visual clues to potential data relationships and highlights connections, giving the user an understanding of all the data and additional context of its structure. This facilitates discovering previously unknown content and/or unexpected or non-obvious relationships.
  • Provides insight into multimedia content similarities and relationships by discovering and visualizing the relationships in an interactive and dynamic user interface.
  • Provides true multimedia analysis, not just stovepipe analysis of an individual type of data augmented with metadata. For example, a Word document with components such as embedded images and text is evaluated as a cohesive information item, where the association of these various document elements is preserved.

IN-SPIRE

IN-SPIRE™ provides tools for exploring textual data, including Boolean and “topical” queries, term gisting, and time/trend analysis tools. This suite of tools allows the user to rapidly discover hidden information relationships by reading only pertinent documents.

IN-SPIRE has been used to explore technical and patent literature, marketing and business documents, web data, accident and safety reports, newswire feeds and message traffic, and more. It has applications in many areas, including information analysis, strategic planning, and medical research.

Many analytical tools are provided to work in concert with the visualizations, allowing users to investigate the document groupings, query the document contents, investigate time-based trends, and much more.
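
IN-SPIRE’s own analytics are not reproduced here; purely as an illustration of the kind of automatic document grouping such a tool provides, the sketch below clusters a tiny, invented corpus with TF-IDF and k-means (standard techniques used as stand-ins, not necessarily IN-SPIRE’s algorithms).

```python
# Illustrative only: a generic TF-IDF + k-means grouping of documents,
# standing in for the kind of automatic theme clustering IN-SPIRE provides.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [  # hypothetical toy corpus
    "reactor coolant pump maintenance report",
    "quarterly market analysis for consumer electronics",
    "pump seal failure and coolant leak incident",
    "electronics retail trends and market share",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(docs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(vectors)

for doc, label in zip(docs, labels):
    print(label, doc)
```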

Analyst-driven Knowledge Enhancement & Analysis

Analyst-driven Knowledge Enhancement & Analysis (AKEA) is a next-generation knowledge visualization and analysis tool, encapsulating powerful semantic technologies in an intuitive environment.

Developed by PNNL, AKEA helps users analyze available information, develop hypotheses, and evaluate scenarios by enabling them to:

  • collect relevant knowledge from text documents, databases, and knowledgeable experts
  • visualize networks, timelines, and spatial mappings of relationships and events
  • analyze networks of individuals, organizations, events, and other entities
  • store and retrieve knowledge from enterprise and personal data stores
  • query available knowledge to find patterns of analytic interest.

Scalable Reasoning System

The Scalable Reasoning System (SRS) is an analytic framework for developing web-based visualization applications. Using a growing library of both visual and analytic components, developers can create custom applications for any domain, from any data source.

SRS combines the simplicity and accessibility of web-based solutions with the power of an extensible and adaptable back-end analytics platform. SRS applications have been deployed to:

  • analyze unstructured text
  • explore hierarchical taxonomies
  • support real-time analysis of trends and patterns in streaming social media data
  • organize and provide visual search and navigation of large document repositories.

Universal Parsing Agent

The Universal Parsing Agent (UPA) is a document analysis and transformation software program that accepts multiple information streams or datasets, finds and extracts the information needed, and delivers results in the format that will be most useful.

UPA is flexible and adaptable to individual user needs and can be used to identify and extract very specific or very broad ranges of information. It was developed for a variety of U.S. government clients; most recently, a version was deployed at the Environmental Protection Agency to support a large web content management system. UPA may be used anywhere people struggle with information overload, and applications currently range from counterterrorism support to commercial business intelligence.
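
UPA’s parsing rules are configured per deployment; the following hypothetical sketch only illustrates the general extract-and-transform pattern it supports, pulling report numbers and dates out of free text and delivering them as CSV (the field names and pattern are invented).

```python
# Hypothetical example of the extract-and-transform pattern UPA supports:
# find structured fields in free text and deliver them in a new format (CSV).
import csv
import io
import re

text = """Incident report RPT-0042 filed on 2015-03-14.
Follow-up report RPT-0043 filed on 2015-04-02."""

pattern = re.compile(r"(RPT-\d+) filed on (\d{4}-\d{2}-\d{2})")

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["report_id", "date"])
for report_id, date in pattern.findall(text):
    writer.writerow([report_id, date])

print(out.getvalue())
```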

Common Operating Response Environment

Agencies that oversee complex, multi-stakeholder programs need efficient, secure ways to link people and knowledge within and across organizations. The Common Operating Response Environment (CORE), a software suite developed by researchers at PNNL, does just that.

CORE helps users get the right information to the right people, when and how they need it, so they can relay data and make the decisions that are critical to their organizations.

The CORE tool, which is customizable for a multitude of uses, facilitates situational awareness by integrating diverse data streams without the need to reformat them, summarizing that information, and giving users what they need to rapidly understand and appropriately respond to situations. It is mobile device-ready, has a straightforward interface for ease of use across organizations and skill sets, and is highly configurable to the needs of each specific user, whether they require data summaries for high-level decision makers or tactical maps, operational data, or weather information for responders in the field. Information can be entered into CORE and queried in a variety of ways, using customized forms, reports, visuals, or other organizational templates, according to the needs of each user’s organization, teams, and business processes.

Tessera

Tessera enables data scientists to rapidly explore large, complex datasets, develop statistical and machine-learning algorithms, and validate their approaches using a familiar desktop programming environment.

Tessera automatically manages the complicated tasks of distributed storage and computation, empowering data scientists to do what they do best: tackling critical research and mission objectives by deriving intelligence from data. Many data challenges, large or small, are complex enough to require a data scientist, a technical expert trained in statistics, applied mathematics, and computer science. Tessera was developed by data scientists for data scientists, to enable them to work more efficiently, develop better algorithms, and produce more accurate analyses.
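
Tessera’s distributed back ends are not shown here; the following single-machine sketch, with invented data, only illustrates the divide-and-recombine pattern that such a framework manages automatically at scale: split the data into subsets, analyze each subset independently, and recombine the per-subset results.

```python
# Minimal single-machine sketch of a divide-and-recombine style computation:
# divide the data into subsets, analyze each subset independently (the step a
# framework like Tessera would distribute), then recombine the results.
import pandas as pd

df = pd.DataFrame({  # hypothetical data
    "station": ["A", "A", "B", "B", "C", "C"],
    "reading": [1.2, 1.4, 3.1, 2.9, 0.7, 0.9],
})

# Divide: one subset per station.  Analyze: mean reading per subset.
per_subset = df.groupby("station")["reading"].mean()

# Recombine: collect the per-subset statistics into one result table.
print(per_subset.reset_index(name="mean_reading"))
```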

Risk Reduction and Resource Assessment Model

The Washington State Patrol, the Washington State Ferry System, and the U.S. Coast Guard partnered with PNNL to pilot the Risk Reduction and Resource Assessment Model (3RAM) for security. 3RAM integrates information from disparate data sources to optimize risk reduction across the system in one-hour increments.

This approach allows the user to compare baseline risk, resulting from an unmitigated decision strategy, with the reduced risk achieved when security measures are applied across the system for a given period of time. 3RAM is an approved part of the Washington State Ferry Alternative Security Plan. Allocating security measures and resources in a risk-informed manner as a function of time is an innovative security solution, and 3RAM can be applied to other transportation and infrastructure security problems, especially when security assets and measures are limited.
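
The details of 3RAM’s risk model are specific to each deployment; the toy sketch below, using invented numbers, only illustrates the comparison described above: baseline (unmitigated) risk per one-hour increment versus risk after a security measure is applied during selected hours.

```python
# Toy illustration (invented numbers) of comparing baseline hourly risk with
# risk after a security measure is applied during selected one-hour increments.
baseline_risk = [0.10, 0.30, 0.50, 0.40, 0.20]   # hypothetical risk per hour
patrol_hours = {1, 2}                             # hours a patrol is on site
mitigation_factor = 0.4                           # assumed residual fraction

mitigated_risk = [
    r * mitigation_factor if hour in patrol_hours else r
    for hour, r in enumerate(baseline_risk)
]

print("baseline total:", sum(baseline_risk))
print("mitigated total:", sum(mitigated_risk))
print("risk reduction:", sum(baseline_risk) - sum(mitigated_risk))
```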

Trelliscope

Trelliscope is a powerful visualization tool for large data that allows users to rapidly render customized graphics that are both detailed and interpretable across large datasets—and then filter and sample the results to focus on examples that share common traits or characteristics.

Trelliscope is also useful for visualizing small datasets. It is unique in that it is highly customizable and is based on trellis displays that guide users to focus on relatively small, rational subsets of the data. This approach helps data scientists develop machine learning algorithms and predictive analytics that account for the scientific phenomenology displayed in the graphics, an essential requirement in many environments.
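
A trellis (small multiples) display draws the same plot for each subset of the data; the sketch below reproduces that idea on an invented dataset with pandas and matplotlib, without any of Trelliscope’s filtering, sorting, or large-data machinery.

```python
# Minimal trellis-style (small multiples) display: one panel per data subset,
# all panels drawn with the same plot specification.  A toy stand-in for the
# faceting idea Trelliscope scales up to very large numbers of panels.
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame({  # hypothetical data
    "site": ["A"] * 4 + ["B"] * 4 + ["C"] * 4,
    "day": list(range(4)) * 3,
    "value": [1, 2, 2, 3, 5, 4, 6, 5, 2, 2, 1, 1],
})

groups = list(df.groupby("site"))
fig, axes = plt.subplots(1, len(groups), sharey=True, figsize=(9, 3))
for ax, (site, subset) in zip(axes, groups):
    ax.plot(subset["day"], subset["value"], marker="o")
    ax.set_title(f"site {site}")
fig.tight_layout()
plt.show()
```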

Laboratory Integration Framework and Toolset

The Laboratory Integration Framework and Toolset (LIFT) is an open-source enterprise integration framework and toolset that encapsulates, orchestrates, and exposes analytics and data as cloud services. The software enables the development and execution of signature discovery workflows.

For example, the user can build flexible workflows (sketched below) that:

  • link multiple data sources
  • transform data and perform feature extraction, data mining, and machine-learning tasks
  • validate learning algorithms and statistical models.
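
The sketch below is not LIFT’s API; it is a generic scikit-learn pipeline with the same workflow shape named in the list above: bring in data, transform it and extract features, fit a machine-learning model, and validate the result.

```python
# Not LIFT's API -- a generic scikit-learn pipeline with the same workflow
# shape: load data, transform/extract features, fit a model, validate it.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

workflow = Pipeline([
    ("features", StandardScaler()),   # feature transformation step
    ("model", SVC(kernel="rbf")),     # machine-learning step
])

# Validation step: cross-validated accuracy of the whole workflow.
scores = cross_val_score(workflow, X, y, cv=5)
print("mean accuracy:", scores.mean())
```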

Signature Quality Metrics

Signature systems designed to detect, predict, or characterize a phenomenon of interest are developed by scientists and engineers in a wide variety of contexts. Regardless of the domain, every signature discovery effort needs a transparent approach for evaluating the quality of the resultant signature system.

The Signature Quality Metrics (SQM) methodology addresses this need by providing a holistic assessment of signature quality in terms of fidelity, cost, risk, other attributes, and utility—with the ultimate goal of identifying optimal signature systems for a particular application.
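
SQM’s methodology is richer than a single formula, but the core idea of rolling attribute assessments (fidelity, cost, risk, and so on) into an overall utility for comparing candidate signature systems can be sketched as a simple weighted score; the attributes, weights, and candidates below are invented.

```python
# Invented example of a weighted multi-attribute score for comparing candidate
# signature systems; SQM's real methodology is richer than this sketch.
weights = {"fidelity": 0.5, "cost": 0.3, "risk": 0.2}  # assumed priorities

# Attribute scores scaled to [0, 1], where higher is better
# ("cost" and "risk" are already expressed as 1 - normalized penalty).
candidates = {
    "signature_A": {"fidelity": 0.9, "cost": 0.4, "risk": 0.7},
    "signature_B": {"fidelity": 0.7, "cost": 0.8, "risk": 0.8},
}

def utility(scores):
    return sum(weights[a] * scores[a] for a in weights)

for name, scores in sorted(candidates.items(), key=lambda kv: -utility(kv[1])):
    print(f"{name}: utility = {utility(scores):.2f}")
```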

Graph Engine Multithreaded System

PNNL research staff are developing multithreaded graph algorithms and the programming models and runtime systems necessary to support them on conventional commodity computing systems.

Their work provides a scalable, cost-effective solution for big data problems that accommodates unstructured, in-memory datasets and allows for high performance of irregular applications. In particular, PNNL has designed a database that is closely aligned with future data trends and discovers complex relationships in unstructured data that other databases fail to find. The Graph Engine Multithreaded System (GEMS) database runs on commodity platforms, from desktop computers to the cloud, with no special system requirements.
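
GEMS’s parallel, in-memory machinery is not shown here; the toy sketch below only illustrates the kind of multi-hop relationship query a graph database answers, using an ordinary Python adjacency list and breadth-first search over invented entities.

```python
# Toy multi-hop relationship query over an in-memory graph (plain Python,
# not GEMS's internal data structures): find everything reachable from a
# starting entity within two hops.
from collections import deque

edges = {  # hypothetical "communicates with" relationships
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["erin"],
    "dave": [],
    "erin": ["frank"],
    "frank": [],
}

def within_hops(start, max_hops):
    seen, frontier = {start}, deque([(start, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for neighbor in edges.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return seen - {start}

print(sorted(within_hops("alice", 2)))  # -> ['bob', 'carol', 'dave', 'erin']
```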

Velo

Velo is a customizable, collaborative, project-centric knowledge management and analysis framework built on commercial-grade products. It provides an integrated environment for secure, collaborative data and knowledge management, analysis, visualization, and sharing.

Velo includes a flexible, rich content model that can describe any object, from file-based contributions to relational content. All content objects (data, knowledge, applications, users, etc.) are represented as a semantic graph of nodes connected through a schema of domain-specified relationships, making it possible not only to represent and use the relationships between data and knowledge objects, but also their links to analytical tools and workflows. Velo comes with several default schemas, including one that supports provenance information to enable veracity checks for observational, simulated, and analytical results.

Importantly, Velo provides a tool integration framework that allows deployments to quickly integrate a wide variety of existing analytical tools, including workflows, scripts, analysis, and visualization tools. Velo has been integrated with other semantic tools, such as VisKo (Open Source Visualization Knowledge), to provide a more powerful search for tools that can be applied to a given dataset. The system also offers transparent remote processing capabilities, allowing the user to work with remote data, database, HPC, and cluster resources, often eliminating the need to move data for further processing.

The Velo core system can scale to any demand by running in a clustered or cloud environment. In addition, Velo can federate data located at a variety of remote locations using its remote nodes, so data does not need to be contained in the Velo repository to be used by tools. Velo provides secure, role-based access to resources, and it can store classification levels for each resource in support of multi-level security. Velo offers a customizable desktop client as well as a web interface that can be easily tailored to meet the needs of each specific deployment; alternatively, a customized client can be developed using the Velo core system APIs.
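
Velo’s content model represents every object as a node in a semantic graph linked by schema-defined relationships; the sketch below is only a toy triple store illustrating that representation, and the node names and relationship types are invented rather than taken from Velo’s schemas.

```python
# Toy illustration of a semantic-graph content model: every object is a node,
# and schema-defined relationships link data, tools, and provenance.  The
# relationship names below are invented, not Velo's actual schema.
triples = [
    ("simulation_run_7", "usedInput", "mesh_v2.dat"),
    ("simulation_run_7", "producedOutput", "results_7.h5"),
    ("results_7.h5", "visualizedBy", "plot_tool"),
    ("simulation_run_7", "executedBy", "user_jsmith"),
]

def related(subject, predicate):
    """Follow one relationship type outward from a node."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Example provenance-style query: which inputs fed simulation_run_7?
print(related("simulation_run_7", "usedInput"))
```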

Adaptive Poisson-Boltzmann Software and PDB2PQR

Adaptive Poisson-Boltzmann Software (APBS) and PDB2PQR are software packages designed to help analyze the solvation properties of small molecules and macromolecules such as proteins, nucleic acids, and other complex systems.

APBS and PDB2PQR work with a variety of desktop molecular graphics software packages and offer simple visualization capabilities via the web without the need for any additional software. The capabilities of these software packages include:

  • Repair and parameterize molecules: use PDB2PQR to add missing atoms as well as assign charge and radius parameters from a variety of force fields.
  • Assign titration states: use PDB2PKA or PROPKA to determine the titration state of proteins at specific pH values.
  • Calculate solvation properties: solve the Poisson-Boltzmann and related equations (one common form is shown below) to calculate solvation energies and electrostatic properties for analysis and visualization.
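
For reference, one common form of the nonlinear Poisson-Boltzmann equation that APBS solves for the electrostatic potential φ(r) is given below in LaTeX; ε is the position-dependent dielectric coefficient, ρ^f the fixed solute charge density, c_i and q_i the bulk concentration and charge of mobile ion species i, and k_B T the thermal energy. Unit conventions and ion-accessibility factors vary between formulations.

```latex
% One common form of the nonlinear Poisson-Boltzmann equation
% (unit conventions and ion-accessibility factors vary between formulations)
\nabla \cdot \bigl[ \varepsilon(\mathbf{r}) \, \nabla \varphi(\mathbf{r}) \bigr]
  = -\rho^{f}(\mathbf{r})
    - \sum_{i} c_i \, q_i \, \exp\!\bigl( -q_i \varphi(\mathbf{r}) / k_B T \bigr)
```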

Visual Sample Plan

Visual Sample Plan (VSP) is a software tool that supports the development of a defensible sampling plan based on statistical sampling theory, along with the statistical analysis of sample results, enabling confident decision making. VSP couples site, building, and sample location visualization capabilities with optimal sampling design and statistical analysis strategies.

VSP is currently focused on design and analysis for the following applications:

  • environmental characterization and remediation
  • environmental monitoring and stewardship
  • response to and recovery from chemical, biological, or radiological terrorist events
  • footprint reduction and remediation of unexploded ordnance (UXO) sites
  • sampling of soils, buildings, groundwater, sediment, surface water, and subsurface layers.
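
As one small example of the statistical sampling theory VSP applies, the sketch below computes how many randomly placed samples are needed so that, if none of them detect contamination, one can state with a chosen confidence that less than a given fraction of the site is affected; this is a standard nonparametric calculation and stands in for only one of VSP’s many designs.

```python
# Standard nonparametric calculation (one of many design types VSP supports):
# the number of random samples needed so that, if none detect contamination,
# we can be `confidence` sure that less than `fraction` of the site is affected.
import math

def required_samples(confidence, fraction):
    # P(all n samples miss a contaminated fraction f) = (1 - f) ** n;
    # require that probability to be at most 1 - confidence.
    return math.ceil(math.log(1 - confidence) / math.log(1 - fraction))

print(required_samples(confidence=0.95, fraction=0.10))  # -> 29
```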

T.Rex

T.Rex is a visual analytics tool that simplifies the translation and exploration of unknown tabular data sources, adding knowledge through discovery. It is an assessment tool that helps users rapidly understand a previously unknown dataset, quickly identify patterns of interest in the records, and annotate them to enable future collaboration. T.Rex contains a growing set of deep analytical tools and also supports robust export capabilities, so selected data can be sent to other specialized tools for further analysis.
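
T.Rex’s interactive assessment goes well beyond this, but the first-pass questions it helps answer about an unfamiliar table (what columns exist, what types they hold, where values are missing, how values are distributed) can be sketched with pandas on an invented source:

```python
# A first-pass assessment of an unfamiliar table with pandas: column types,
# missing values, and basic distributions.  A minimal stand-in for the kind
# of triage T.Rex supports interactively.
import io

import pandas as pd

raw = io.StringIO(  # hypothetical unknown tabular source
    "id,timestamp,amount,category\n"
    "1,2015-01-02,19.99,hardware\n"
    "2,2015-01-05,,software\n"
    "3,2015-01-09,42.50,hardware\n"
)

df = pd.read_csv(raw, parse_dates=["timestamp"])
print(df.dtypes)                    # inferred column types
print(df.isna().sum())              # missing values per column
print(df.describe(include="all"))   # basic distribution summary
```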

Machine Learning String Tools for Operational and Network Security

The Machine Learning String Tools for Operational and Network Security (MLSTONES) project has developed a mathematical formulation and computational infrastructure for deriving new similarity metrics for cyber entities such as network transactions, source code, and instructions executing on a processor. Similarity metrics allow one to characterize the unknown (real-time use) and infer inheritance history (forensic or attributional use).

MLSTONES applications are implemented using data-intensive computing to drive high-throughput analysis. MLSTONES draws on two mature mathematical disciplines to enable string analysis that does not rely on exact or regular-expression matching or on manually derived rules; a toy sketch combining them follows the list. These disciplines are:

  • bioinformatics: the analysis of common inheritance among biological molecular sequences
  • support vector machine classification: an example of supervised learning.
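
MLSTONES derives its metrics from bioinformatics-grade sequence analysis over specialized alphabets; the toy sketch below only shows the overall shape of the approach, pairing a generic string-similarity feature (Python’s difflib, a stand-in rather than an algorithm MLSTONES uses) with a support vector machine classifier on invented strings.

```python
# Toy illustration of the MLSTONES-style pattern: score new strings by their
# similarity to known examples, then classify with a support vector machine.
# difflib similarity is a stand-in, not the bioinformatics alignment MLSTONES uses.
from difflib import SequenceMatcher

from sklearn.svm import SVC

known_bad = ["GET /admin.php?cmd=", "POST /upload.php?shell="]
known_good = ["GET /index.html", "GET /images/logo.png"]

def features(s):
    # Similarity of a string to each reference example becomes a feature vector.
    refs = known_bad + known_good
    return [SequenceMatcher(None, s, r).ratio() for r in refs]

X = [features(s) for s in known_bad + known_good]
y = [1, 1, 0, 0]  # 1 = suspicious, 0 = benign

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([features("GET /admin.php?cmd=whoami")]))  # classify a new string
```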
