DumpSuite

DumpSuite is a web-based platform for core dump management and analysis. The platform supports the organization of core dumps and facilitates post-mortem debugging through interactive backtrace inspection with automated symbol fetching and source code integration. Furthermore, it visualizes the relations between threads and resources in deadlocks and supports similarity-based retrieval of related dumps, helping developers identify recurring issues. A demonstration video of DumpSuite is available at https://youtu.be/Jn7jcCAp6zw.

Architecture

DumpSuite is built on a containerized architecture, with all components hosted in Docker containers. The platform comprises two main components: the UI, implemented with Angular, and the Core Dump Service, a REST service based on Java Spring. The service manages core dumps and provides an API for data manipulation, for retrieving lists of core dumps, and for retrieving detailed information on a specific core dump. Each core dump is represented as an aggregate with a unique identifier, stored as a document in a MongoDB document store. For the UI view showing similar core dumps, a dedicated REST service, the Similarity Service, manages core dumps in a vector database. When a new core dump is uploaded, a document representing its key information is embedded with a deep learning model and inserted into the database, enabling efficient similarity queries.

Architecture diagram

Configuration

Before DumpSuite can be operated, a few customisation steps are currently necessary. The individual components run as Docker containers; to facilitate deployment, a Docker Compose file is provided. The Compose file allows the environment variables of the core-dump-service container to be configured directly. These are described below.

Environment Variables

Property | Description | Default Value
--- | --- | ---
CORE_DUMP_STORAGE_PATH_ROOT | Root directory where the dumps are stored within the container | /app/Dumps
CORE_DUMP_STORAGE_PATH_EXTRACTED | Subdirectory where the extracted core dumps are stored | /extracted
CORE_DUMP_DEBUG_FILE_DIR | Directory where the downloaded debug symbols are stored within the container | /mnt/c/temp/usr/lib/debug/.build-id/
CORE_DUMP_TEMP_DIR | Directory where the download of a core dump is prepared | /tmp/download
CORE_DUMP_EMBEDDING_URI | Address of the embedding service | http://embedding:80
CORE_DUMP_EMBEDDING_TOP_K | Number of similar core dumps proposed | 5
CORE_DUMP_SYMBOLS_REPOSITORY_URI | Address of the repository in which to search for the Debian packages containing the debug symbols | https://emsdps.scch.int
CORE_DUMP_SYMBOLS_CACHE_DIR | Download directory for Debian packages | /tmp/symbols/download
CORE_DUMP_SYMBOLS_REPOSITORIES | Repositories to scan for Debian packages | repo1,repo2,repo3
CORE_DUMP_COMPANY_NAMESPACE | Lines containing this namespace are highlighted in the backtrace visualisation | scch::
CORE_DUMP_BACKTRACE_FILE | Name of the backtrace file | backtrace.txt
CORE_DUMP_BACKTRACE_FILE_FULL | Name of the full backtrace file | backtrace-full.txt
SPRING_DATA_MONGODB_URI | URI of the database | mongodb://dump-suite-mongodb:27017/mydatabase
CORE_DUMP_REPOSITORY_URI | Base URI of the code repository | https://dev.azure.com/Products/_apis/git/repositories
CORE_DUMP_REPOSITORY_TOKEN | Access token for the repository |
CORE_DUMP_FETCH_N_LINES | Number of lines to show before and after the selected location | 5
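As a sketch, the corresponding section of the Compose file might look as follows. The service name, build context, and the selection of variables are illustrative; the values shown are the defaults from the table above:

```yaml
services:
  core-dump-service:
    build: ./core-dump-service          # illustrative build context
    environment:
      CORE_DUMP_STORAGE_PATH_ROOT: /app/Dumps
      CORE_DUMP_EMBEDDING_URI: http://embedding:80
      CORE_DUMP_EMBEDDING_TOP_K: "5"
      CORE_DUMP_COMPANY_NAMESPACE: "scch::"
      SPRING_DATA_MONGODB_URI: mongodb://dump-suite-mongodb:27017/mydatabase
```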

Mapping and Preport Definition

In addition to the environment variables, a mapping file must be defined in the service (see package-repository-mapping.json), which describes which Debian package name maps to which code repository. The mapping file must follow the example JSON below (for each package name, a repository is defined):

{
  "debian-package-1" : "repo-name-1",
  "debian-package-2" : "repo-name-2",
  "debian-package-3" : "repo-name-3"
}

Furthermore, the preport script must be implemented (the original script could not be made available for publication). A skeleton of this script can be found in the sh directory of the service; it describes the required functionality of the individual functions, i.e. what information to extract using GDB, such as the backtrace. It is planned that this step will no longer be necessary in future releases, as a default implementation will be included.
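As a rough illustration, such functions might wrap GDB in batch mode as below. This is a hypothetical sketch, not the original script; the skeleton in the service's sh directory is authoritative. The function names and the EXECUTABLE, CORE_FILE, and OUT_DIR variables are assumptions, while the output file names follow the CORE_DUMP_BACKTRACE_FILE defaults above:

```shell
#!/bin/sh
# Hypothetical preport sketch: extract backtraces from a core dump with GDB.
# The caller is expected to set EXECUTABLE, CORE_FILE, and OUT_DIR.

# Write a per-thread backtrace of the core dump to backtrace.txt.
extract_backtrace() {
    gdb --batch -ex "thread apply all bt" "$EXECUTABLE" "$CORE_FILE" \
        > "$OUT_DIR/backtrace.txt"
}

# Write a full per-thread backtrace (including locals) to backtrace-full.txt.
extract_backtrace_full() {
    gdb --batch -ex "thread apply all bt full" "$EXECUTABLE" "$CORE_FILE" \
        > "$OUT_DIR/backtrace-full.txt"
}
```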

Starting the Application

After the application is configured, the images can be built via docker compose build. The containers are started via docker compose up.

The UI is then available at http://<host>:8080.
