Thu 18 Jul 2024 16:36 - 16:54 at Acerola - Log Analysis and Debugging Chair(s): Domenico Bianculli

Interactive debuggers are a powerful tool that helps developers locate and fix errors in software. Using the debug information embedded in binaries, debuggers can retrieve the program states needed for debugging. Unlike printf-style debugging, debuggers allow for more flexible inspection and modification of program execution states. However, debuggers may incorrectly retrieve or interpret program execution, causing confusion and hindering the debugging process.
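
For concreteness, this flexibility can be illustrated by scripting a debugger instead of sprinkling printf calls. The following is a minimal, hypothetical sketch (the binary name, variable, and value are illustrative assumptions, not taken from the paper): it uses GDB batch-mode commands to inspect and then modify a local variable at a breakpoint.

```python
# Illustrative sketch: unlike printf-style debugging, an interactive
# debugger can both inspect and modify program state at a breakpoint,
# relying on the debug information embedded in the binary.
import subprocess

# Assumed: ./test_prog is a C program compiled with `-g` whose main()
# declares an `int x`; the variable name and new value are hypothetical.
session = subprocess.run(
    ["gdb", "--batch",
     "-ex", "break main",      # debug info maps the source location to code
     "-ex", "run",
     "-ex", "print x",         # inspect the current value of x
     "-ex", "set var x = 42",  # modify program state on the fly
     "-ex", "continue",
     "./test_prog"],
    capture_output=True, text=True)
print(session.stdout)
```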

Despite the wide usage of interactive debuggers, no scalable and comprehensive assessment of their functional correctness exists yet. Existing works either fall short in scalability or focus on “compiler-side” defects rather than debugger bugs. To enable a better assessment of debugger correctness, we first propose and advocate a set of debugger testing criteria covering both comprehensiveness (in terms of debug information covered) and scalability (in terms of testing overhead). We then design comparative experiments showing that fulfilling these criteria is not only theoretically appealing but also brings major improvements to debugger testing. Based on these criteria, we present DTD, a differential testing framework for detecting bugs in interactive debuggers. DTD compares the behavior of two mainstream debuggers when processing an identical C executable; a discrepancy indicates a bug in one of the two debuggers.
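
At its core, such differential testing can be pictured as follows. This is a minimal sketch rather than DTD's actual implementation: it assumes a C test case compiled with debug information, drives GDB and LLDB in batch mode, queries the same variable at the same breakpoint, and flags any disagreement; the binary name, breakpoint, variable, and output parsing are all illustrative.

```python
# Minimal sketch of debugger differential testing (not DTD's actual code):
# run GDB and LLDB in batch mode on the same C binary, query the same
# variable at the same breakpoint, and report any disagreement.
import re
import subprocess

BINARY = "./test_prog"   # assumed: a C test case compiled with `-g -O0`
FUNC, VAR = "main", "x"  # illustrative breakpoint location and variable name

def query_gdb(binary, func, var):
    """Return GDB's view of `var` when stopped at `func`."""
    out = subprocess.run(
        ["gdb", "--batch",
         "-ex", f"break {func}", "-ex", "run", "-ex", f"print {var}",
         binary],
        capture_output=True, text=True).stdout
    m = re.search(r"\$\d+ = (.+)", out)              # GDB prints e.g. `$1 = 5`
    return m.group(1).strip() if m else None

def query_lldb(binary, func, var):
    """Return LLDB's view of `var` when stopped at `func`."""
    out = subprocess.run(
        ["lldb", "--batch",
         "-o", f"breakpoint set --name {func}", "-o", "run",
         "-o", f"frame variable {var}",
         binary],
        capture_output=True, text=True).stdout
    m = re.search(rf"{re.escape(var)} = (.+)", out)  # LLDB prints e.g. `(int) x = 5`
    return m.group(1).strip() if m else None

if __name__ == "__main__":
    gdb_val = query_gdb(BINARY, FUNC, VAR)
    lldb_val = query_lldb(BINARY, FUNC, VAR)
    if gdb_val != lldb_val:
        # A discrepancy means at least one of the two debuggers mis-reports
        # this program state and warrants manual inspection.
        print(f"discrepancy at {FUNC}/{VAR}: GDB={gdb_val!r}, LLDB={lldb_val!r}")
```

A real framework repeats this comparison over every breakpoint, stepping action, and variable a test program exposes, which is where the coverage and scalability criteria above come into play.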

DTD leverages a novel heuristic to skip the repetitive structures (e.g., loops) present in C programs, which enables it to achieve full debug-information coverage efficiently. We also design a Temporal Differential Filtering method that, in practice, filters out the false positives caused by uninitialized variables in common C programs. With these carefully designed techniques, DTD fulfills our proposed testing criteria and thus achieves high scalability and comprehensiveness. For the first time, it offers large-scale testing for C debuggers, detecting behavior discrepancies across millions of inspected program states. An empirical comparison shows that DTD finds 17× more error-triggering cases and detects 5× more bugs than the state-of-the-art debugger testing technique. We have used DTD to detect 13 bugs in the LLVM toolchain (Clang/LLDB) and 5 bugs in the GNU toolchain (GCC/GDB), and have pushed 4 bug fixes for the latest LLDB, one of which has already landed in the LLDB development branch.
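
One way to read the Temporal Differential Filtering idea (our interpretation, not the paper's algorithm): an uninitialized variable yields arbitrary values, so its reported value is generally not even stable across repeated runs under the same debugger, whereas a genuine debugger bug reproduces deterministically. The sketch below builds on the hypothetical `query_gdb`/`query_lldb` helpers from the previous snippet and suppresses discrepancies on unstable reads.

```python
# Hedged sketch of temporal filtering for uninitialized-variable noise
# (an assumption about the idea, not DTD's actual algorithm): report a
# GDB/LLDB discrepancy only if each debugger's own answer is stable
# across two runs; unstable answers suggest uninitialized memory.

def stable_value(query, binary, func, var):
    """Query the same debugger twice; keep the value only if it repeats."""
    first, second = query(binary, func, var), query(binary, func, var)
    return first if first == second else None   # None marks an unstable read

def check_discrepancy(binary, func, var):
    gdb_val = stable_value(query_gdb, binary, func, var)    # helpers from the sketch above
    lldb_val = stable_value(query_lldb, binary, func, var)
    if gdb_val is None or lldb_val is None:
        return None          # filtered: likely an uninitialized variable, not a debugger bug
    if gdb_val != lldb_val:
        return (func, var, gdb_val, lldb_val)    # candidate debugger bug
    return None
```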

Thu 18 Jul

Displayed time zone: Brasilia, Distrito Federal, Brazil

16:00 - 18:00
Log Analysis and Debugging (Research Papers / Industry Papers) at Acerola
Chair(s): Domenico Bianculli University of Luxembourg
16:00
18m
Talk
Go Static: Contextualized Logging Statement Generation
Research Papers
Yichen LI The Chinese University of Hong Kong, Yintong Huo The Chinese University of Hong Kong, Renyi Zhong The Chinese University of Hong Kong, Zhihan Jiang The Chinese University of Hong Kong, Jinyang Liu The Chinese University of Hong Kong, Junjie Huang The Chinese University of Hong Kong, Jiazhen Gu The Chinese University of Hong Kong, Pinjia He Chinese University of Hong Kong, Shenzhen, Michael Lyu The Chinese University of Hong Kong
16:18
18m
Talk
DeSQL: Interactive Debugging of SQL in Data-Intensive Scalable Computing
Research Papers
Sabaat Haroon Virginia Tech, Chris Brown Virginia Tech, Muhammad Ali Gulzar Virginia Tech
16:36
18m
Talk
DTD: Comprehensive and Scalable Testing for Debuggers
Research Papers
Hongyi Lu Southern University of Science and Technology/Hong Kong University of Science and Technology, Zhibo Liu The Hong Kong University of Science and Technology, Shuai Wang The Hong Kong University of Science and Technology, Fengwei Zhang Southern University of Science and Technology
16:54
9m
Talk
Decoding Anomalies! Unraveling Operational Challenges in Human-in-the-Loop Anomaly Validation
Industry Papers
Dong Jae Kim Concordia University, Steven Locke, Tse-Hsun (Peter) Chen Concordia University, Andrei Toma ERA Environmental Management Solutions, Sarah Sajedi ERA Environmental Management Solutions, Steve Sporea, Laura Weinkam
17:03
18m
Talk
A Critical Review of Common Log Data Sets Used for Evaluation of Sequence-based Anomaly Detection Techniques
Research Papers
Max Landauer AIT Austrian Institute of Technology, Florian Skopik AIT Austrian Institute of Technology, Markus Wurzenberger AIT Austrian Institute of Technology
17:21
18m
Research paper
LILAC: Log Parsing using LLMs with Adaptive Parsing Cache
Research Papers
Zhihan Jiang The Chinese University of Hong Kong, Jinyang Liu The Chinese University of Hong Kong, Zhuangbin Chen School of Software Engineering, Sun Yat-sen University, Yichen LI The Chinese University of Hong Kong, Junjie Huang The Chinese University of Hong Kong, Yintong Huo The Chinese University of Hong Kong, Pinjia He Chinese University of Hong Kong, Shenzhen, Jiazhen Gu The Chinese University of Hong Kong, Michael Lyu The Chinese University of Hong Kong
17:39
18m
Talk
TraStrainer: Adaptive Sampling for Distributed Traces with System Runtime State (Distinguished Paper Award)
Research Papers
Haiyu Huang Sun Yat-sen University, Xiaoyu Zhang HUAWEI CLOUD COMPUTING TECHNOLOGIES CO. LTD., Pengfei Chen Sun Yat-sen University, Zilong He Sun Yat-sen University, Zhiming Chen Sun Yat-sen University, Guangba Yu Sun Yat-sen University, Hongyang Chen Sun Yat-sen University, Chen Sun Huawei