Fri 19 Jul 2024 14:00 - 14:18 at Pitomba - Software Maintenance and Comprehension 4 Chair(s): Timo Kehrer

Neural code summarization leverages deep learning models to automatically generate brief natural language summaries of code snippets. The development of Transformer models has led to extensive use of attention during model design. While existing work has focused almost exclusively on static properties of source code and related structural representations such as the Abstract Syntax Tree (AST), few studies have considered human attention — that is, where programmers focus while examining and comprehending code. In this paper, we develop a method for incorporating human attention into machine attention to enhance neural code summarization. To realize this incorporation and validate its benefits, we introduce EyeTrans, which consists of three steps: (1) we conduct an extensive eye-tracking human study to collect and pre-analyze data for model training, (2) we devise a data-centric approach to integrate human attention with machine attention in the Transformer architecture, and (3) we conduct comprehensive experiments on two code summarization tasks to demonstrate the effectiveness of incorporating human attention into Transformers. Integrating human attention leads to an improvement of up to 29.91% in Functional Summarization and up to 6.39% in General Code Summarization performance, demonstrating the substantial benefits of this combination. We further explore robustness and efficiency by creating challenging summarization scenarios in which EyeTrans exhibits interesting properties. We also visualize the attention maps to show how incorporating human attention simplifies machine attention in the Transformer. This work has the potential to propel AI research in software engineering by introducing more human-centered approaches and data.
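EyeTrans's actual integration is data-centric and described in step (2) of the paper; purely as an illustrative sketch, one simple way to bias machine attention toward human attention is a convex combination of the model's attention weights with a token-level fixation distribution. The function name, the `alpha` mixing parameter, and the fixation values below are all hypothetical and not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_human_prior(Q, K, V, human_weights, alpha=0.5):
    """Single-head scaled dot-product attention whose weights are
    interpolated with a token-level human-attention distribution.

    Q, K, V: (n_tokens, d) arrays; human_weights: (n_tokens,) fixation
    density over tokens, assumed normalized to sum to 1.
    """
    d = Q.shape[-1]
    machine = softmax(Q @ K.T / np.sqrt(d))                # (n, n), rows sum to 1
    human = np.broadcast_to(human_weights, machine.shape)  # same prior for every query
    mixed = alpha * machine + (1.0 - alpha) * human        # still row-stochastic
    return mixed @ V, mixed

rng = np.random.default_rng(0)
n, d = 5, 8
Q, K, V = rng.normal(size=(3, n, d))
fixations = np.array([4.0, 1.0, 0.5, 2.0, 0.5])  # hypothetical gaze durations
human_weights = fixations / fixations.sum()
out, weights = attention_with_human_prior(Q, K, V, human_weights)
```

Because both terms of the combination are row-stochastic, the mixed weights remain a valid attention distribution; with `alpha=1.0` the sketch reduces to ordinary machine attention.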

Fri 19 Jul

Displayed time zone: Brasilia, Distrito Federal, Brazil

14:00 - 15:30
Software Maintenance and Comprehension 4
Research Papers / Demonstrations / Ideas, Visions and Reflections / Industry Papers at Pitomba
Chair(s): Timo Kehrer University of Bern
14:00
18m
Talk
EyeTrans: Merging Human and Machine Attention for Neural Code Summarization
Research Papers
Yifan Zhang Vanderbilt University, Jiliang Li Vanderbilt University, Zachary Karas Vanderbilt University, Aakash Bansal University of Notre Dame, Toby Jia-Jun Li University of Notre Dame, Collin McMillan University of Notre Dame, Kevin Leach Vanderbilt University, Yu Huang Vanderbilt University
14:18
18m
Talk
Predicting Code Comprehension: A Novel Approach to Align Human Gaze with Code Using Deep Neural Networks
Research Papers
Tarek Alakmeh University of Zurich, David Reich University of Potsdam, Lena Jäger University of Zurich, Thomas Fritz University of Zurich
14:36
18m
Talk
R2I: A Relative Readability Metric for Decompiled Code
Research Papers
Haeun Eom Sungkyunkwan University, Dohee Kim Sungkyunkwan University, Sori Lim Sungkyunkwan University, Hyungjoon Koo Sungkyunkwan University, Sungjae Hwang Sungkyunkwan University
14:54
9m
Talk
CognitIDE: An IDE Plugin for Mapping Physiological Measurements to Source Code
Demonstrations
Fabian Stolp Hasso Plattner Institute, University of Potsdam, Malte Stellmacher Hasso Plattner Institute, University of Potsdam, Bert Arnrich Hasso Plattner Institute, University of Potsdam
15:03
9m
Talk
The lion, the ecologist and the plankton: a classification of species in multi-bot ecosystems
Ideas, Visions and Reflections
Dimitrios Platis Neat, Linda Erlenhov Chalmers | University of Gothenburg, Francisco Gomes de Oliveira Neto Chalmers | University of Gothenburg
15:12
18m
Talk
S.C.A.L.E: a CO2-aware Scheduler for OpenShift at ING
Industry Papers
Jurriaan Den Toonder TU Delft & ING, Paul Braakman ING, Thomas Durieux TU Delft