Large Language Model-based suggestion of objective functions for search-based Product Line Architecture design

Dataset (modified 2024-08-13, 11:07)

This work delves into the complex domain of Product Line Architecture (PLA) optimization, a critical area of Software Engineering (SE) focused on improving the design and functionality of software product lines through variability management and reuse. A particular challenge in this area is the selection of appropriate objective functions, a core aspect of interactive Search-Based Software Engineering (SBSE) that strongly influences the success of search-based PLA design. Despite its importance, this task remains daunting due to the vast design space and the critical impact of the chosen objectives on the optimization outcome. Large Language Models (LLMs), particularly the Generative Pre-trained Transformer (GPT) series, have shown promising results in various SE tasks. This paper explores the integration of such LLMs, notably ChatGPT, into search-based PLA design. By leveraging LLMs' capacity to understand and generate human-like text, we investigate their potential to assist decision-makers (DMs) through a module that suggests objective functions, thereby simplifying and improving decision-making in PLA design optimization. Through exploratory tests combined with qualitative feedback from domain experts, this research illustrates the application of LLMs in SE, confirms ChatGPT's potential to suggest objective functions for search-based PLA design, and maps out the challenges and opportunities that lie ahead in fully harnessing LLMs for this task.
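As a rough illustration of how such a suggestion module might query an LLM, the sketch below sends a short PLA design context to a chat-completion API and asks for candidate objective functions. The prompt wording, model name, and the suggest_objective_functions helper are illustrative assumptions for this sketch, not the exact setup or prompts used in this work.

```python
# Minimal sketch (assumption): querying an LLM for objective-function suggestions
# in search-based PLA design. Prompt, model choice, and output handling are
# illustrative, not the configuration used in this study.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable


def suggest_objective_functions(pla_context: str, n_suggestions: int = 3) -> str:
    """Ask the model for candidate objective functions given a PLA description."""
    prompt = (
        "You are assisting a decision-maker in search-based Product Line "
        "Architecture (PLA) design. Given the PLA context below, suggest "
        f"{n_suggestions} candidate objective functions (for example, metrics on "
        "cohesion, coupling, or feature modularization), each with a brief rationale.\n\n"
        f"PLA context:\n{pla_context}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed model name for the sketch
        messages=[{"role": "user", "content": prompt}],
        temperature=0.2,  # keep suggestions relatively focused
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    context = (
        "Mobile media product line with variation points for photo, "
        "music, and video features."
    )
    print(suggest_objective_functions(context))
```

In an interactive SBSE loop, suggestions returned this way would be reviewed by the DM before being encoded as fitness functions for the search algorithm, keeping the human decision-maker in control of the optimization objectives.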