On the Implementation of a Cloud-Based Computing Test Bench Environment for Prolog Systems

Ricardo Gonçalves, Miguel Areias, Ricardo Rocha
Information, 2017
Software testing and benchmarking are key components of the software development process. Nowadays, a good practice in large software projects is the continuous integration (CI) software development technique. The key idea of CI is to let developers integrate their work as they produce it, instead of performing the integration only at the end of the development of each software module. In this paper, we extend previous work on a benchmark suite for the YAP Prolog system, and we propose a fully automated test bench
environment for Prolog systems, named Yet Another Prolog Test Bench Environment (YAPTBE), aimed at assisting developers in the development and CI of Prolog systems. YAPTBE is based on a cloud computing architecture and relies on the Jenkins framework as well as on a new Jenkins plugin to manage the underlying infrastructure. We present the key design and implementation aspects of YAPTBE and show its most important features, such as its graphical user interface (GUI) and the automated process that builds and runs Prolog systems and benchmarks.

Currently, there are several well-maintained Prolog systems, such as B-Prolog [3], Ciao Prolog [4], Mercury [5], Picat [6], SICStus [7], SWI-Prolog [8], XSB Prolog [9] and YAP Prolog [10]. This situation is entirely different from more recent languages, such as Java, Python or Perl, which either have a single implementation (Python and Perl) or are controlled centrally (Java implementations are only Java if they satisfy certain standards). The international standard for Prolog, ISO/IEC 13211 [11], was created to standardize Prolog implementations. However, because of the different sources of development, the standard is not completely implemented in most Prolog systems. The Prolog community knows that different Prolog systems have different dialects, with different syntax and different semantics for common features. A good example is Wielemaker's recent work on dictionaries and new string extensions to Prolog [12], which are not part of the ISO/IEC 13211 standard. A different direction is that followed by Wielemaker and Santos Costa [13,14], who studied the status of the standardization of Prolog systems and took a first step towards a new era of Prolog in which all systems are fully compliant with each other. While this new era has not been reached yet, every significant piece of publicly available Prolog code must be carefully examined for portability issues before it can be used in any Prolog system. This creates a significant obstacle when one wants to compare Prolog systems in terms of performance and/or correctness.

Benchmark suite frameworks for Prolog have been around for some time [15,16], and several still exist that are specifically aimed at evaluating Prolog systems. Two good examples are China [17] and OpenRuleBench [18]. China is a data-flow analyzer for constraint logic programming languages, written in C++, that performs bottom-up analysis, deriving information on both call patterns and success patterns by means of program transformations and optimized fixed-point computation techniques. OpenRuleBench is an open community resource designed to analyze the performance and scalability of different rule engines on a set of semantic Web information benchmarks.

In previous work, we developed a first benchmark suite framework based on the CI and black-box approaches to support the development of the YAP Prolog system [10]. This framework was very important, mainly to ensure YAP's correctness in the context of several improvements and new features added to its tabling engine [19,20]. The framework compares the outputs obtained by running benchmarks for specific Prolog queries and, when tabled evaluation is used, the answers stored in the table space. It also supports the different dialects of the B-Prolog and XSB Prolog systems. However, the framework still lacks important user productivity features, such as automation and a powerful graphical user interface (GUI).
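To make this concrete, the short Prolog sketch below shows the kind of tabled benchmark such a framework might run; it is an illustrative example of ours, not a benchmark taken from the suite. The :- table directive is the common tabling syntax of YAP, XSB and B-Prolog, and the hypothetical run_benchmark/0 driver prints the answers in a canonical order so that the textual output of different systems, or of different runs, can be compared.

    % Illustrative sketch only (not taken from the YAP benchmark suite): a small
    % tabled benchmark whose answers can be printed and compared across systems.
    :- table path/2.    % tabling directive, supported by YAP, XSB and B-Prolog

    edge(1, 2).
    edge(2, 3).
    edge(3, 1).

    path(X, Y) :- edge(X, Y).
    path(X, Y) :- edge(X, Z), path(Z, Y).

    % Hypothetical driver: collect all answers, sort them into a canonical order
    % and write them out, so the run's output can be diffed against a reference.
    run_benchmark :-
        findall(X-Y, path(X, Y), Answers),
        msort(Answers, Sorted),
        forall(member(A, Sorted), (write(A), nl)).

Note that, on this cyclic graph, the query only terminates under tabled evaluation, which is exactly the kind of behaviour that needs to be revalidated whenever new features are added to a tabling engine.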
In this paper, we extend the previous work and propose a fully automated test bench environment for Prolog systems, named Yet Another Prolog Test Bench Environment (YAPTBE), which aims to assist developers in the development and integration of Prolog systems [21]. YAPTBE is based on a cloud computing architecture [22] and relies on the Jenkins framework [23] and on a new Jenkins plugin to manage the underlying infrastructure. Jenkins is arguably one of the most successful open-source automation tools for managing a CI infrastructure. Jenkins, originally called Hudson, is written in Java and provides hundreds of plugins to support building, deploying and automating any project; it is used by software teams of all sizes, for projects in a wide variety of languages and technologies. YAPTBE includes the following features: (i) a GUI that coordinates all the interactions with the test bench environment; (ii) the definition of a cloud computing environment, including different computing nodes running different operating systems; (iii) an automated process to synchronize, compile and run Prolog systems against sets of benchmarks; (iv) an automated process to handle the comparison of output results and store them for future reference (a sketch of this step is given below); (v) smooth integration with state-of-the-art version control systems such as Git [24]; and (vi) a publicly available online version that allows anonymous users to interact with the environment and follow the state of the several Prolog systems. To the best of our knowledge, YAPTBE is the first environment specifically aimed at Prolog systems that supports all such features. For simplicity of presentation, we focus our description on the YAP Prolog system, but YAPTBE can be used with any other system.

The remainder of the paper is organized as follows. First, we briefly introduce some background on Prolog, tabled evaluation and Jenkins plugin development. Next, we discuss the key ideas of
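As an illustration of the output-comparison step in feature (iv), the sketch below shows one possible way to check the terms produced by a benchmark run against a stored reference run. The predicate and file names are hypothetical, and the code is only a sketch of the idea, not YAPTBE's actual implementation, which automates this step as part of its build process.

    % Hypothetical sketch of an output-comparison step: read back the terms
    % produced by the current run and by a stored reference run, canonicalize
    % their order, and report pass/fail. All names here are illustrative.
    compare_runs(CurrentFile, ReferenceFile, Result) :-
        read_terms(CurrentFile, Current),
        read_terms(ReferenceFile, Reference),
        msort(Current, Canonical),
        msort(Reference, CanonicalRef),
        ( Canonical == CanonicalRef -> Result = pass ; Result = fail ).

    % Read all Prolog terms from a file (open/3, read_term/3 and close/1 are ISO built-ins).
    read_terms(File, Terms) :-
        open(File, read, Stream),
        read_stream_terms(Stream, Terms),
        close(Stream).

    read_stream_terms(Stream, Terms) :-
        read_term(Stream, Term, []),
        (   Term == end_of_file
        ->  Terms = []
        ;   Terms = [Term|Rest],
            read_stream_terms(Stream, Rest)
        ).

A query such as ?- compare_runs('run.out', 'reference.out', Result). would then bind Result to pass or fail; in an automated setting, this is the kind of result that would be stored for future reference.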
doi:10.3390/info8040129