SGL: Deriving Test Case Generators using a Domain-Specific Language to Test Database Engines

Abstract

Various automated testing approaches have been proposed for Database Management Systems (DBMSs), which can automatically detect different kinds of bugs, such as logic and performance bugs. Such approaches typically compare the results of executing two equivalent queries, or two sequences of otherwise equivalent statements, on the same database state. Given that SQL dialects differ widely, these statement generators are typically written manually. For example, SQLancer, a popular DBMS-testing tool that provides multiple test oracles, includes more than 20 database and query generators, comprising more than 88,000 lines of Java code. It would be desirable to model these generators more concisely and to make them independent of the implementation details of the testing tool that uses them. However, existing grammar-based fuzzing approaches are inapplicable, as grammars lack features essential for DBMS testing tools, such as symbol relationships and repetition controls. In this work, we propose a domain-specific language to model database and query generators for automated testing tools, which we have termed SQL Generation Language (SGL). Furthermore, we present a tool named Seagull, which uses SGL specifications to produce database and query generators.
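To make the two missing grammar features concrete, the following is a minimal, hypothetical sketch (not SGL's actual syntax or Seagull's implementation) of a grammar-style query generator extended with symbol relationships (generated column references must name columns that actually exist in the table under test) and repetition controls (a bounded recursion depth so expression generation terminates):

```python
import random

# Assumed example schema for illustration only.
TABLE_COLUMNS = {"t0": ["c0", "c1"]}

def gen_expr(table, depth=0, max_depth=3):
    """Generate a random boolean SQL expression over the table's columns."""
    # Repetition control: beyond max_depth, only leaf expressions are allowed,
    # which guarantees that generation terminates.
    choices = ["leaf"] if depth >= max_depth else ["leaf", "and", "or", "not"]
    rule = random.choice(choices)
    if rule == "leaf":
        # Symbol relationship: only reference columns that exist in the table.
        col = random.choice(TABLE_COLUMNS[table])
        op = random.choice(["=", "<", ">", "IS"])
        val = "NULL" if op == "IS" else str(random.randint(-10, 10))
        return f"{col} {op} {val}"
    if rule == "not":
        return f"NOT ({gen_expr(table, depth + 1, max_depth)})"
    left = gen_expr(table, depth + 1, max_depth)
    right = gen_expr(table, depth + 1, max_depth)
    return f"({left}) {rule.upper()} ({right})"

def gen_query(table="t0"):
    """Generate a random WHERE-clause query against the given table."""
    return f"SELECT * FROM {table} WHERE {gen_expr(table)};"

if __name__ == "__main__":
    random.seed(0)
    for _ in range(3):
        print(gen_query())
```

A context-free grammar alone cannot express either constraint: it has no way to restrict a `column_name` nonterminal to the columns of the table being queried, nor to bound how often a recursive rule is expanded.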

Date
Nov 26, 2024 3:00 PM — 4:00 PM
Event
Weekly Talk
Location
COM3-02-59 - Meeting Rm 20
Tongjun Zhang
Undergraduate Student

Working on a Grammar-Based Fuzzer Framework for Generating SQL Test Cases From Grammar Specifications