Proposed Test Suite Design

Michael Hope (michaelh@juju.net.nz)
This article describes the goals, requirements, and suggested specification for a test suite for the output of the Small Device C Compiler (sdcc). Also included is a short list of prior art.
The main goals of a test suite for sdcc are:

1. To allow developers to run regression tests to check that core changes do not break any of the many ports.

2. To allow developers to verify individual ports.

3. To allow developers to test port changes.
This design only covers the generated code. It does not cover a test/unit test framework for the sdcc application itself.

One side effect of (1) is that it requires that the individual ports pass the tests originally. This may be too hard. See the section on Exceptions below.
The suite is intended to cover language features only. Hardware-specific libraries are explicitly not covered.
The ports often generate different code for handling different types (Byte, Word, DWord, and the signed forms). Meta information could be used to permute the different test cases across the different types.
The different ports are all at different levels of development. Test cases must be able to be disabled on a per-port basis. Permutations must also be able to be disabled at the port level for unsupported cases. Disabling, as opposed to enabling, on a per-port basis seems more maintainable.
The tests must be able to run unaided. The test suite must run on all platforms that sdcc runs on. A good minimum may be a subset of the Unix command set and common tools, provided by default on a Unix host and provided through Cygwin on Windows.
The test suites should be able to be sub-divided, so that the failing or interesting tests may be run separately.
The test code within the test cases should not generate artifacts. An artifact occurs when the test code itself interferes with the test and generates an erroneous result.
sdcc is a cross-compiler. As such, an emulator is needed for each port to run the tests.
DejaGnu is a toolkit written in Expect designed to test interactive programs. It provides a way of specifying an interface to the program and, given that interface, a way of stimulating the program and interpreting the results. It was originally written by Cygnus Solutions for running tests against development boards. I believe the gcc test suite is written against DejaGnu, perhaps partly to test the Cygnus ports of gcc on target systems.
I don't know much about the gcc test suite. It was recently removed from the gcc distribution due to issues with copyright ownership. The code I saw from older distributions seemed more concerned with esoteric features of the language.
The xUnit family, in particular JUnit, is a library of in-test assertions, test wrappers, and test suite wrappers designed mainly for unit testing. PENDING: More.
CoreLinux++ Assertion framework

While not a test suite system, the assertion framework is an interesting model for the types of assertions that could be used. They include pre-conditions, post-conditions, invariants, conditional assertions, unconditional assertions, and methods for checking conditions.
This specification borrows from the JUnit style of unit testing and the CoreLinux++ style of assertions. The emphasis is on maintainability and ease of writing the test cases.
PENDING: Align these terms with the rest of the world.

An assertion is a statement of how things should be. PENDING: Better description, an example.

A test point is the smallest unit of a test suite, and consists of a single assertion. The test point passes if the assertion holds.

A test case is a set of test points that test a certain feature.

A test suite is a set of test cases that test a certain area of functionality.
Test cases shall be contained in their own C file, along with the meta data on the test. Test cases shall be contained within functions whose names start with 'test' and which are descriptive of the test case. Any function that starts with 'test' will be automatically run in the test suite.

To make the automatic code generation easier, the C code shall have this format:

Test functions shall start with 'test' to allow automatic detection.

Test functions shall follow the K&R indentation style for ease of detection, i.e. the function name shall start in the left column on a new line below the return specification.
All assertions shall log the line number, function name, and test case file when they fail. Most assertions can have a more descriptive message attached to them. Assertions will be implemented through macros to get at the line information. This may cause trouble with artifacts.

The following definitions use C++ style default arguments where optional messages may be inserted. All assertions use double opening and closing brackets in the macros to allow them to be compiled out without any side effects. While this is not required for a test suite, they are there in case any of this code is incorporated into the main product.
Borrowing from JUnit, the assertions shall include:

FAIL((String msg = "Failed")). Used when execution should not get here.

ASSERT((Boolean cond, String msg = "Assertion failed")). Fails if cond is false. Parent to the more specific assertions below.

JUnit also includes many sub-cases of ASSERT, such as assertNotNull, assertEquals, and assertSame.
CoreLinux++ includes the extra assertions:

REQUIRE((Boolean cond, String msg = "Precondition failed")). Checks preconditions.

ENSURE((Boolean cond, String msg = "Postcondition failed")). Checks post-conditions.

CHECK((Boolean cond, String msg = "Check failed")). Used to call a function and to check that the return value is as expected, i.e. CHECK((fread(in, buf, 10) != -1)). Very similar to ASSERT, but the function still gets called in a release build.

FORALL and EXISTS. Used to check conditions within part of the code. For example, they can be used to check that a list is still sorted inside each loop of a sort routine.
All of FAIL, ASSERT, REQUIRE, ENSURE, and CHECK shall be provided as macros.
PENDING: It's not really meta data.

Meta data includes permutation information, exception information, and permutation exceptions.

Meta data shall be global to the file. Meta data names consist of the lower case alphanumerics. Test case specific meta data (fields) shall be stored in a comment block at the start of the file. This is only due to style.
A field definition shall consist of:

The name of the field.

A colon.

A comma separated list of values.

The values shall be stripped of leading and trailing white space.
Permutation exceptions are by port only. Exceptions to a field are specified by a modified field definition. An exception definition consists of:

The name of the field.

An opening square bracket.

A comma separated list of ports the exception applies for.

A closing square bracket.

A colon.

The values to use for this field for these ports.
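Putting the field and exception rules together, a meta data block might look like the fragment below. The field names and the z80 exception are illustrative only.

```c
/** Test unsigned comparison.
    type: char, int, long
    type[z80]: char, int
    class: "", static */
```

Here the `type` field normally permutes over three values, but on the z80 port the exception line restricts it to `char` and `int`.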
An instance of the test case shall be generated for each permutation of the test case specific meta data fields.

The runtime meta fields are:

port - The port this test is running on.

testcase - The name of this test case.

function - The name of the current function.

Most of the runtime fields are not very usable. They are there for completeness.
Meta fields may be accessed inside the test case by enclosing them in curly brackets. The curly brackets will be interpreted anywhere inside the test case, including inside quoted strings. Field names that are not recognised will be passed through, including the brackets. Note that it is therefore impossible to use some strings within the test case.

Test case function names should include the permuted fields in the name to reduce name collisions.
I don't know how to do pre-formatted text in LaTeX. Sigh.

The following code generates a simple increment test for all combinations of the storage classes and all combinations of the data sizes. This is a bad example, as the optimiser will often remove most of this code.
/** Test for increment.
    The Z80 port does not fully support longs (4 byte).
    type: char, int, long
    type[z80]: char, int
    class: "", register, static */
static void
testInc{class}{type}(void)
{
    {class} {type} i = 0;

    i++;
    ASSERT((i == 1));
}