
Figure 3. Test case generation for AFLS.

Figure 4. Test case seen in the signal builder block.

>> dssensoraflslog1 = sldvlogsignals('dssensorafls_harness')

>> save existingtestcaseforafls.mat dssensoraflslog1

5.1. Combined Harness Using Simulink Design Verifier

An extended harness can thus be created for the model during the test harness step, where the extra generated test cases help to increase the decision coverage of the model, as seen in Figure 6; these test cases can then be used during test analysis. For a test suite of a state chart to achieve full decision coverage, all entry and exit paths through a state must be exercised by test cases.
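This extension can also be scripted; a minimal sketch using the Simulink Design Verifier `sldvoptions` interface, assuming the logged signals saved above and a model named `dssensorafls` (both names follow the paper's examples and should be treated as placeholders):

```matlab
% Sketch: extend the previously logged test cases to raise coverage.
% Model and file names follow the paper's examples (assumptions).
opts = sldvoptions;                               % default analysis options
opts.Mode = 'TestGeneration';                     % generate new test cases
opts.ExtendExistingTests = 'on';                  % start from the logged data
opts.ExistingTestFile = 'existingtestcaseforafls.mat';
[status, files] = sldvrun('dssensorafls', opts);  % files.DataFile holds results
```

With `ExtendExistingTests` enabled, the analysis keeps the logged test cases and generates additional ones only for objectives they do not already cover.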

The model is first analyzed in Design Error Detection mode to check for errors such as division by zero or integer overflow; this analysis took 17 s and no errors were found. When Simulink Design Verifier is then used in Test Generation mode, transition, condition, and state coverage analyses are performed, taking 11 s. Of the 47 objectives for this model, only one is proven unsatisfiable; it is shown in Table 1.
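Both analyses can be run from the command line rather than the model menu; a minimal sketch, again assuming the model name `dssensorafls` from the earlier commands:

```matlab
% Design Error Detection: search for division-by-zero / overflow errors.
opts = sldvoptions;
opts.Mode = 'DesignErrorDetection';
[status, files] = sldvrun('dssensorafls', opts);

% Test Generation: derive test cases targeting coverage objectives.
opts.Mode = 'TestGeneration';
opts.ModelCoverageObjectives = 'ConditionDecision';  % decisions and conditions
[status, files] = sldvrun('dssensorafls', opts);
```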

An objective proven unsatisfiable may be due to one of the following reasons:

1) It often indicates the presence of dead code in the model.

2) It can also be caused by inactive blocks in the model.

3) In rare cases, the approximations performed by Simulink Design Verifier can make objectives impossible to achieve.

5.2. Design Validation Using Reactis

In Reactis, automatic test case generation and coverage analysis can be performed [10]. To limit the values considered for an input, a subrange of the base type must be specified in the Type for Port dialog box. Five tests comprising a total of 27 steps are performed on the model; the generated test suite achieves 100% decision coverage, 97% condition coverage, and 93% MC/DC coverage for the developed model, as seen in Figure 7.

The window seen in Figure 8 shows that there is no unreachable code in the model, but there are two uncovered sections of code.

Figure 5. Test harness.

Figure 6. Extended test cases.

Table 1. Objective proven unsatisfiable.

Reactis generated five test cases: the first contains a single sample point, the second 3 sample points, the third 7 sample points, and the remaining two 8 sample points each. All of these test cases produce correct results. When the tests developed in Reactis are run in Simulink, there is no error when interpolation is turned off; when interpolation is turned on, there are some differences between the Simulink and Reactis outputs, because Simulink then samples the inputs slightly differently from the original Reactis inputs, as shown in Figure 9 and Figure 10.

6. Test Analysis Using SystemTest

The SystemTest software provides a framework that integrates software, hardware, simulation, and other types of testing in a single environment [11]. Predefined elements are used to build test sections, simplifying the development and maintenance of standard test routines. Tests can be shared and saved throughout a development project to ensure standard, repeatable test verification. The software offers integrated data management and analysis capabilities for creating and executing tests and for saving test results, facilitating continuous testing across the development process. It automates testing in MATLAB and Simulink products, offers graphical test editing and repeatable test execution, and allows viewing the results through a workspace variable called stresults. This variable provides access to the test results object, which is useful for comparing the results of separate test runs and for post-processing

Figure 7. Test harness using Reactis.

Figure 8. Coverage analysis using Reactis.

Figure 9. Differences in output from Reactis and Simulink when interpolation is turned on.

Figure 10. Differences in output from Reactis and Simulink when interpolation is turned off.

test results. In SystemTest the final process of test evaluation is performed, where the test outputs are compared with the expected outputs. During iteration 1 the input test vector is [0.01 9 0.3; 0.02 5 0.2; 0.03 1 0.6], and for iteration 2 a [10 × 3] vector is taken as input; their results are seen in Figure 11 and Figure 12. The outputs generated in SystemTest are the same as the expected outputs from the developed model.
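The pass/fail evaluation performed here amounts to comparing actual and expected output matrices; a minimal MATLAB sketch of such a comparison, using the iteration-1 input vector from above (`runModel` and `loadExpected` are hypothetical placeholders for simulating the model and loading its baseline outputs, not functions from the paper):

```matlab
% Iteration 1 input test vector from the paper (3 steps x 3 signals).
in = [0.01 9 0.3; 0.02 5 0.2; 0.03 1 0.6];

actual   = runModel(in);        % placeholder: simulate the developed model
expected = loadExpected();      % placeholder: load the expected outputs

% Element-wise comparison with a small tolerance, since simulation
% outputs can differ by tiny numeric amounts between runs.
tol = 1e-6;
if all(abs(actual(:) - expected(:)) <= tol)
    disp('Test passed: outputs match expected values.');
else
    disp('Test failed: outputs differ from expected values.');
end
```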

7. Conclusion

Manual work has been pushed to the background, and software is taking ever greater control in the automotive industry. Model-Based Testing is a boon for rectifying errors in software with less time consumption, yielding products of high quality. This paper deals with the identification and removal of faults in automotive software development using Simulink Design Verifier and Reactis. Using these tools, 100% decision

Figure 11. Iteration 1 in SystemTest.

Figure 12. Iteration 2 in SystemTest.

coverage is achieved by the developed model. Reactis shows a slight improvement over Simulink Design Verifier, with condition coverage of 97% and modified condition/decision coverage of 93%. Additional test cases can be added to the test suite by adjusting some parameters in the tools to achieve full condition and MC/DC coverage. SystemTest has been used to display the outputs of the Adaptive Front Light System as graphs during the final test analysis procedure in Model-Based Testing. The proposed method thus speeds up the testing process. Today, software is being developed to regulate vehicle speed upon encountering unexpected obstacles while driving up or down mountainous terrain. Testing methods combined with Model-Based Testing must be developed for this type of software, which will be of great value for a safer ride, and testing skills must be upgraded accordingly.

NOTES

*Corresponding author.

Cite this paper
Sivakumar, P. , Vinod, B. , Devi, R. and Divya, R. (2016) Deployment of Effective Testing Methodology in Automotive Software Development. Circuits and Systems, 7, 2568-2577. doi: 10.4236/cs.2016.79222.
References
[1]   Pawel, S. and Gabriel, B. (2014) Model-Based Real-Time Testing of Embedded Automotive Systems. SAE International Journal, Passenger Cars—Electronic and Electrical Systems, 7, 337-344.
http://dx.doi.org/10.4271/2014-01-0188

[2]   Toshiaki, K. and Masato, S. (2008) Technical Trends and Challenges of Software Testing. Science & Technology Trends, Quarterly Review, No. 29, October 2008, 34-45.

[3]   Stürmer, I., Dziobek, C. and Pohlheim, H. (2008) Modeling Guidelines and Model Analysis Tools in Embedded Automotive Software Development. Dagstuhl-Workshop MBEES, Modellbasierte Entwicklung eingebetteter Systeme IV, Schloss Dagstuhl, Germany, 7-9 April 2008.

[4]   The MathWorks, Inc. (2015) Simulink Verification and Validation—User’s Guide, Version 2015b. The MathWorks, Inc., Natick, Massachusetts.

[5]   Reactive Systems, Inc. (2013) Testing and Validation of Simulink Models with Reactis. Reactive Systems, Inc., Cary, NC.

[6]   Miller, M.D. and Ulaszek, R.R. (2012) Model-based Testing Using Branches Decisions and Options. US Patent No. 8225288.

[7]   MISRA-C Guidelines (2007).
http://www.misra.org.uk

[8]   Justyna, Z., Ina, S. and Pieter, J.M. (2012) Model-Based Testing for Embedded Systems. CRC Press, Boca Raton.

[9]   Brett, M., Amory, W. and Jon, F. (2008) Best Practices for Verification, Validation, and Test in Model-Based Design. The MathWorks, Inc., Natick, Massachusetts.

[10]   Reactive Systems, Inc. (2015) Reactis—User Guide, Version 2015. Reactive Systems, Inc., Cary, NC.

[11]   The MathWorks, Inc. (2010) System Test User Guide, Version 2010a. The MathWorks, Inc., Natick, Massachusetts.

 
 