DYNAMIC EVALUATION FOR DESIGN VERIFICATION OF EMBEDDED SYSTEM APPLICATIONS




Professor and Head

Electronics and Communication Engineering

Matrusri College of Engineering
Saidabad, Hyderabad, Telangana 500059, India. [email protected]


Abstract: In the past decade, along with growing embedded-system complexity, the industry's view that current system design methods are inadequate and that verification issues must be addressed has gathered strength, mainly for economic reasons. This paper presents a practical application of a dynamic evaluation method for handling design verification issues in embedded system applications. This includes dedicated instrumentation for test case generation and evaluation. A practical case study of a signal processing system describes the dynamic evaluation with the support of dedicated instrumentation and functional test case generation.


Index Terms — Embedded Systems, Design verification
instrumentation, Dynamic evaluation, Input domain model




Introduction

Complex Embedded Systems (CES) are becoming omnipresent, and their complexity is growing at unprecedented rates in several market segments. The industry's view that current system design methods are inadequate and that verification issues must be addressed has gathered strength, mainly for economic reasons [1]. The CES design process is complex in a literal sense, and in the initial phase of development the uncertainty is highest [2-3]. In the initial stages of development the cost of rectifying defects is relatively low. Introducing system models and applying testing techniques much earlier in the lifecycle reduces the uncertainty and cost of development. Introducing formality at higher levels widens the choice of model-based tests, from specification to design and on to other levels of development. In [3] an incremental verification paradigm is proposed that introduces integration early in the design cycle and reduces uncertainty. Design verification is an essential step in the development of any system, especially for Complex Embedded Systems. Verification is the process of showing that a system meets its specification. If specifications or test oracles are sufficiently precise and informative, verification can be a more objective technique than validation, which checks the system against user requirements.

This paper proposes a dynamic evaluation method for design verification of an embedded system application. Section 2 describes related work. An overview of the dynamic evaluation method for design verification of embedded system applications is given in Section 3. Section 4 briefly introduces functional tests and describes the proposed procedure for test case generation. The example case study of a Harbour Surveillance System, described in Section 5, is used to illustrate the ideas proposed in this paper. A typical test sequence generated and the expected outputs are shown.

Related Works

A great deal of effort has been put into developing methodologies for all aspects of the embedded systems product development life cycle [4, 5]. The authors of [4] focus on product validation testing, combinatorial approaches and boundary conditions. Examining complex and embedded systems and software testing, they explain the use of simulation and emulation to complement testing. A practical, structured overview of embedded software testing is presented by the authors of [5], along with the ingredients of the testing process and test design techniques. In [6], referring to the research directions for integration in embedded and cyber-physical systems, the authors point out deficiencies in achieving high-quality, error-free systems integration. They suggest a fresh look at integration-associated challenges and provide a summary of current advances.

The verification and validation techniques that have been reported can be categorized as (a) language dependent [7], (b) special tool / framework [8-9], (c) commercial tools [10] and (d) instrumentation / environment [11]. Many different kinds of techniques and tools have been developed to answer specific testing concerns of embedded systems [8-9]. The author in his thesis [8] proposes visual modelling of the system along with its environment for complete verification of models and executable code. UPPAAL [9] verifies embedded systems that can be modeled as networks of suitably extended timed automata. Such approaches may work in the presence of the required specifications; they are not applicable in the many cases in which appropriate specifications do not exist.

The author of [10] describes "Test Weaver", a commercially available tool used by the automobile industry. The tool approaches the task of system test and validation through a combination of closed-loop system simulation and autonomous, intelligent exploration of the reachable system states. Published application reports indicate that the tool provides high test coverage with relatively low effort required for the test specification. In another example the authors present ESIDE [11], an integrated development environment for component-based embedded systems. Many researchers have created techniques for using dynamic execution information to verify system correctness against program properties [12]. In practice, a large percentage of faults involve problems with functional behavior, configuration errors, and misuse of resources in critical sections [13].

At present several researchers are pursuing verification issues of embedded systems. This paper is also based on dynamic evaluation for design verification, but it uses dedicated instrumentation and functional tests for test case generation. This method is useful for verifying complete embedded systems.

Dynamic Evaluation Method

The V-model constitutes the reference product development life cycle (PDLC) model for the development and testing of embedded systems in several application domains. System tests can be broadly categorized as (i) development and (ii) operation tests. Tests from specification through design to development fall under the "verification category". Verification is a process for determining the performance of a product design relative to the specifications on which the design was based [14]. More generally, it is the evaluation of whether or not a product, service, or system complies with a regulation, requirement, specification, or imposed condition. Though test generation techniques have been proposed for verification problems [15], testing alone as a means of verifying is rarely sufficient and cannot guarantee correctness. Design verification starts with an implementation and confirms that the implementation meets its specifications. The basic principle behind verification consists of two steps: the first is a transformation (verification transformation) from specifications to an implementation; in the second step, the result from the verification is compared with the result from the design to detect any errors. Obviously, the more different the design and verification paths, the higher the confidence the verification produces. In this paper dynamic evaluation is considered for achieving design verification of an embedded system under development. Dynamic evaluation is defined very broadly in [16] as evaluation in which either the object or the method of evaluation is changing and dynamic. For this paper, however, the more restricted definition is followed: "Dynamic evaluation is the process of verifying a system (referred to as the target system) while it is executing". A stepwise dynamic evaluation method for the verification of an embedded system is illustrated in figure 1. The 'design and prototype' step indicates that any general embedded system development method [17-18] or rapid prototyping [19] may be used. Prototyping itself is a development methodology according to IEEE, with clear advantages [20]. Alternatively a rapid prototype (temporary or limited design) may be used; both approaches have their merits [20-21]. Alternatively, an adaptive design method [22] that supports step-wise design and verification can be adopted if the requirements are hazy.

The 'design and prototype' step includes parallel development of the target modular hardware architecture and the application software. If the application (product) permits selection of a modular hardware architecture, then the prototype will eventually become the final system design. Development of software involves planning and sequencing the implementation of algorithms that are computationally intensive and require more resources, but also improve system performance. The prototype development, as mentioned earlier, will have multiple levels of integration phases [23] whatever the implementation architecture and technology, and these are to be supported during the dynamic evaluation. The focus of this paper is on design verification using the dynamic evaluation method. For verification, detailed observations of the embedded system's execution behaviour are checked at runtime against properties that specify the intended system behaviour. These properties generally come from the target application's requirement specification; sometimes the properties to be monitored may form the entire specification. The target embedded system itself must be instrumented to report events that are relevant for monitoring whether the properties are satisfied on a particular execution. This instrumentation is based on the architecture of the monitoring method.
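The runtime checking described above can be sketched as a small property monitor that receives reported events and flags violations. The event names and the two sample properties below are hypothetical stand-ins for properties taken from a target application's requirement specification:

```python
class PropertyMonitor:
    """Checks a stream of reported events against intended-behaviour
    properties. Event names and properties here are illustrative only."""

    def __init__(self):
        self.violations = []
        self._last_event_time = {}

    def report(self, event, timestamp):
        # Property 1 (ordering): 'detect' must never occur before 'acquire'.
        if event == "detect" and "acquire" not in self._last_event_time:
            self.violations.append("detect before acquire")
        # Property 2 (timing, hypothetical): consecutive 'acquire' events
        # must arrive within 5 ms of each other.
        if event == "acquire" and event in self._last_event_time:
            if timestamp - self._last_event_time[event] > 0.005:
                self.violations.append("acquire period exceeded")
        self._last_event_time[event] = timestamp

monitor = PropertyMonitor()
monitor.report("detect", 0.000)    # violates the ordering property
monitor.report("acquire", 0.001)
monitor.report("acquire", 0.010)   # 9 ms gap violates the timing property
print(monitor.violations)
```

In a real instrumented target, `report` would be driven by the monitoring ports rather than by direct calls.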

Figure 1: Stepwise dynamic evaluation method for the verification of an embedded system.
The architectural requirements for evaluation, in addition to the target embedded system prototype, are: (i) input test-sequences, (ii) monitoring and (iii) verification mechanisms. The test case generator takes input from the domain model of the application domain, or test cases can be generated directly from a model. The model should further describe the mapping from input values to properties: for each input element, the model defines what properties an execution on that input should satisfy. The test case generator generates input test-sequences and the expected response of the application. These architectural elements of embedded system evaluation are depicted in figure 2.
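A minimal sketch of such a generator follows, assuming a hypothetical domain model with min / typical / max values per input parameter and an assumed rule mapping each input to the property its execution should satisfy; the parameter names and the detection rule are illustrative, not taken from the paper:

```python
import itertools

# Hypothetical input domain model: each application input parameter with
# its minimum / typical / maximum values.
domain_model = {
    "snr_db":         [-5, 10, 30],
    "pulse_width_ms": [60, 120, 240],
    "range_m":        [100, 1900, 5000],
}

def expected_property(case):
    # Assumed mapping from input values to the property the execution
    # should satisfy: declare a target present when SNR is at least 0 dB.
    return "target_detected" if case["snr_db"] >= 0 else "no_detection"

def generate_test_cases(model):
    """Enumerate every combination of parameter values, paired with the
    property the execution on that input should satisfy."""
    names = sorted(model)
    for values in itertools.product(*(model[n] for n in names)):
        case = dict(zip(names, values))
        yield case, expected_property(case)

cases = list(generate_test_cases(domain_model))
print(len(cases))   # 3 * 3 * 3 = 27 combinations
```

Each generated pair is an input test-sequence specification plus its expected response, matching the two outputs of the test case generator described above.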

Figure 2: Architectural elements of embedded system evaluation.
A PC-based dedicated real-time Verification Instrumentation for Test and Evaluation (VITE) with the following features is reported by the authors [24]:

1. The test case generation architecture simulates and generates test case data off-line and inputs test sequence data in real time. An input domain model based on dynamic test scenarios is used for simulation in MATLAB.

2. The hybrid monitoring system architecture is distributed between the target embedded system boards and a separate, dedicated card-cage with its own mother board, as shown in figure 3.

3. The monitoring and acquisition port (MAP1) and handshake signals are terminated on the separate mother board of the target system card-cage. On the boards, the monitoring and acquisition port (MAP2) and handshake signals are terminated on a separate edge connector located at the top of the board, which need not be used in actual operation; similarly on the mother board. The software (code) does not interfere, as the MAPs are not in use during actual operation.

4. The instrumentation can monitor / acquire intermediate and other levels of outputs of the prototype system under evaluation. The instrument architecture is shown in figure 4.

Figure 3: Hybrid monitoring system architecture.

Figure 4: Instrument architecture.
This dedicated instrumentation is required for the dynamic evaluation of the prototype, along with test case generation and the expected output response of the prototype based on the input domain. Readers are referred to [24] for more details of the hardware and software, and of other data acquisition / monitoring for evaluation. In the next section an acoustic detection system case study is used to describe the functional test methodology and complete the dynamic evaluation of the embedded system.

Case Study: Acoustic Detection System
In most sonar systems, from harbour surveillance to ship systems [25-26], the all-important signal processing subsystem is the Acoustic Detection Subsystem, which normally operates in several modes of operation and is popularly called a Multi-mode Detection System (MDS). As per the selected mode of operation the sonar transmitter transmits (if active) a selected type of signal; otherwise only the passive signal is received by the receiver. The MDS operates on the data received from the Digital Receiving Beam-former (DRB) to detect target(s) and estimate target parameters. The main functions of the MDS are to (i) acquire data from the DRB, (ii) perform the detection algorithm based on the mode, (iii) perform the selected Constant False Alarm Rate (CFAR) algorithm, (iv) perform Post Detection Processing (PDP) and (v) generate target parameters. The MDS block diagram is given in figure 5. For more details on sonars readers are referred to [26].
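The five MDS functions form a pipeline, which can be sketched as follows. The function bodies are illustrative stand-ins (an energy detector, a simple cell-averaging CFAR, a hypothetical metres-per-cell mapping), not the actual MDS algorithms:

```python
def acquire_from_drb(beam):
    """(i) Acquire beam data from the Digital Receiving Beam-former."""
    return beam

def detect(beam, mode):
    """(ii) Mode-dependent detection; an energy detector as a stand-in."""
    return [v * v for v in beam]

def ca_cfar(det, window=4, scale=2.0):
    """(iii) Cell-averaging CFAR: flag cells exceeding a scaled local mean."""
    out = []
    for i, v in enumerate(det):
        train = det[max(0, i - window):i] + det[i + 1:i + 1 + window]
        out.append(v > scale * sum(train) / len(train))
    return out

def post_detection_processing(flags):
    """(iv) PDP: here, simply collect the flagged cell indices."""
    return [i for i, f in enumerate(flags) if f]

def target_parameters(hits, metres_per_cell=7.5):
    """(v) Map detected cells to (hypothetical) range estimates."""
    return [h * metres_per_cell for h in hits]

beam = [1.0] * 50
beam[20] = 10.0   # injected target return in cell 20
flags = ca_cfar(detect(acquire_from_drb(beam), mode="active"))
print(target_parameters(post_detection_processing(flags)))   # [150.0]
```

The structure, not the stand-in arithmetic, is the point: each stage consumes the previous stage's output exactly as in the block diagram.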


To state the Multi-mode Detection Subsystem functional requirement, the target detection problem can be written as a binary hypothesis test [27], where H0 indicates the absence of a target and H1 its presence:

H0: x(t) = r(t) + n(t)
H1: x(t) = e(t) + r(t) + n(t)

where x(t) is the observed or received signal, r(t) is the reverberation noise generated by the transmitted signal, and n(t) represents white noise. The signal transmitted in the active mode, s(t), can be a PCW / LFM / any other signal, and e(t) = f{s(t)} is the reflected echo of s(t). In passive mode s(t) = 0 and r(t) = 0; the signal of interest is the self-generated noise signature of the target, which is buried in n(t). The solution to this problem provides detection algorithms [28], under given assumptions of noise and reverberation statistics. Interested readers may refer to [29] for more information on models of noise and reverberation, and for some detection algorithms such as Replica Correlation. The generation of the signal x(t) by modeling covers the complete input domain of the application [28-29].
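Generating x(t) under the two hypotheses can be sketched as below. The sampling rate, amplitudes and the crude reverberation stand-in are assumptions for illustration, not the paper's models:

```python
import math, random

random.seed(0)
FS = 1000  # assumed sampling rate, Hz

def pcw_echo(n, f0=100.0, amp=1.0):
    """Reflected echo e(t) = f{s(t)} of a PCW pulse (assumed pure tone)."""
    return [amp * math.sin(2 * math.pi * f0 * k / FS) for k in range(n)]

def received_signal(n, target_present, echo_amp=1.0, noise_sigma=0.1):
    """x(t) under H0 (reverberation + noise) or H1 (echo added)."""
    noise = [random.gauss(0.0, noise_sigma) for _ in range(n)]           # n(t)
    reverb = [0.3 * random.gauss(0.0, noise_sigma) for _ in range(n)]    # r(t), crude stand-in
    echo = pcw_echo(n, amp=echo_amp) if target_present else [0.0] * n    # e(t)
    return [e + r + w for e, r, w in zip(echo, reverb, noise)]

x_h0 = received_signal(500, target_present=False)
x_h1 = received_signal(500, target_present=True)
energy = lambda x: sum(v * v for v in x)
# A strong echo makes the H1 signal energy exceed the H0 energy.
print(energy(x_h1) > energy(x_h0))
```

Any detector operating on x(t) is then evaluated by how reliably it separates the two hypotheses over many such generated realisations.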

The signal y(t) at the input of the Digital Receive Beamformer is transformed into beam outputs z(n, zm), where zm is the pointing angle of the main response axis of the beam and m is an integer indicating the number of parallel beams formed. The various beam outputs that are inputs to the Multi-mode Detection Subsystem are represented by xm(t), and x(t) represents any of the beam outputs. In passive mode no active signal s(t) is transmitted. As x(t) and y(t) are related to each other linearly, the modelling and generation of y(t) serves as the test sequence. However, some components of y(t) are characterized in a statistical sense only, e.g., noise or reverberation. The performance parameters of the Multi-mode Detection Subsystem are statistical quantities of the application context, viz. the ocean environment's physical phenomena, characterized by the noise, the reverberation and, during active mode, the echo. These models are required for generating the received signal y(t) or x(t). This requires modeling and generating the noise, reverberation and echo signals; several parameters are involved, giving rise to typically several sets of test cases whose parametric values range over minimum, typical and maximum. The modelling issues are very similar, and details are given in [29]. Figure 6 illustrates input data generation for the MDS.

Figure 6: Input data generation for the MDS.
Case Study: Test-Data Sequence Generation

The input model generates one beam, one PRI of data belonging to a single target, and the multi-beam model generates single-target multi-beam data for storage on the PC. The VITE [24] can input data to the MDS or its prototype (PROTO)i and acquire the output data; alternatively, the data can be stored in a file. This data can be taken as an input by the application software for evaluating a detection algorithm, and the output data can be observed. The data is generated offline using MATLAB. Examples of data sequences generated for typical operational conditions are shown in this section. Only the Gaussian ambient noise input signal with TVG is shown in Figure 7(a). The x-axis is time and the y-axis shows amplitude in numerical values. In Figure 7(b) ambient noise plus reverberation for a PCW signal is shown; the axis markings are similar. The pulse width used is 60 ms and data is generated for one PRI of 3.75 seconds.
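The paper generates this data in MATLAB; a sketch of the same idea (Gaussian ambient noise over one 3.75 s PRI with a time-varying gain applied) is shown below. The sampling rate and the logarithmic TVG curve are assumptions for illustration:

```python
import math, random

random.seed(1)
FS = 1000          # assumed sampling rate, Hz
PRI_S = 3.75       # pulse repetition interval from the text, seconds
N = int(FS * PRI_S)

# Gaussian ambient noise for one PRI, as in Figure 7(a)
noise = [random.gauss(0.0, 1.0) for _ in range(N)]

def tvg_gain(k):
    """Hypothetical time-varying gain: a logarithmic spreading-loss
    compensation curve, increasing with time within the PRI."""
    t = (k + 1) / FS
    return 1.0 + 2.0 * math.log10(1.0 + t)

tvg_noise = [tvg_gain(k) * noise[k] for k in range(N)]
print(len(tvg_noise))   # 3750 samples for one PRI
```

The same structure extends to the reverberation and echo components; only the per-sample model changes.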

Figure 7(a): Ambient noise input; (b) Ambient noise input signal with TVG.

Figure 8(a): PCW echo; (b) LFM echo with noise.


Figure 8 shows typical echoes generated by the transmitted PCW and LFM types of signals for a PW of 120 ms with additive noise. The ambient noise plus reverberation (LFM, PW = 120 ms) with TVG data is given as an input to the correlator and the cell-averaging constant false alarm rate (CA-CFAR) threshold computation algorithms. The input data and the correlator output are shown in blue in figure 9; the computed threshold is plotted in the same figure in red. The expected data is on the left side of the figure and the actual outputs are on the right side. It is to be noted that the input data is first generated using MATLAB and then modified for input to the target system.

Figure 9: Correlator input data, correlator output (blue) and computed CA-CFAR threshold (red).
The MDS requires beamformer output data (beam / channel data) as input. Taking the transmitted signal as an LFM of 120 ms and a target echo at 500 m, the simulated data generated for channel 4 (beam 4) is shown in figure 10(a). The data generated for three adjacent channels using this is shown in figures 10(b) to (d). The ambient noise plus reverberation (LFM, PW = 120 ms) with TVG plus echo data generated via MATLAB is shown in figure 11(a). This data is converted to digital form and given as an input to the target board for computing the Replica Correlator with exact reference, and the output data is plotted in blue in figure 11(b). The CA-CFAR threshold computation algorithm is applied to the correlator output data and the threshold is computed. The computed threshold data is plotted in red in figure 11(c). The target echo presence at 1900 m is clearly shown in the figure. Note that the x-axis is in sample numbers and the y-axis is the value in 16-bit format.
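The replica correlation plus CA-CFAR thresholding described above can be sketched end to end. The pulse here is a pseudo-random stand-in for the actual LFM transmission, and the injection point, guard / training window sizes and scale factor are illustrative assumptions:

```python
import random

random.seed(2)

def replica_correlate(x, replica):
    """Replica (matched-filter) correlation of received data x against the
    exact replica of the transmitted pulse."""
    n, m = len(x), len(replica)
    return [sum(x[i + j] * replica[j] for j in range(m)) for i in range(n - m + 1)]

def ca_cfar_threshold(y, guard=20, train=16, scale=3.0):
    """Cell-averaging CFAR threshold: `scale` times the mean magnitude of
    the training cells around each cell; guard cells (spanning the pulse
    extent) are excluded from the average."""
    thr = []
    for i in range(len(y)):
        lo, hi = max(0, i - guard - train), min(len(y), i + guard + train + 1)
        cells = [abs(y[j]) for j in range(lo, hi) if abs(j - i) > guard]
        thr.append(scale * sum(cells) / len(cells))
    return thr

# Hypothetical test sequence: Gaussian noise with a scaled replica-shaped
# echo injected at sample 200.
replica = [random.gauss(0.0, 1.0) for _ in range(20)]
x = [random.gauss(0.0, 0.2) for _ in range(400)]
for j, r in enumerate(replica):
    x[200 + j] += 3.0 * r

corr = replica_correlate(x, replica)
thr = ca_cfar_threshold(corr)
detections = [i for i, (c, t) in enumerate(zip(corr, thr)) if c > t]
print(any(198 <= i <= 202 for i in detections))
```

The guard window is sized to cover the pulse extent so that the echo's own correlation sidelobes do not inflate the training-cell average, mirroring the role of the threshold curve in figure 11(c).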


This paper proposed a dynamic evaluation method for the design verification of an embedded system application. The dynamic evaluation architecture is described; it involves test case generation based on functional testing. An overview of the basic steps of embedded system test case generation is given. The example case study of a Harbour Surveillance System is used to illustrate the ideas presented. The test cases generated and the design verification outputs for certain input data are presented.


References

1. RTI. The Economic Impacts of Inadequate Infrastructure for Software Testing. Technical report, National Institute of Standards and Technology, May 2002.

2. Zimmermann, Markus, et al. "On the design of large systems subject to uncertainty." Journal of Engineering Design 28.4 (2017): 233-254.

3. Hara Gopal Mani Pakala (2017). An Incremental Verification Paradigm for Embedded Systems. 40-49. 10.1007/978-981-10-5427-3_5

4. Kim H. Pries, Jon M. Quigley, Testing Complex and Embedded Systems. CRC Press, Taylor & Francis Group, 2010.

5. Bart Broekman and Edwin Notenboom. Testing Embedded Software. Addison-Wesley, 2003.

6. Feiler, Peter, et al. "Architecture-led Diagnosis and Verification of a Stepper Motor Controller." 8th European Congress on Embedded Real Time Software and Systems (ERTS 2016), 2016.

7. Elshuber, Martin, Susanne Kandl, and Peter Puschner. "Improving System-Level Verification of SystemC Models with SPIN." 1st French Singaporean Workshop on Formal Methods and Applications (FSFMA 2013). Eds. Christine Choppy and Jun Sun. Vol. 31. Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik, 2013.

8. Nokovic, Bojan. Verification and Implementation of Embedded Systems from High-Level Models. Diss., 2016.

9. G. Behrmann, A. David, and K. G. Larsen. A Tutorial on UPPAAL. http://www.uppaal.com, 2004.

10. Tatar, Mugur, and Jakob Mauss. "Systematic Test and Validation of Complex Embedded Systems." ERTS-2014, Toulouse (2014): 05-07.

11.  Nicholas T.
Pilkington, Juncao Li, and Fei Xie, “ESIDE: An Integrated Development
Environment for Component-Based Embedded Systems” 33rd Annual IEEE Intl.
Conference COMPSAC 2009. http://www.computer.org/portal/web/csdl/doi/10.1109/COMPSAC.2009.48

12.  Willem Visser,
Klaus Havelund, Guillaume Brat, SeungJoon Park, and Flavio Lerda. Model
checking programs. Automated Software Engineering, 10:203–232, 2003.

13.  Nigel Jones. A
taxonomy of bug types in embedded systems, October 2009. http://embeddedgurus.com/stack-overflow/-2009/10/a-taxonomy-of-bugtypes-in-embedded-systems/

14.  System
verification: proving the design solution satisfies the requirements, by
Jeffrey O. Grady, 2007, Elsevier Inc.

15.  Prab Varma, “Design Verification Problems: Test To The
Rescue?” Proceedings of the IEEE International Test Conference 2003

16.  Web Dictionary of
Cybernetics and Systems: http://pespmc1.vub.ac.be/ASC/Dynami_Evalu.html

17.  Xuan F Zha and Ram
D Sriram “Platform-based product design and development: A knowledge-intensive
support approach”, Knowledge-Based Systems, Volume 19, Issue 7, November 2006,
Pages 524-543

18.  Lesley Shannon,
Blair Fort, Samir Parikh, Arun Patel, Manuel Salda˜na and Paul Chow, “A System
Design Methodology For Reducing System Integration Time And Facilitating
Modular Design Verification”. FPL2006

19.  Rui Wang and
Shiyuan Yang, “The Design of a Rapid
Prototype Platform for ARM Based Embedded System”, IEEE Transactions on
Consumer Electronics, Vol. 50, No. 2, MAY 2004, p746-751.

20.  Fabrice Kordon and
Luqi, “An Introduction to Rapid System Prototyping”, IEEE Transactions On
Software Engineering, Vol. 28, No. 9, September 2002, p817-821.

21.  Baldwin, Carliss
Y. and Kim B. Clark. “Modularity in the Design of Complex Engineering
Systems.” In Complex Engineered Systems: Science Meets Technology, edited
by Ali Minai,  Dan Braha and Yaneer Bar
Yam. Springer-Verlag, 2006.

22. Hara Gopal Mani Pakala, Dr PLH VaraPrasad, Dr. Raju KVSVN, and Dr. Ibrahim Khan, "An Adaptive Design Verification Methodology for Embedded Systems", International Journal of Ad hoc, Sensor & Ubiquitous Computing (IJASUC), Vol. 2, No. 3, September 2011.

23.  Abel Marrero
P´erez, Stefan Kaiser, “Integrating Test Levels for Embedded Systems”,
Proceedings TAIC-PART’09. 2009 Testing: Academic and Industrial Conference –
Practice and Research Techniques, p184-193.

24. Hara Gopal Mani Pakala, Dr PLH VaraPrasad, Dr. Raju KVSVN, and Dr. Ibrahim Khan, "Development of Instrumentation for Test and Evaluation of a Signal Processing System". Journal of the Instrument Society of India, 40 (1), March 2012.

25.  Harbor
Surveillance Systems – http://www.dsit.co.il/siteFiles/1/84/5379.asp  

26. Ashley Waite, "Sonar for Practising Engineers", 3rd edition, John Wiley & Sons Ltd, 2002.

27.  Michael A.
Ainslie, “Principles of Sonar Performance Modeling”. Springer–Praxis Books,

28.  PC Etter –
Underwater acoustic modeling and simulation. Spon Press; 3 edition (April 4,

29.  Pascal A.M. de
Theije, Hans Groen, “Multi static Sonar Simulations with SIMONA” Proceedings of
9th International Conference on Information Fusion, July 2006,
Florence, Italy.