Using SRTM
SRTM (System Response Time Measurement) is an RTAPI tool that measures the timer latency observed by an application.
Usage
srtm [/?] [/h] [/s] [/1] [/f]
[/n num]
seconds_to_sample
Parameters
/h
Display histogram (in addition to summary)
/s
Turn on sound (square wave driven by timer)
/1
Use a 10 ms timer period (default is 1 ms)
/f
Use the fastest available timer period (1 ms or better)
/n num
Make SRTM aware of multiple concurrent instances, where num is the total number of SRTM instances running
seconds_to_sample
Duration in seconds to sample timer response latencies.
/?
Help on usage
If no parameters are given, the default is srtm /h /f 60
Remarks
SRTM is also provided as sample code.
The real-time timer latency observed by an application is made up of hardware and software latency:
- Hardware latency is the time it takes for a timer signal to be recognized by the HAL Interrupt Service Routine (ISR).
- Software latency is the time it takes from the ISR to the corresponding routine running in an Interrupt Service Thread (IST).
To mitigate the effects of hardware latency on real-time timer latency and timekeeping, eRTOS uses a timer tick compensation algorithm. The algorithm uses the Time Stamp Counter (TSC) readings taken in the previous and current ISRs to calculate the number of ticks that actually elapsed. The ISR then uses this calculated number of ticks (instead of assuming 1 tick) to check user timer expiration and to increment real-time. If an SMI or bus contention occurs early in the user timer period, the timer handling routine is still called on time. If it occurs late in the period, the current call is late, but the subsequent call occurs on time because fewer ticks remain before expiration.
SRTM calculates the timer latency by subtracting the expected time from the time returned by RtGetClockTime. The expected time is always advanced by the user timer period from its previous value, rather than being set to the previously observed time plus the period. Without timer tick compensation, the time returned by RtGetClockTime may run slow compared with universal time. However, because real-time always increments by 1 tick, such time drift is not reflected in the difference between the expected time and the time returned by RtGetClockTime, so SRTM results may not fully reflect the side effects of SMI or bus contention. With timer tick compensation, the time returned by RtGetClockTime is much closer to universal time, so SRTM fully reflects those side effects.
Timer tick compensation is enabled by default. It decreases how often user timer latency jitter occurs, but it does not reduce the worst-case magnitude of the jitter (for example, when an SMI or bus contention occurs during the last tick of the user timer period).
Related topics: