Confusion about the delay of a digitized signal
In the PSIM schematic shown below, a user reported an unexpected delay time at the output of the “Unit Delay” block.
In this schematic, the sampling frequency of the “Unit Delay” block is set to 1/(100 nsec), so the expected delay is 100 nsec.
The simulation result of this circuit is shown below. The measured delay is 150 nsec instead of 100 nsec.
Is this a bug in the PSIM simulation? No, it is not. The explanation follows.
Let us add a sinusoidal wave to illustrate how the signals are digitized.
The simulation result of this schematic is:
Both the square wave and the sinusoidal wave are sampled at each sampling point. The 100 nsec delay starts from the sampling point, not from the square-wave edge. Because the square-wave edge falls between two sampling points, the time between the edge and the next sampling point appears as an additional 50 nsec of “delay”.
The “Unit Delay” block digitizes its input at the specified sampling frequency: it reads the input only at each sampling point and is “blind” to any changes between sampling points. The delay therefore starts at the sampling point that first captures the change, not at the instant the input signal changes.
The observed delay is therefore the delay time set in the parameter plus the time between the square-wave edge and the next sampling point: 100 nsec + 50 nsec = 150 nsec.
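The behavior above can be reproduced with a minimal sketch (this is illustrative Python, not PSIM code; the timing values mirror the example: a 100 nsec sampling period and a square-wave edge assumed to fall at 250 nsec, midway between two sampling points):

```python
TS_NS = 100    # sampling period of the unit-delay block, in nsec
EDGE_NS = 250  # assumed square-wave rising edge, between two sampling points

def square(t_ns):
    """Continuous input: 0 before the edge, 1 after it."""
    return 1 if t_ns >= EDGE_NS else 0

def unit_delay_output(t_ns):
    """One-sample delay: outputs the input value captured one sampling
    period before the most recent sampling point. Between sampling
    points the block is 'blind' to the input."""
    n = t_ns // TS_NS               # index of the most recent sampling point
    return square((n - 1) * TS_NS)  # value sampled one period earlier

# The edge at 250 ns is first captured at the 300 ns sample, and the
# one-sample delay makes the output switch at the 400 ns sample:
# observed delay = 400 - 250 = 150 ns, not the 100 ns parameter value.
for t_ns in (250, 300, 350, 400, 450):
    print(t_ns, unit_delay_output(t_ns))
```

Sweeping the edge position between two sampling points shows the extra “delay” varies from 0 to one full sampling period, which is why the measured value depends on where the input edge lands relative to the sampling clock.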