The system measures timing errors between signals produced by three asynchronous time-code generators, resolving errors between 1-second clock pulses to 2 microseconds. The basic principle of the timing-error computation is as follows: the central processing unit in a microcontroller constantly monitors the time data received from the time-code generators for changes in the 1-second time-code intervals. In response to any such change, the microprocessor buffers the count of a 16-bit internal timer.
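
The minimal C sketch below illustrates that principle under assumptions not stated in the brief: a free-running 16-bit timer whose count advances every 2 microseconds, and hypothetical accessors read_seconds_field() and read_capture_timer() standing in for the real time-code and timer hardware (simulated here with small fixed skews so the program runs on its own).

/* Sketch of the measurement principle: watch each generator's 1-second
 * time-code value, latch the 16-bit timer count when it changes, and take
 * differences of latched counts to get timing errors in 2-us ticks.
 * read_seconds_field() and read_capture_timer() are hypothetical names,
 * not taken from the original system. */
#include <stdint.h>
#include <stdio.h>

#define NUM_GENERATORS 3
#define TIMER_TICK_US  2u              /* assumed resolution: 2 microseconds per timer tick */

static uint16_t timer_count = 0;       /* free-running 16-bit internal timer (simulated) */

static uint16_t read_capture_timer(void) { return timer_count; }

/* Simulated generators: generators 1 and 2 roll their 1-second field over a
 * few microseconds after generator 0, giving small nonzero errors. */
static uint32_t read_seconds_field(int g, uint32_t elapsed_us)
{
    static const uint32_t skew_us[NUM_GENERATORS] = {0u, 6u, 14u};
    if (elapsed_us < skew_us[g])
        return 0u;
    return (elapsed_us - skew_us[g]) / 1000000u;
}

int main(void)
{
    uint32_t last_seconds[NUM_GENERATORS] = {0};
    uint16_t capture[NUM_GENERATORS]      = {0};

    /* Poll for just over one simulated second. */
    for (uint32_t elapsed_us = 0; elapsed_us < 1000050u; elapsed_us += TIMER_TICK_US) {
        timer_count++;                              /* timer advances every 2 us */

        for (int g = 0; g < NUM_GENERATORS; g++) {
            uint32_t s = read_seconds_field(g, elapsed_us);
            if (s != last_seconds[g]) {             /* 1-second time-code value changed */
                last_seconds[g] = s;
                capture[g] = read_capture_timer();  /* buffer the timer count */
            }
        }
    }

    /* Error between generators = difference of buffered counts, scaled by the
     * tick period; unsigned 16-bit subtraction tolerates timer wraparound. */
    for (int g = 1; g < NUM_GENERATORS; g++) {
        uint16_t delta = (uint16_t)(capture[g] - capture[0]);
        printf("generator %d vs generator 0: %u microseconds\n", g, delta * TIMER_TICK_US);
    }
    return 0;
}

With the simulated skews above, the program reports 6 and 14 microseconds for generators 1 and 2 relative to generator 0; in the real system the captures would come from hardware rather than the polling loop shown here.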





    Title: System Measures Errors Between Time-Code Signals

    Contributors:

    Published in:

    Publication date: 1993-09-01

    Type of media: Miscellaneous

    Type of material: No indication

    Language: English