Systems and techniques are provided for performing object detection using a machine learning model with a transformer architecture. An example method can include receiving a plurality of tokens corresponding to segmented sensor data; identifying, by a halting module within the machine learning model, at least one halted token from the plurality of tokens, wherein the at least one halted token is excluded from a plurality of non-halted tokens provided as input to a subsequent layer during inference of the machine learning model; and detecting, by the machine learning model, at least one detected object based at least on the plurality of non-halted tokens.
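The abstract describes per-token halting during transformer inference but gives no implementation details. Below is a minimal sketch of the idea, assuming a PyTorch-style encoder stack; the names HaltingModule, HaltingEncoder, and halt_threshold, as well as the sigmoid scoring head, are illustrative assumptions and not taken from the patent.

```python
# Sketch: per-token halting between transformer layers at inference time.
# Halted tokens are dropped before the next layer; the surviving (non-halted)
# tokens would feed a downstream detection head.
import torch
import torch.nn as nn


class HaltingModule(nn.Module):
    """Scores each token; tokens whose halt probability exceeds a threshold are halted."""

    def __init__(self, d_model: int, halt_threshold: float = 0.5):
        super().__init__()
        self.scorer = nn.Linear(d_model, 1)
        self.halt_threshold = halt_threshold

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (num_tokens, d_model) -> boolean mask of tokens to keep
        halt_prob = torch.sigmoid(self.scorer(tokens)).squeeze(-1)
        return halt_prob < self.halt_threshold


class HaltingEncoder(nn.Module):
    """Encoder stack in which halted tokens are excluded from subsequent layers."""

    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            for _ in range(num_layers)
        )
        self.halting = nn.ModuleList(
            HaltingModule(d_model) for _ in range(num_layers)
        )

    @torch.no_grad()
    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (num_tokens, d_model), e.g. embeddings of segmented sensor data
        for layer, halt in zip(self.layers, self.halting):
            keep = halt(tokens)        # identify non-halted tokens
            tokens = tokens[keep]      # halted tokens are excluded here
            if tokens.numel() == 0:
                break
            tokens = layer(tokens.unsqueeze(0)).squeeze(0)
        return tokens                  # non-halted tokens for the detection step


# Example: 128 tokens of segmented sensor data with 64-dim embeddings.
encoder = HaltingEncoder()
features = encoder(torch.randn(128, 64))
print(features.shape)  # typically fewer than 128 tokens remain after halting
```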


    Title:

    TRANSFORMER ARCHITECTURE THAT DYNAMICALLY HALTS TOKENS AT INFERENCE


    Contributors:
    YE MAO (author)

    Publication date:

    2024-05-09


    Type of media:

    Patent


    Type of material:

    Electronic Resource


    Language:

    English


    Classification:

    IPC:    G06N COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS / B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION


