The first one is AlexNet. It uses ReLU as the activation function instead of Sigmoid, alleviating the gradient diffusion problem as the network gets deeper, and it introduces the dropout mechanism to prevent overfitting. Moreover, the LRN layer is proposed to create a competition mechanism among the activities of local neurons: responses with larger values are amplified while neurons with smaller feedback are inhibited, which enhances the generalization ability of the model. The second one is VGG. It drops LRN and uses ReLU in all hidden layers, saving time and memory, and it employs smaller convolution kernels, which reduces the number of parameters and allows the network to be built much deeper. The last one is ResNet50. ResNet introduces a novel structure in which a shortcut connects the input of a residual block directly to its output, offering a new way to address the gradient diffusion problem. The simplest of the three is AlexNet: it meets the requirement fastest compared with the more complex VGG and ResNet. Also, because our dataset is simple, performance does not improve noticeably as the complexity of the network structure increases. As a consequence, AlexNet may be the optimal choice in these application scenarios.
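As a rough illustration of the shortcut connection mentioned above, the following sketch (assuming PyTorch; the channel count and layer sizes are illustrative and not taken from the paper) shows a residual block that adds its input directly to its output before the final activation.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = ReLU(F(x) + x)."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        identity = x                              # shortcut: keep the block input
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity                      # add the input directly to the output
        return self.relu(out)

# Usage: a feature map passes through the block with its shape unchanged.
block = ResidualBlock(channels=64)
y = block(torch.randn(1, 64, 32, 32))
print(y.shape)  # torch.Size([1, 64, 32, 32])

Because the shortcut carries the input forward unchanged, gradients can flow around the convolutional layers during backpropagation, which is how this structure mitigates gradient diffusion in very deep networks.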
License Plate Recognition Based on Three Different Neural Networks