This paper addresses the challenges of mining latent patterns and modeling
contextual dependencies in complex sequence data. A sequence pattern mining
algorithm is proposed by integrating Bidirectional Long Short-Term Memory
(BiLSTM) with a multi-scale attention mechanism. The BiLSTM captures both
forward and backward dependencies in sequences, enhancing the model’s ability
to perceive global contextual structures. At the same time, the multi-scale
attention module assigns adaptive weights to key feature regions under
different window sizes. This improves the model’s responsiveness to both local
and global important information. Extensive experiments are conducted on a
publicly available multivariate time series dataset. The proposed model is
compared with several mainstream sequence modeling methods. Results show that
it outperforms existing models in terms of accuracy, precision, and recall.
This confirms the effectiveness and robustness of the proposed architecture in
complex pattern recognition tasks. Further ablation studies and sensitivity
analyses are carried out to investigate the effects of attention scale and
input sequence length on model performance. These results provide empirical
support for structural optimization of the model.
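The multi-scale attention described above can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes the BiLSTM hidden states `h` are already computed, uses a single learned scoring vector `w`, and picks the window sizes `(1, 3, 5)` purely for illustration. Each scale smooths the hidden states over a centered window, scores every position, and forms an attention-weighted context vector; the per-scale contexts are then averaged.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_scale_attention(h, w, scales=(1, 3, 5)):
    """Illustrative multi-scale attention (window sizes are assumptions).

    h: (T, D) hidden states, e.g. the outputs of a BiLSTM over T steps.
    w: (D,) scoring vector used to rate each (smoothed) position.
    Returns a single (D,) context vector fusing all scales.
    """
    T, _ = h.shape
    contexts = []
    for k in scales:
        pad = k // 2
        # Smooth each position over a centered window of size k
        # (edge-padded so every position keeps a full window).
        hp = np.pad(h, ((pad, pad), (0, 0)), mode="edge")
        hk = np.stack([hp[t:t + k].mean(axis=0) for t in range(T)])
        weights = softmax(hk @ w)      # (T,) attention weights per scale
        contexts.append(weights @ h)   # (D,) scale-specific context vector
    return np.mean(contexts, axis=0)   # fuse scales by averaging
```

Small windows emphasize sharply localized features, while larger windows let the scores reflect broader context before attention is applied, which is one simple way to realize the local/global sensitivity the abstract describes.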
Download PDF: 2504.15223v1