Progressively Stacking Differentiable Architecture Search (PS-DARTs) for Recurrent Neural Networks (RNNs)
Du, Yubo
0000-0002-9153-7318
2021-11-19
Abstract
Accurate Multivariate Time Series (MTS) prediction supports a multitude of decision-making tasks that affect our daily lives. However, current auto-regression-based approaches and deep learning models cannot handle MTS with multiple repetitive patterns. Inspired by the success of Differentiable Neural Architecture Search (DARTs), we propose a progressively stacking differentiable architecture search (PS-DARTs) to generate predictive models for MTS. Compared with DARTs, PS-DARTs extends the search space for the “optimal” architecture to include all activation functions, the input components for each activation function, and decisions on whether to share weights between two gates. This admits many more architectures that can capture the dynamics of MTS more accurately. To avoid a large increase in search time over the extended architecture space, PS-DARTs conducts a sequential RNN node search. Compared with auto-regression-based approaches, deep learning models, and DARTs, PS-DARTs achieves competitive or better results on various datasets with different repetitive patterns, in a search time similar to that of DARTs.
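The search over activation functions described above follows the standard DARTS continuous relaxation: instead of committing to one activation per RNN node, every candidate is evaluated and their outputs are mixed with softmax-normalized architecture weights that are learned by gradient descent. The following is a minimal sketch of that relaxation for a single node; the candidate set and class name here are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Assumed candidate activations for one RNN node; PS-DARTs searches
# over all activation functions in its extended space.
CANDIDATES = [torch.tanh, torch.sigmoid, torch.relu, lambda x: x]


class MixedActivation(nn.Module):
    """DARTS-style continuous relaxation of the activation choice:
    a softmax over architecture parameters alpha weights the output
    of every candidate activation."""

    def __init__(self, n_candidates: int = len(CANDIDATES)):
        super().__init__()
        # Learnable architecture parameters, one logit per candidate.
        self.alpha = nn.Parameter(1e-3 * torch.randn(n_candidates))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)
        # Weighted sum of all candidate outputs (same shape as x).
        return sum(w * f(x) for w, f in zip(weights, CANDIDATES))


node = MixedActivation()
h = node(torch.randn(4, 8))  # mixture output for a batch of 4, width 8
```

After search converges, the discrete architecture is recovered by keeping the candidate with the largest weight at each node; stacking one relaxed node at a time, as in the sequential node search, keeps the number of simultaneously mixed candidates small.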