In this paper, we focus on inferring the graph Laplacian matrix from spatiotemporal signals, which are defined as "time-vertex signals". To realize this, we first represent the signals on a joint graph, which is the Cartesian product graph of the time- and vertex-graphs. By assuming the signals follow a Gaussian prior distribution on the joint graph, a meaningful representation that promotes the smoothness property of the joint graph signal is derived. Furthermore, by decoupling the joint graph, the graph learning framework is formulated as a joint optimization problem that combines signal denoising with time- and vertex-graph learning. Specifically, two algorithms are proposed to solve the optimization problem: in the first, the discrete second-order difference operator with reversed sign (DSODO) in the time domain is used as the time-graph Laplacian operator to recover the signal and infer a vertex-graph, while the second estimates the time-graph as well as the vertex-graph. Experiments on both synthetic and real-world datasets demonstrate that the proposed algorithms can effectively infer meaningful time- and vertex-graphs from noisy and incomplete data.
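As a minimal sketch of the joint-graph construction mentioned above (not the authors' implementation; matrix sizes, variable names, and the smoothness check are our own assumptions), the snippet below builds the DSODO time-graph Laplacian, i.e. a path-graph Laplacian, forms the Cartesian product (Kronecker-sum) Laplacian with a vertex-graph Laplacian, and verifies the decoupled smoothness identity that underlies separating the time- and vertex-graph terms.

```python
import numpy as np

def dsodo_laplacian(T):
    """DSODO (path-graph) Laplacian of size T x T: second-order
    difference operator with reversed sign, boundary rows adjusted."""
    L = 2.0 * np.eye(T) - np.eye(T, k=1) - np.eye(T, k=-1)
    L[0, 0] = 1.0
    L[-1, -1] = 1.0
    return L

def cartesian_product_laplacian(L_time, L_vertex):
    """Joint Laplacian of the Cartesian product graph:
    L_J = L_T (kron) I_N + I_T (kron) L_V."""
    T, N = L_time.shape[0], L_vertex.shape[0]
    return np.kron(L_time, np.eye(N)) + np.kron(np.eye(T), L_vertex)

# Example time-vertex signal X with N vertices and T time samples.
rng = np.random.default_rng(0)
N, T = 5, 8
W = rng.random((N, N)); W = np.triu(W, 1); W = W + W.T   # random vertex weights
L_V = np.diag(W.sum(axis=1)) - W                          # vertex-graph Laplacian
L_T = dsodo_laplacian(T)                                  # time-graph Laplacian
L_J = cartesian_product_laplacian(L_T, L_V)

X = rng.standard_normal((N, T))
x = X.flatten(order="F")                                  # vec(X), columns stacked
joint_smoothness = x @ L_J @ x
# Decoupled form used when the joint graph is separated into its factors:
decoupled = np.trace(X @ L_T @ X.T) + np.trace(X.T @ L_V @ X)
assert np.isclose(joint_smoothness, decoupled)
```

The assertion reflects the standard identity vec(X)^T (L_T ⊗ I_N + I_T ⊗ L_V) vec(X) = tr(X L_T X^T) + tr(X^T L_V X), which is what allows the joint smoothness prior to be decoupled into separate time-graph and vertex-graph learning terms.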