Kejun Tang (唐科军)
About me
I am currently a faculty member at Great Bay University (GBU). I received my PhD from the School of Information Science and Technology (SIST) at ShanghaiTech University under the supervision of Prof. Qifeng Liao. After that, I did postdoctoral research at Peng Cheng Laboratory. Before joining GBU, I worked at Shenzhen University of Advanced Technology and the PKU-Changsha Institute for Computing and Digital Economy.
Collaborators
Education
ShanghaiTech University & University of Chinese Academy of Sciences, Sep 2015 - Jan 2021
School of Information Science and Technology
Louisiana State University, Feb 2019 - Aug 2019
Department of Mathematics & Center for Computation and Technology, visiting student, under the supervision of Professor Xiaoliang Wan
Yantai University, Sep 2011 - Jul 2015
Department of Mathematics and Information Science
Research
My research interests include tensor methods, machine learning, and scientific computing. In particular, I am working on:
Low-rank tensor methods and applications
Density estimation and deep generative models
Deep learning methods and differential equations
Publications and preprints
Y. Wang, K. Tang, X. Wang, X. Wan, W. Ren, and C. Yang. Estimating committor functions via deep adaptive sampling on rare transition paths, preprint, 2025.
C. Xiao, K. Tang, and Z. Zhu. Provable low-rank tensor-train approximations in the inverse of large-scale structured matrices, preprint, 2025.
Z. Zhu, C. Xiao, K. Tang, J. Huang, and C. Yang. APTT: An accuracy-preserved tensor-train method for the Boltzmann-BGK equation, preprint, 2024.
X. Wang, K. Tang, J. Zhai, X. Wan, and C. Yang. Deep adaptive sampling for surrogate modeling without labeled data, Journal of Scientific Computing, 101 (3): 77, 2024.
K. Tang, J. Zhai, X. Wan, and C. Yang. Adversarial Adaptive Sampling: Unify PINN and Optimal Transport for the Approximation of PDEs, International Conference on Learning Representations (ICLR), 2024.
P. Yin, G. Xiao, K. Tang, and C. Yang. AONN: An adjoint-oriented neural network method for all-at-once solutions of parametric optimal control problems, SIAM Journal on Scientific Computing, 46 (1): C127-C153, 2024.
Y. Feng, K. Tang, X. Wan, and Q. Liao. Dimension-reduced KRnet maps for high-dimensional Bayesian inverse problems, preprint, 2023.
K. Tang, X. Wan, and C. Yang. DAS-PINNs: A deep adaptive sampling method for solving high-dimensional partial differential equations, Journal of Computational Physics, 476: 111868, 2023.
X. Wan and K. Tang. Augmented KRnet for density estimation and approximation, arXiv preprint, 2021.
K. Tang, X. Wan, and Q. Liao. Adaptive deep density approximation for Fokker-Planck equations, Journal of Computational Physics, 457: 111080, 2022.
Y. Feng, K. Tang, L. He, P. Zhou, and Q. Liao. Tensor train random projection, Computer Modeling in Engineering and Sciences, 134 (2): 1195-1218, 2022.
K. Tang, X. Wan, and Q. Liao. Deep density estimation via invertible block-triangular mapping, Theoretical & Applied Mechanics Letters, 10 (3): 143-148, 2020.
K. Tang and Q. Liao. Rank adaptive tensor recovery based model reduction for partial differential equations with high-dimensional random inputs, Journal of Computational Physics, 409: 109326, 2020.
K. Li, K. Tang, T. Wu, and Q. Liao. D3M: A deep domain decomposition method for partial differential equations, IEEE Access, 8, 2019.
K. Li, K. Tang, T. Wu, J. Li, and Q. Liao. A hierarchical neural hybrid method for failure probability estimation, IEEE Access, 7: 112087-112096, 2019.
Find out more: my CV