5 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - Split Knockoffs: towards Controlling Directional False Discovery Rate under Transformations
Multiple comparisons in hypothesis testing are often subject to structural constraints in applications. In structural Magnetic Resonance Imaging for Alzheimer's Disease, one studies not only the atrophy of brain regions but also comparisons between anatomically adjacent regions.
4 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - Unified Gas-kinetic methods for multi-scale flow
Multi-scale methods are in constant demand in scientific research and industrial applications. This talk reviews a family of unified gas-kinetic methods for modeling multi-scale flows.
4 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - A Moving Mesh Finite Element Method for Topology Optimization
Many partial differential equations may have solutions with nearly singular behaviors, such as shock waves and boundary layers.
4 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - Highest weight crystals for Schur Q-functions
In the 1990s, Kashiwara and Lusztig defined crystals as an abstraction of the crystal bases of quantum group representations.
4 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - Integration of single-cell atlases with generative adversarial networks
As single-cell technologies have evolved over the years, diverse single-cell atlas datasets have rapidly accumulated. Integrative analyses that harmonize such datasets provide opportunities for gaining deep biological insights.
4 May 2022
Seminar, Lecture, Talk
Physics Department - Condensed Matter Seminar: What is “Qiu Ku” and How to Measure Quantum Entanglement with It
4 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - Provable Tensor-Train Format Tensor Completion by Riemannian Optimization
The tensor train (TT) format enjoys appealing advantages in handling structured high-order tensors. The recent decade has witnessed wide applications of TT-format tensors across diverse disciplines, among which tensor completion has drawn considerable attention.
4 May 2022
Seminar, Lecture, Talk
Department of Mathematics - PhD Student Seminar - Feature Flow Regularization: Improving Structured Sparsity in Deep Neural Networks
Pruning is a model compression method that removes redundant parameters and accelerates the inference of deep neural networks while maintaining accuracy. Most available pruning methods impose conditions directly on parameters or features.