
FAN = Frontier of Artificial Networks

Bio

I am a Research Assistant Professor (a non-tenure-track faculty position; highly competitive, with the ability to hire postdocs, RAs, and MPhil students) in the Department of Mathematics, The Chinese University of Hong Kong. Prior to this, I was a Postdoctoral Associate at Weill Cornell Medicine, Cornell University. I earned my PhD at Rensselaer Polytechnic Institute (RPI), a small but lovely university in the US, where I was blessed to be advised by Dr. Ge Wang and to make many wonderful friends. Prior to RPI, I completed my undergraduate studies at Harbin Institute of Technology, China.
I am interested in deep learning, applied mathematics, and image processing. My representative works are introducing neuronal diversity into deep learning and establishing the width-depth equivalence of deep networks. Please see my Google Scholar profile for a panoramic view. I am a frequent reviewer for AAAI, Artificial Intelligence Journal, IJCAI, IEEE TPAMI, IEEE TMI, IEEE TNNLS, and IEEE TII. Recently, my research has focused on developing advanced deep learning models and their interpretation. Beyond my research, I am a big fan of math and physics, and I run a blog on WeChat together with my friends. If you are interested in my research, please feel free to reach out to me (hitfanfenglei@gmail.com).

  • Phone: +852 84519576
  • City: Hong Kong
  • Degree: PhD
  • Email: hitfanfenglei@gmail.com

Education

  • 2017.9--2021.7: PhD, Rensselaer Polytechnic Institute, School of Engineering, advised by Prof. Ge Wang, US.
  • 2015.1--2015.6: Exchange student, Department of Applied Mathematics, National Chiao Tung University, Taiwan, China.
  • 2013.9--2017.6: Bachelor, Harbin Institute of Technology, School of Instrumentation Science and Technology, China.

Research Experience

  • 2022.12--present: Research Assistant Professor, Department of Mathematics, The Chinese University of Hong Kong, HK
  • 2021.9--2022.8: Postdoctoral Associate, Prof. Fei Wang’s Lab, Department of Computer Science, Cornell University, New York, NY, US
  • 2020.1--2020.8: Research Intern, Dr. Dimitry Krotov’s Group, MIT-IBM Watson AI Lab, Cambridge, MA, US
  • 2019.5--2019.8: Summer Intern, GE Global Research Center, Niskayuna, NY, US
  • 2016.9--2017.6: Research Associate, Prof. Jian Liu’s Lab, School of Precise Instrument, Harbin Institute of Technology, Harbin, Heilongjiang, China
  • 2016.6--2016.8: Visiting Student, Prof. Jean Michel Nunzi’s Lab, Department of Physics, Queen’s University, Kingston, Ontario, Canada
  • 2015.1--2015.6: Visiting Student, Prof. Chin-Wen Weng’s Group, Department of Applied Mathematics, National Chiao Tung University, Hsinchu, Taiwan, China

Honors and Awards

  • June 6, 2022: Recipient of the 2021 International Neural Network Society Doctoral Dissertation Award.
  • April 2019: Awarded an IBM AI Scholarship, which covered my tuition and living expenses until graduation.
  • 2016: Congxin Scholarship, Harbin Institute of Technology (awarded to only two undergraduates annually).
  • 2014: Fuji Xerox Scholarship, Harbin Institute of Technology.

Talks and Presentations

  • Invited talk at CCF-EDA, Beijing, Oct 14, 2023
  • Invited talk at BICMR, Peking University, May 23, 2023
  • Invited talk at HIT Institute of Advanced Study in Mathematics, April 25, 2023
  • Invited talk at Fuzhou University, Apr 1, 2023
  • Tutorial on “Introducing Neuronal Diversity into Deep Learning” at AAAI 2023 (AAAI is a top conference in the field of AI; only approximately 20 tutorials are accepted annually)
  • Invited talk at SUSTech, Jan 12, 2023
  • Invited talk at Northeastern University, Jan 8, 2023
  • Invited talk at Fudan University, Dec 4, 2022
  • Invited talk at HIT Institute of Advanced Study in Mathematics, November 24, 2022
  • Invited talk at National Biomedical Imaging Center, Peking University, October 27, 2022
  • Invited talk at IFMI & ISPEMI 2022, hosted by the Chinese Academy of Engineering, August 10, 2022
  • Invited talk at the Summer School of Xiamen University, July 15, 2022
  • Invited talk at School of Math, Harbin Institute of Technology, April 2022
  • Invited talk at SCF-YSSEC, State Key Laboratory of Scientific and Engineering Computing, China, November 2021 (http://scf.cc.ac.cn/yssec2021/)
  • Invited talk at FDA, May 2021
  • Invited job talk at Weill Cornell Medicine, Cornell University, January 2021
  • Invited job talk at Department of Mathematics, Duke University, December 2020
  • Poster presentation at Fully3D 2019, Philadelphia, PA, June 2019

Publications

  • Fan FL, Li M, Wang F, Lai R, and Wang G: Expressivity and Trainability of Quadratic Networks. IEEE Transactions on Neural Networks and Learning Systems, 2023 in press (IF=14.25).
  • Fan FL, Lai RJ, Wang G: Quasi-Equivalence of Width and Depth of Neural Networks. Journal of Machine Learning Research, 2023 in press (my PhD advisor Prof. Ge Wang listed this paper as one of his 16 representative papers among his 700+ publications).
  • Wu T, Wu W, Yang Y, Fan FL*, and Zeng T*: Retinex Image Enhancement Based on Sequential Decomposition With a Plug-and-Play Framework. IEEE Transactions on Neural Networks and Learning Systems, 2023 in press (IF=14.25).
  • Wang D, Fan FL, Wu Z, Liu R, Wang F, & Yu H: CTformer: Convolution-free Token2Token Dilated Vision Transformer for Low-dose CT Denoising. Physics in Medicine and Biology, 68(6), 2023.
  • Niu C, Li M, Fan FL, Wu W, Guo X, Lyu Q, & Wang, G. Noise Suppression with Similarity-based Self-Supervised Deep Learning. IEEE Transactions on Medical Imaging, 2022 (IF=10.04).
  • Liao JX, Dong HC, Sun ZQ, Sun J, Zhang S* and Fan FL*: Attention-embedded Quadratic Network (Qttention) for Effective and Interpretable Bearing Fault Diagnosis. IEEE Transactions on Instrumentation and Measurement, 2023 (rated a Class-A journal by the China Instrument and Control Society).
  • Zhang SQ, Wang F, and Fan FL*: Neural Network Gaussian Processes by Increasing Depth. IEEE Transactions on Neural Networks and Learning Systems, 2022 in press (IF=14.25).
  • Niu C, Cong W, Fan FL, Shan H, Li M, Liang J, & Wang, G: Low-dimensional Manifold Constrained Disentanglement Network for Metal Artifact Reduction. IEEE Transactions on Radiation and Plasma Medical Sciences, 2022
  • Fan FL, Wang D, Guo H, Zhu Q, Yan P, Wang G and Yu H: On a Sparse Shortcut Topology of Artificial Neural Networks. IEEE Transactions on Artificial Intelligence, 2021 (recognized as “an essential contribution of deep learning” by reviewers).
  • Fan FL, Xiong J, & Wang G: On Interpretability of Artificial Neural Networks: A Survey. IEEE Transactions on Radiation and Plasma Medical Sciences, 2021 (highly cited according to Google Scholar; the journal's most popular article for two years).
  • Fan FL, Li M, Teng Y, Wang G: Soft Autoencoder and Its Wavelet Adaptation Interpretation. IEEE Transactions on Computational Imaging, 6:1245-57, 2020.
  • Fan FL, Xiong J, & Wang G: Universal approximation with quadratic deep networks. Neural Networks, 124, 383-392, 2020 (China Academy of Science top journal).
  • Fan FL, Wang G: Fuzzy logic interpretation of quadratic networks. Neurocomputing, 2019.
  • Fan FL, Shan H, Kalra M K, Singh R, Qian G, Getzin M, Teng Y, Hahn J, and Wang G: Quadratic Autoencoder (Q-AE) for Low-dose CT Denoising. IEEE Transactions on Medical Imaging, 39(6):2035-50, 2019 (IF=10.04).
  • Cheng YJ, Fan FL, Weng C: An extending result on spectral radius of bipartite graphs. Taiwanese Journal of Mathematics, 22(2): 263-274, 2018 (alphabetical order; published when I was an undergraduate).
  • Fan FL, Cong W, and Wang G: A new type of neurons for machine learning. International Journal for Numerical Methods in Biomedical Engineering, 34(2), e2920, 2018 (the first paper on introducing neuronal diversity into deep learning; 41 citations according to Google Scholar).
  • Fan FL, Cong W, & Wang G: Generalized backpropagation algorithm for training second-order neural networks. International Journal for Numerical Methods in Biomedical Engineering, 34(5), e2956, 2018.
  • Fan FL, Weng C: A characterization of strongly regular graphs in terms of the largest signless Laplacian eigenvalues. Linear Algebra and its Applications, 506: 1-5, 2016 (published when I was an undergraduate).

Preprints

  • Liao JX, Hou BJ, Dong HC, Zhang H, Ma J, Sun J, Zhang S*, Fan FL*. (2022). Heterogeneous Autoencoder Empowered by Quadratic Neurons. arXiv preprint arXiv:2204.01707 (co-corresponding author, major revision in IEEE Transactions on Neural Networks and Learning Systems, IF=14.25).
  • Wang D*, Fan FL*, Hou BJ, Zhang H, Lai R, Yu H, Wang F: Manifoldron: Direct Space Partition via Manifold Discovery. arXiv preprint arXiv:2201.05279, 2022 (co-first author, major revision in IEEE Transactions on Neural Networks and Learning Systems, IF=14.25).
  • Fan FL, Li Y, Peng H, Zeng T, & Wang F. Towards NeuroAI: Introducing Neuronal Diversity into Artificial Neural Networks. arXiv preprint arXiv:2301.09245, 2023 (submitted to the Proceedings of the IEEE).
  • Chen A, Zhang J, Rahaman MM, Sun H, Zeng T, Grzegorzek M, Fan FL*, Li C*: ACTIVE: A Deep Model for Sperm and Impurity Detection in Microscopic Videos. arXiv preprint arXiv:2301.06002, 2023 (reject and resubmit in IEEE Transactions on Artificial Intelligence).
  • Fan FL, Dong HC, Wu Z, Ruan L, Zeng T, Cui Y, and Liao JX: One Neuron Saved Is One Neuron Earned: On Parametric Efficiency of Quadratic Networks. arXiv preprint arXiv:2303.06316, 2023 (major revision in IEEE TPAMI).
  • Zhang H, An X, He Q, Yao Y, Fan FL*, & Teng Y* (2023). Quadratic Graph Attention Network (Q-GAT) for Robust Construction of Gene Regulatory Networks. arXiv preprint arXiv:2303.14193. (co-corresponding author, submitted to Pattern Recognition).
  • Cui Y, Ruan L, Dong HC, Li Q, Wu Z, Zeng T, & Fan FL: Cloud-RAIN: Point Cloud Analysis with Reflectional Invariance. arXiv preprint arXiv:2305.07814 (submitted to IEEE Transactions on Neural Networks and Learning Systems, IF=14.25).
  • Fan FL, Li ZY, Xiong H, & Zeng T: Rethink Depth Separation with Intra-layer Links. arXiv preprint arXiv:2305.07037, 2023 (alphabetical order, submitted to NeurIPS).
  • Fan FL, Huang W, Zhong X, Ruan L, Zeng T, Xiong H, & Wang F: Deep ReLU Networks Have Surprisingly Simple Polytopes. arXiv preprint arXiv:2305.09145, 2023 (submitted to NeurIPS).

Teaching

  • MATH6251: Topics in Mathematical Data Science