Feifan Qu
Research Assistant @ HKU

I am a Research Assistant in the Department of Electrical and Electronic Engineering at The University of Hong Kong. Currently, I am conducting research in the Computational Imaging & Mixed Representation Laboratory (WeLight@HKU) led by Dr. Evan Y. Peng.

I completed my bachelor's degree in Electronic Information Engineering at Xidian University and subsequently completed a master's degree in Electrical and Electronic Engineering at The University of Hong Kong under the supervision of Dr. Evan Y. Peng.

My current research interests center on Holography, Computer Graphics, and XR.


Education
  • The University of Hong Kong
    MSc in Electrical and Electronic Engineering
    Sep 2022 - Dec 2023
  • Xidian University & École Polytechnique de l'Université de Nantes
    Joint Bachelor's Program in Electronic Information Engineering
    Sep 2018 - Jul 2022
Experience
  • The University of Hong Kong
    Research Assistant
    Feb 2024 - Present
    Student Researcher
    Sep 2022 - Aug 2023
  • Xidian University
    Undergraduate Research Assistant
    Sep 2021 - Jun 2022
    Undergraduate Researcher
    Sep 2019 - Sep 2021
Extracurricular Activities
  • The University of Hong Kong
    Lab Activity Coordinator
    Feb 2024 - Present
  • Xidian University
    Secretary of the Communist Youth League Branch
    Sep 2018 - Jun 2022
    Director of Outreach Department in Student Union
    Sep 2019 - Sep 2020
  • Hebei Zhengding High School
    Class Leader
    Sep 2015 - Jun 2018
    Academic Representative
    Sep 2015 - Jun 2018
    President of the Club Union
    Sep 2016 - Jun 2017
    President of Student Union
    Sep 2016 - Jun 2017
Honors & Awards
  • SID Display Week 2025 Student Travel Grant
    2025
  • Distinguished Dissertation Scholarship
    2023
  • The Funding Scheme for Student Projects/Activities
    2023
  • Academic Performance Scholarship
    2021
  • The National Challenge Cup Competition of University Students (Tier 2)
    2021
  • The Internet Plus Program on Innovation & Entrepreneurship of University Students (Tier 1)
    2021
  • China-France Inspirational Schooling Scholarship
    2020
  • Outstanding Secretary of the Communist Youth League Branch
    2019
News
2024
Thrilled to attend the China VR 2024 conference!
Nov 17
Delighted to assist Dr. Evan Y. Peng in hosting the mini-workshop on Frontiers of Geometry Computing & Visual Media 2024.
Nov 13
Honored to participate in the AIIP 2024 conference and present a poster!
May 02
Excited that my paper "Towards Real-time 3D Computer-Generated Holography with Inverse Neural Network for Near-eye Displays" has been accepted by the SID Symposium Digest of Technical Papers.
May 01
Attended the China 3DV 2024 conference.
Apr 24
Joined the WeLight Lab at HKU as a Research Assistant under the supervision of Dr. Evan Y. Peng.
Feb 19
2023
Graduated with a Master of Science in Electrical and Electronic Engineering from The University of Hong Kong.
Dec 05
Assisted Dr. Evan Y. Peng in organizing the workshop on Frontiers of Image Science and Visual Computing 2024.
Sep 05
🎉 Awarded the 2022-2023 Funding Scheme for Student Projects/Activities by the Tam Wing Fan Innovation Fund and Philomathia Foundation Innovation Fund, receiving a total of HKD 40,000 from HKU InnoWing to support the research project "Real-Time 3D Neural Holography for Near-eye Display".
Apr 02
Selected Publications
Off-axis holographic augmented reality displays with HOE-empowered and camera-calibrated propagation

Xinxing Xia, Daqiang Ma, Xiangyu Meng, Feifan Qu, Huadong Zheng, Yingjie Yu, Yifan Peng

OPTICA Photonics Research 2025

Holographic near-eye augmented reality (AR) displays featuring tilted inbound/outbound angles on compact optical combiners hold significant potential yet often struggle to deliver satisfying image quality. This is primarily attributed to two reasons: the lack of a robust off-axis-supported phase hologram generation algorithm; and the suboptimal performance of ill-tuned hardware parts such as imperfect holographic optical elements (HOEs). To address these issues, we incorporate a gradient descent-based phase retrieval algorithm with spectrum remapping, allowing for precise hologram generation with wave propagation between nonparallel planes. Further, we apply a camera-calibrated propagation scheme to iteratively optimize holograms, mitigating imperfections arising from the defects in the HOE fabrication process and other hardware parts, thereby significantly lifting the holographic image quality. We build an off-axis holographic near-eye display prototype using off-the-shelf light engine parts and a customized full-color HOE, demonstrating state-of-the-art virtual reality and AR display results.
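
For illustration, the sketch below shows the general family of methods this paper builds on: phase-only hologram optimization by gradient descent through a differentiable wave-propagation model, written in PyTorch. It assumes ideal angular spectrum propagation between parallel planes; the paper's spectrum remapping for nonparallel planes and its camera-calibrated propagation scheme are not reproduced here, and all parameter values are placeholders.

```python
# Minimal sketch (not the paper's implementation): gradient-descent phase
# retrieval through an ideal, parallel-plane angular spectrum propagator.
import math
import torch

def angular_spectrum(field, pitch, wavelength, z):
    """Propagate a complex field over distance z with the angular spectrum method."""
    n, m = field.shape[-2:]
    fy, fx = torch.meshgrid(
        torch.fft.fftfreq(n, pitch), torch.fft.fftfreq(m, pitch), indexing="ij")
    # Evanescent components are not modeled in this simplified version.
    arg = torch.clamp(1 - (wavelength * fx) ** 2 - (wavelength * fy) ** 2, min=0.0)
    kz = 2 * math.pi / wavelength * torch.sqrt(arg)
    return torch.fft.ifft2(torch.fft.fft2(field) * torch.exp(1j * kz * z))

def optimize_phase_hologram(target_amp, pitch=8e-6, wavelength=520e-9,
                            z=0.10, steps=500, lr=0.05):
    """Gradient-descent phase retrieval toward a single target amplitude."""
    phase = torch.zeros_like(target_amp, requires_grad=True)
    opt = torch.optim.Adam([phase], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        recon = angular_spectrum(torch.exp(1j * phase), pitch, wavelength, z)
        loss = torch.nn.functional.mse_loss(recon.abs(), target_amp)
        loss.backward()
        opt.step()
    return phase.detach()

# Example: a 256x256 random target amplitude; real use would load an image.
target = torch.rand(256, 256)
hologram_phase = optimize_phase_hologram(target, steps=200)
```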

3D-HoloNet: fast, unfiltered, 3D hologram generation with camera-calibrated network learning

Wenbin Zhou*, Feifan Qu*, Xiangyu Meng, Zhenyang Li, Yifan Peng (* equal contribution)

OPTICA Optics Letters 2025

Computational holographic displays typically rely on time-consuming iterative computer-generated holographic (CGH) algorithms and bulky physical filters to attain high-quality reconstruction images. This trade-off between inference speed and image quality becomes more pronounced when aiming to realize 3D holographic imagery. This work presents 3D-HoloNet, a deep neural network-empowered CGH algorithm for generating phase-only holograms (POHs) of 3D scenes, represented as RGB-D images, in real time. The proposed scheme incorporates a learned, camera-calibrated wave propagation model and a phase regularization prior into its optimization. This unique combination allows for accommodating practical, unfiltered holographic display setups that may be corrupted by various hardware imperfections. Results tested on an unfiltered holographic display reveal that the proposed 3D-HoloNet can achieve 30 fps at full HD for one color channel using a consumer-level GPU while maintaining image quality comparable to iterative methods across multiple focused distances.
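
For illustration only, here is a minimal, hypothetical sketch of the recipe the abstract describes: a small CNN maps an amplitude-plus-depth input to a phase-only hologram and is trained end to end through a differentiable propagation model evaluated at a few focal planes. It reuses the angular_spectrum() helper from the previous sketch in place of the paper's learned, camera-calibrated propagation; the actual 3D-HoloNet architecture, its phase regularization prior, and its training details are not reproduced, and every name and parameter below is a placeholder.

```python
# Minimal sketch (assumptions, not the paper's code): CNN from (amplitude,
# depth) to a phase-only hologram, trained on a small focal stack.
import math
import torch
import torch.nn as nn

class TinyHoloNet(nn.Module):
    """Toy stand-in for the paper's network: (amplitude, depth) in, phase out."""
    def __init__(self, ch=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(ch, 1, 3, padding=1))
    def forward(self, x):                         # x: (B, 2, H, W)
        return math.pi * torch.tanh(self.net(x))  # phase in (-pi, pi)

def train_step(net, opt, amp, depth, planes, pitch=8e-6, wavelength=520e-9):
    """One step: supervise reconstructions on a small focal stack.
    Uses angular_spectrum() from the previous sketch as an ideal stand-in
    for the paper's learned, camera-calibrated propagation model."""
    phase = net(torch.stack([amp, depth], dim=1)).squeeze(1)  # (B, H, W)
    field = torch.exp(1j * phase)                             # phase-only SLM field
    loss = 0.0
    for i, z in enumerate(planes):
        recon = angular_spectrum(field, pitch, wavelength, z).abs()
        # Each pixel is supervised only at the focal plane its depth selects.
        mask = ((depth * (len(planes) - 1)).round() == i).float()
        loss = loss + ((recon - amp) ** 2 * mask).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example usage with random data; real training would use an RGB-D dataset.
net = TinyHoloNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
amp, depth = torch.rand(1, 256, 256), torch.rand(1, 256, 256)
train_step(net, opt, amp, depth, planes=[0.08, 0.10, 0.12])
```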
