I am a Research Assistant in the Department of Electrical and Electronic Engineering at The University of Hong Kong, where I conduct research in the Computational Imaging & Mixed Representation Laboratory (WeLight@HKU) led by Dr. Evan Y. Peng.
I received my bachelor's degree in Electronic Information Engineering from Xidian University and my master's degree in Electrical and Electronic Engineering from The University of Hong Kong, under the supervision of Dr. Evan Y. Peng.
My research interests include Holography, Computer Graphics, and XR.
") does not match the recommended repository name for your site ("
").
", so that your site can be accessed directly at "http://
".
However, if the current repository name is intended, you can ignore this message by removing "{% include widgets/debug_repo_name.html %}
" in index.html
.
",
which does not match the baseurl
("
") configured in _config.yml
.
baseurl
in _config.yml
to "
".
Xinxing Xia, Daqiang Ma, Xiangyu Meng, Feifan Qu, Huadong Zheng, Yingjie Yu, Yifan Peng
OPTICA Photonics Research 2025
Holographic near-eye augmented reality (AR) displays featuring tilted inbound/outbound angles on compact optical combiners hold significant potential yet often struggle to deliver satisfying image quality. This is primarily attributed to two reasons: the lack of a robust off-axis-supported phase hologram generation algorithm; and the suboptimal performance of ill-tuned hardware parts such as imperfect holographic optical elements (HOEs). To address these issues, we incorporate a gradient descent-based phase retrieval algorithm with spectrum remapping, allowing for precise hologram generation with wave propagation between nonparallel planes. Further, we apply a camera-calibrated propagation scheme to iteratively optimize holograms, mitigating imperfections arising from the defects in the HOE fabrication process and other hardware parts, thereby significantly lifting the holographic image quality. We build an off-axis holographic near-eye display prototype using off-the-shelf light engine parts and a customized full-color HOE, demonstrating state-of-the-art virtual reality and AR display results.
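To illustrate the general idea of gradient-descent phase retrieval for a phase-only hologram, here is a minimal PyTorch sketch. It uses a standard on-axis angular spectrum propagator; the off-axis spectrum remapping and camera-calibrated propagation model described in the paper are not reproduced, and the wavelength, pixel pitch, distance, and optimizer settings are illustrative assumptions rather than values from the paper.

```python
# Minimal sketch: gradient-descent phase retrieval for a phase-only hologram.
# Uses an ideal on-axis angular spectrum propagator; all parameters are illustrative.
import torch

def angular_spectrum_propagate(field, wavelength, pitch, distance):
    """Propagate a complex field by `distance` with the angular spectrum method."""
    H, W = field.shape[-2:]
    fy = torch.fft.fftfreq(H, d=pitch, device=field.device)
    fx = torch.fft.fftfreq(W, d=pitch, device=field.device)
    FY, FX = torch.meshgrid(fy, fx, indexing="ij")
    # Free-space transfer function; evanescent components are suppressed.
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * torch.pi * torch.sqrt(torch.clamp(arg, min=0.0))
    transfer = torch.exp(1j * kz * distance) * (arg > 0)
    return torch.fft.ifft2(torch.fft.fft2(field) * transfer)

def optimize_phase_hologram(target_amp, wavelength=520e-9, pitch=8e-6,
                            distance=0.1, iters=500, lr=0.05):
    """Optimize SLM phase so the propagated amplitude matches target_amp."""
    phase = torch.zeros_like(target_amp, requires_grad=True)
    opt = torch.optim.Adam([phase], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        slm_field = torch.exp(1j * phase)          # unit-amplitude phase-only field
        recon = angular_spectrum_propagate(slm_field, wavelength, pitch, distance)
        loss = torch.nn.functional.mse_loss(recon.abs(), target_amp)
        loss.backward()
        opt.step()
    return phase.detach()
```

The paper's method differs in that the ideal propagator is replaced by a spectrum-remapped propagation between nonparallel planes and further refined with a camera-calibrated model, but the gradient-descent optimization loop follows the same broad pattern.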
Wenbin Zhou*, Feifan Qu*, Xiangyu Meng, Zhenyang Li, Yifan Peng (* equal contribution)
OPTICA Optics Letters 2025
Computational holographic displays typically rely on time-consuming iterative computer-generated holographic (CGH) algorithms and bulky physical filters to attain high-quality reconstruction images. This trade-off between inference speed and image quality becomes more pronounced when aiming to realize 3D holographic imagery. This work presents 3D-HoloNet, a deep neural network-empowered CGH algorithm for generating phase-only holograms (POHs) of 3D scenes, represented as RGB-D images, in real time. The proposed scheme incorporates a learned, camera-calibrated wave propagation model and a phase regularization prior into its optimization. This unique combination allows for accommodating practical, unfiltered holographic display setups that may be corrupted by various hardware imperfections. Results tested on an unfiltered holographic display reveal that the proposed 3D-HoloNet can achieve 30 fps at full HD for one color channel using a consumer-level GPU while maintaining image quality comparable to iterative methods across multiple focused distances.
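The following is a toy sketch of the underlying idea, not the 3D-HoloNet architecture from the paper: a small CNN maps an RGB-D input (one color channel plus depth) to a phase-only hologram and is trained through a differentiable propagation model so the reconstruction is supervised at several focal planes. The network, loss, and depth-masking scheme are illustrative assumptions, and the sketch reuses the `angular_spectrum_propagate` helper from the example above.

```python
# Toy sketch: CNN mapping (amplitude, depth) -> phase-only hologram, trained
# through differentiable propagation. NOT the 3D-HoloNet architecture; layer
# sizes, loss, and depth masking are illustrative assumptions.
# Assumes angular_spectrum_propagate() from the previous sketch is in scope.
import torch
import torch.nn as nn

class ToyHoloNet(nn.Module):
    """Small CNN: (amplitude, depth) -> phase-only hologram for one color channel."""
    def __init__(self, channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, amp, depth):
        x = torch.stack([amp, depth], dim=1)       # (B, 2, H, W)
        return torch.pi * torch.tanh(self.net(x))  # phase constrained to (-pi, pi)

def training_step(model, opt, amp, depth, planes, wavelength=520e-9, pitch=8e-6):
    """One step: propagate the hologram to several focal planes and match the
    target amplitude on each plane via a simple depth mask (illustrative loss)."""
    opt.zero_grad()
    phase = model(amp, depth)                       # (B, 1, H, W)
    slm_field = torch.exp(1j * phase.squeeze(1))    # (B, H, W)
    loss = 0.0
    for i, z in enumerate(planes):
        recon = angular_spectrum_propagate(slm_field, wavelength, pitch, z).abs()
        mask = (depth * (len(planes) - 1)).round() == i  # pixels assigned to plane i
        loss = loss + ((recon - amp) ** 2 * mask).mean()
    loss.backward()
    opt.step()
    return loss.item()
```

Once trained, a network of this kind only needs a single forward pass per frame, which is what makes real-time inference feasible; the paper additionally folds a learned, camera-calibrated propagation model and a phase regularization prior into the training objective.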