CV

Basics

Name Zefang Wang
Label Master's Student in Control Engineering
Email zefangwang@zju.edu.cn
Phone (+86) 18234041863
Url https://aden9460.github.io/Zefang-Wang/
Summary A Master's student at Zhejiang University focusing on model compression and efficient AI. Research interests include pruning techniques for visual autoregressive models and diffusion models, network quantization, and edge AI optimization.

Work

  • 2025.03 - 2025.11
    Visiting Student
    ENCODE Lab, Westlake University
    Advised by Prof. Huan Wang. Focused on iOS mobile model deployment and visual autoregressive model compression.
    • Conducted research on iOS model deployment
    • Developed compression methods for visual autoregressive models
  • 2023.08 - 2023.09
    Research Assistant
    Zhejiang University
Developed backend image recognition algorithms for industrial VR glasses for Tower Group.
    • Built industrial electrical cabinet component dataset for detecting 17 types of electrical components
    • Implemented label text recognition with PaddleOCR to verify that device labels on electrical cabinets were correct
    • Deployed the algorithms on a server using Linux and Docker workflows

Education

  • 2023.09 - 2026.03
    Master's in Control Engineering
    Zhejiang University
    Hangzhou, China
  • 2019.09 - 2023.06
    Bachelor's in Rail Transit Signal and Control
    North University of China
    Taiyuan, China

Publications

  • 2026
    OBS-Diff: Accurate Pruning For Diffusion Models in One-Shot
    ICLR (Under Review)
    Fourth author. Proposed an OBS-based training-free multi-granularity pruning method for diffusion models with timestep-aware Hessian construction and logarithmically decreasing weights to mitigate error accumulation.
  • 2026
    EVAR: Edge Visual Autoregressive Models via Principled Pruning
    CVPR (Under Review)
    First author. Proposed a principled OBS-based structured pruning method for visual autoregressive models with progressive scale-aware distillation to address gradient imbalance. Achieved 1.8× speedup with only 10% quality loss on edge devices for single-image generation.
  • 2024
    Network Binarization via Contrastive Deep Supervision
    NCAA Conference
First author. Proposed BNN-CDS, a binary network optimization method that fuses deep supervision with contrastive learning and pairs it with a tailored scaling-factor strategy.
  • 2024
    An Effective Information Theoretic Framework for Channel Pruning
    IEEE Transactions on Neural Networks and Learning Systems (TNNLS)
    Second author. Proposed an information-theoretic framework using entropy and rank fusion for layer-wise pruning rates, with Shapley value-based contribution evaluation as the intra-layer pruning criterion.

Skills

Deep Learning & Model Compression
Model Pruning
Network Quantization
Knowledge Distillation
Diffusion Models
Visual Autoregressive Models
Binary Neural Networks
Programming & Tools
Python
PyTorch
Linux
Docker
Git
PaddleOCR

Languages

Chinese
Native speaker
English
Professional working proficiency (CET-6)

Interests

Efficient AI
Model Compression
Edge AI
Mobile Deployment
Pruning Techniques
Network Quantization

Projects

  • 2024.01 - 2025.12
    EVAR: Edge Visual Autoregressive Model Compression
    Developed principled pruning methods for visual autoregressive models targeting edge device deployment. Introduced progressive scale-aware distillation to handle gradient imbalance during autoregressive fine-tuning.
    • 1.8× speedup on edge devices
    • Only 10% quality loss in single-image generation
    • OBS-based structured pruning
  • 2023.08 - 2023.09
    Industrial Electrical Cabinet Recognition System
    Developed an image recognition system for industrial VR glasses to detect and verify electrical cabinet components and labels.
    • Detection of 17 electrical component types
    • OCR-based label verification
    • Server deployment with Docker