Publications

Academic Research & Contributions

Research Papers

Peer-reviewed publications and preprints

Accepted · 2025

Z-Pruner: Post-Training Pruning of Large Language Models for Efficiency without Retraining

Md. Samiul Basir Bhuiyan, Md. Sazzad Hossain Adib, Mohammed Aman Bhuiyan, Muhammad Rafsan Kabir, Moshiur Farazi, Shafin Rahman, Nabeel Mohammed
IEEE AICCSA 2025

This paper introduces Z-Pruner, a novel post-training pruning technique for Large Language Models that achieves significant efficiency improvements without requiring retraining. Our method addresses the computational challenges of deploying LLMs by strategically removing redundant parameters while maintaining model performance. Extensive experiments demonstrate that Z-Pruner can reduce model size and inference time substantially while preserving the quality of generated outputs.
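To give a sense of what post-training pruning involves, here is a minimal sketch of generic magnitude-based pruning: the smallest-magnitude weights are zeroed after training, with no retraining step. This is an illustration of the general setting only, not the Z-Pruner criterion; the function name and the use of NumPy arrays are assumptions for the example.

```python
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude entries until `sparsity` fraction
    of the weights are removed (generic post-training pruning; Z-Pruner
    uses its own importance criterion)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)                 # number of weights to drop
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold            # keep only larger weights
    return weights * mask

# Example: prune half the weights of a random layer.
rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64))
W_pruned = prune_by_magnitude(W, sparsity=0.5)
```

Because the pruned matrix is mostly zeros, it can be stored in a sparse format or skipped during matrix multiplication, which is where the inference-time savings come from.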

Research Impact

Contribution to the academic community

Publications: 1
Accepted/Published: 1
Co-authors: 7

Research Focus Areas

Key domains of academic contribution

Efficient AI Systems

Developing techniques for model compression and optimization, including pruning and quantization methods that maintain performance while reducing computational requirements.

Post-training pruning of Large Language Models
Model efficiency without retraining
Deployment optimization for edge devices
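Quantization, mentioned above alongside pruning, trades numeric precision for memory and compute: weights are mapped to low-bit integers and rescaled at inference time. A minimal sketch of symmetric int8 quantization follows; the function names are illustrative, not from any specific publication.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Map float weights to int8 with a single symmetric scale factor."""
    scale = float(np.abs(weights).max()) / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximation of the original float weights."""
    return q.astype(np.float32) * scale

# Example: quantize a random layer and measure reconstruction error.
rng = np.random.default_rng(1)
W = rng.normal(size=(32, 32)).astype(np.float32)
q, scale = quantize_int8(W)
W_hat = dequantize(q, scale)
max_err = float(np.max(np.abs(W - W_hat)))  # bounded by scale / 2
```

Storing int8 instead of float32 cuts weight memory by 4x, which is why quantization is a standard tool for edge deployment.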

Computer Vision

Advancing temporal action localization and video understanding through transformer-based architectures and novel training methodologies.

Real-time action classification in videos
Temporal segmentation with transformers
State-of-the-art performance on benchmark datasets

Future Research Directions

I am actively exploring new research opportunities in efficient AI deployment, multimodal learning, and practical applications of large language models. I welcome collaborations and discussions on these topics.