Two CS undergraduates, Juyoung Yun and Sol Choi, were the lead authors on a paper that won the Best Paper Award at the 2025 International Conference on Neural Information Processing (ICONIP), a long-running and well-regarded conference in the field of AI. The other authors were their supervisors: Profs. Francois Rameau, Byungkon Kang, and Zhoulai Fu.
The paper is “Revisiting 16-bit Neural Network Training: A Practical Approach for Resource-Limited Learning”. Modern AI models are usually trained in 32-bit “full precision,” which is accurate but expensive in terms of memory and computation. Their work shows that one can safely train neural networks entirely in 16-bit precision, without the usual 32-bit “safety net,” even on modest hardware. The paper gives a practical recipe, validated by theory and extensive experiments, for choosing the right settings so that training remains stable and the final model quality matches (and sometimes even surpasses) that of 32-bit training. This is particularly important in settings where computing resources are limited, such as smaller labs, edge devices, or student projects that cannot rely on large GPU clusters. By cutting the precision in half while preserving performance, their approach makes state-of-the-art neural network training more accessible and more energy-efficient. This efficiency and practicality were key reasons the paper received the award.
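To give a flavor of what training “entirely in 16-bit” means in practice, here is a minimal PyTorch sketch. It is not the authors’ exact recipe: the toy model, hyperparameters, and the enlarged Adam epsilon below are illustrative assumptions. The epsilon adjustment reflects one well-known float16 pitfall: Adam’s default eps of 1e-8 is below float16’s smallest representable subnormal (about 6e-8) and rounds to zero, which can destabilize updates.

```python
import torch
import torch.nn as nn

# Toy example: a small MLP trained entirely in float16, with no
# float32 master weights and no loss scaling ("safety net").
# float16 kernels are best supported on CUDA; CPU support varies.
torch.manual_seed(0)
device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(
    nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10)
).to(device=device, dtype=torch.float16)  # weights stored in fp16

# Adam's default eps (1e-8) underflows to zero in fp16; a larger eps
# keeps the denominator representable. The exact value here is an
# illustrative choice, not the paper's prescribed setting.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, eps=1e-4)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    # Dummy batch, cast to fp16 so every tensor in the pipeline is 16-bit.
    x = torch.randn(128, 32, device=device, dtype=torch.float16)
    y = torch.randint(0, 10, (128,), device=device)

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)  # forward pass runs in fp16
    loss.backward()              # gradients (and optimizer state) are fp16
    optimizer.step()
```

Compared with the common “mixed-precision” setup, which keeps a float32 copy of the weights and rescales the loss to avoid underflow, everything above (weights, activations, gradients, and optimizer state) lives in 16 bits, which is what halves the memory footprint.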