I will be on junior faculty research leave in 2024-2025 and am actively looking for visiting research positions. Please reach out via email (bmcdanel@fandm.edu)!

About Me

I am broadly interested in deep learning, hardware architecture, and computer networks. I have worked on developing efficient deep neural network (DNN) algorithms and on designing hardware architectures for these DNNs. I am also interested in optimizing DNN inference (prediction) for edge devices in both standalone and distributed network contexts.

Recently, I have also started exploring the use of Large Language Models for program decompilation.


News


Recent Publications

ChatGPT as a Java Decompiler

B. McDanel and Z. Liu
3rd Workshop on Generation, Evaluation & Metrics (GEM) at EMNLP 2023
paper
code

Accelerating Vision Transformer Training via a Patch Sampling Schedule

B. McDanel and C. P. Huynh
preprint
code

Accelerating DNN Training with Structured Data Gradient Pruning

B. McDanel, H. Dinh, J. Magallanes
International Conference on Pattern Recognition (ICPR), 2022.
preprint

FAST: DNN Training Under Variable Precision Block Floating Point with Stochastic Rounding

S. Zhang, B. McDanel, H. T. Kung
28th IEEE International Symposium on High-Performance Computer Architecture (HPCA-28), 2022.
preprint
