Research | Brad McDanel


Computer Science Education

Investigating how language models are reshaping learning and assessment in computer science education.


LLMs for Programming Languages

Exploring the intersection of LLMs and Programming Languages.


Speculative Decoding

Reducing the autoregressive bottleneck in LLM inference by using a cheap drafting model to propose several tokens that the target model then verifies in parallel.
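As a rough illustration of the draft-then-verify idea, here is a toy greedy sketch with made-up stand-in "models" (both functions are hypothetical, not any real system):

```python
# Toy sketch of speculative decoding (greedy variant, hypothetical models).
# A cheap "draft" model proposes k tokens; the "target" model checks them
# and keeps the longest matching prefix plus one corrected token.

def draft_model(prefix):
    # Hypothetical cheap model: guesses the next token cyclically.
    return (prefix[-1] + 1) % 10 if prefix else 0

def target_model(prefix):
    # Hypothetical accurate model: same rule, but caps tokens at 7.
    nxt = (prefix[-1] + 1) % 10 if prefix else 0
    return min(nxt, 7)

def speculative_step(prefix, k=4):
    # 1) Draft k tokens autoregressively with the cheap model.
    drafted = []
    cur = list(prefix)
    for _ in range(k):
        t = draft_model(cur)
        drafted.append(t)
        cur.append(t)
    # 2) Verify: the target scores each position (in a real system this
    #    loop is one batched forward pass, which is where the speedup is).
    accepted = list(prefix)
    for t in drafted:
        expected = target_model(accepted)
        if t == expected:
            accepted.append(t)         # draft token accepted
        else:
            accepted.append(expected)  # first mismatch: keep target's token
            break
    return accepted

print(speculative_step([5], k=4))  # → [5, 6, 7, 7]
```

When the draft model agrees with the target, several tokens are committed per target-model pass instead of one.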


Adaptive Inference

Varying the amount of computation based on sample difficulty.
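One common realization of this idea is early exiting; the sketch below uses hypothetical confidence-returning stages, not any particular model:

```python
# Toy early-exit sketch of adaptive inference (all names hypothetical).
# A model with several stages stops as soon as an intermediate prediction
# is confident enough, so easy samples use less computation.

def run_with_early_exit(stages, x, threshold=0.9):
    """Run stages in order; each returns (prediction, confidence).
    Stop at the first stage whose confidence clears the threshold."""
    for depth, stage in enumerate(stages, start=1):
        pred, conf = stage(x)
        if conf >= threshold:
            return pred, depth  # exited early: later stages never run
    return pred, depth          # fell through: used the full model

# Hypothetical stages: confidence grows with depth and input "easiness".
stages = [
    lambda x: ("cat", 0.5 + 0.5 * x),  # shallow, cheap
    lambda x: ("cat", 0.8 + 0.2 * x),  # deeper
    lambda x: ("cat", 0.99),           # full model
]

print(run_with_early_exit(stages, 0.9))  # easy input exits at depth 1
print(run_with_early_exit(stages, 0.1))  # hard input runs all 3 stages
```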


Quantization of DNNs

Reducing the precision of weights and data to achieve better energy efficiency.
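A minimal sketch of one such scheme, symmetric uniform quantization to int8 (the scheme and values here are illustrative assumptions, not a specific published method):

```python
# Minimal sketch of symmetric uniform quantization to int8 (assumed scheme).
# Storing float weights as 8-bit integers with one shared scale cuts memory
# traffic and the energy cost of each multiply-accumulate.

def quantize(weights, num_bits=8):
    qmax = 2 ** (num_bits - 1) - 1           # e.g. 127 for int8
    scale = max(abs(w) for w in weights) / qmax
    q = [round(w / scale) for w in weights]  # integer codes
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

w = [0.02, -1.27, 0.63, 1.0]
q, scale = quantize(w)
w_hat = dequantize(q, scale)
# Reconstruction error is bounded by half the quantization step (scale / 2).
max_err = max(abs(a - b) for a, b in zip(w, w_hat))
```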


Sparse Computation in DNNs

Enforcing structured sparsity in DNN weights so that the reduced computation maps efficiently onto hardware.
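As one concrete flavor of structured sparsity, here is a toy sketch of N:M pruning (2:4 is the pattern some GPU formats use; the weights below are made up):

```python
# Toy sketch of N:M structured sparsity (here 2:4): in every group of M
# consecutive weights, keep only the N largest by magnitude and zero the
# rest. The fixed pattern lets hardware skip zeros with almost no
# per-weight bookkeeping.

def prune_n_m(weights, n=2, m=4):
    pruned = []
    for i in range(0, len(weights), m):
        group = weights[i:i + m]
        # Indices of the n largest-magnitude entries in this group.
        keep = sorted(range(len(group)), key=lambda j: -abs(group[j]))[:n]
        pruned.extend(w if j in keep else 0.0 for j, w in enumerate(group))
    return pruned

w = [0.1, -0.9, 0.05, 0.4, 0.3, -0.2, 0.8, 0.0]
print(prune_n_m(w))  # → [0.0, -0.9, 0.0, 0.4, 0.3, 0.0, 0.8, 0.0]
```

Every aligned group of four now holds exactly two nonzeros, so a hardware lane can fetch a dense pair plus a small index instead of scanning for zeros.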


Hardware Designs for DNN Inference

Mainly using systolic arrays, with a focus on integrating sparsity, quantization, and adaptive inference into the design.
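To give a feel for the dataflow, here is a toy cycle-level simulation of an output-stationary systolic array computing a matrix product (the skewing schedule is one common choice, not a description of any specific design):

```python
# Toy cycle-level sketch of an output-stationary systolic array computing
# C = A @ B. Each PE (i, j) holds one accumulator; rows of A flow
# left-to-right and columns of B top-to-bottom with a one-cycle skew, so
# each operand pair meets at the right PE on the right cycle.

def systolic_matmul(A, B):
    n, k = len(A), len(A[0])
    m = len(B[0])
    C = [[0] * m for _ in range(n)]
    total_cycles = k + n + m - 2
    for t in range(total_cycles):
        for i in range(n):
            for j in range(m):
                # Skewing: PE (i, j) sees A[i][p] and B[p][j] at cycle
                # t = p + i + j (operands arrive staggered by row/column).
                p = t - i - j
                if 0 <= p < k:
                    C[i][j] += A[i][p] * B[p][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(systolic_matmul(A, B))  # → [[19, 22], [43, 50]]
```

The appeal for hardware is that each PE only talks to its neighbors, so the wiring stays local and the array clocks fast; sparsity and quantization then shrink what flows through each PE.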