Visiting PhD candidate presents research on machine learning

By PETER NOVELLO | February 20, 2020

COURTESY OF CAS GUSTAFFSON

Hui Guan of North Carolina State University presented her work on reuse-centric optimization.

Machine learning now underpins virtual personal assistants, online video streaming services and social media platforms, giving users access to personalized technologies.

Hui Guan, a doctoral candidate in Electrical and Computer Engineering at North Carolina State University, presented her research on the intersection of machine learning and programming systems in a talk on Feb. 13.

The talk, titled “Reuse-Centric Programming System Support of Machine Learning,” was hosted by the Department of Computer Science. It examined how programming systems can be used to make machine learning models and their training more efficient and reliable.

Guan discussed the methods she has developed to identify and exploit opportunities to reuse machine learning models, processes and programming-system components.

Guan started her talk by describing recent breakthroughs in machine learning, including Microsoft achieving human-level performance in 2019 on the General Language Understanding Evaluation (GLUE) benchmark, a collection of resources used to evaluate natural language understanding systems.

Guan then presented the audience with a question: What are the prospects of 2020 witnessing a breakthrough in machine learning systems?

She surveyed the diverse applications of machine learning, describing its tangible use in self-driving cars and medical care as well as in ubiquitous cloud services. Guan’s research concentrates on making the computing systems behind these applications more efficient by studying programming systems and machine learning algorithms together.

Guan introduced the concept of reuse-centric optimization, which involves the reuse of algorithms, implementations and infrastructures supporting machine learning processes. The significance of these different components was illustrated by the findings of her work on convolutional neural networks (CNNs).

Guan gave the example of a CNN as a model that can distinguish an image of a dog from one of a cat. Her research at the intersection of programming systems and machine learning involves pruning CNNs to make them more efficient, work that culminated in the Wootz compiler.

Guan explained that pruning starts from a pre-trained CNN that is fine-tuned to a data set: parameters are removed and the network is retrained, and the best resulting “candidate” is chosen for reuse. Starting from a pre-trained CNN significantly reduces the time spent repeating this process.
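In broad strokes, the prune-and-retrain loop she described can be sketched in a few lines of PyTorch. The snippet below is only an illustration of the general idea, not Guan’s Wootz compiler or her actual pipeline; the model choice, the pruning ratios and the layer selection are assumptions made for the example.

```python
# A minimal sketch of deriving pruning candidates from one pre-trained CNN,
# assuming PyTorch and torchvision are available. Illustrative only; not
# the Wootz compiler or the exact procedure described in the talk.
import torch.nn as nn
import torch.nn.utils.prune as prune
from torchvision import models

# Every candidate starts from the same pre-trained network, which is what
# makes reuse valuable: the expensive initial training is done only once.
candidate_ratios = [0.3, 0.5, 0.7]   # hypothetical pruning ratios to try
candidates = []

for ratio in candidate_ratios:
    model = models.resnet18(pretrained=True)
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            # Remove the fraction `ratio` of filters with the smallest L2 norm.
            prune.ln_structured(module, name="weight", amount=ratio, n=2, dim=0)
    candidates.append((ratio, model))

# Each pruned candidate would then be briefly fine-tuned on the target data
# set (training loop omitted) and the best accuracy/size trade-off kept.
```

The more of the pre-trained network that can be shared across candidates, the less retraining each candidate needs, which is where the time savings come from.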

Guan noted that some challenges she has encountered in her work include the automatic generation of code to support reuse, concurrent training of parallel tuning blocks and the composability of systems to allow for greater scalability.

In an interview with The News-Letter following her talk, Guan noted that another challenge of her research is that it requires expertise in both machine learning and programming systems. However, she finds this challenge rewarding.

“It is very nice to be in this interdisciplinary area to collaborate with people from both sides,” she said.

Despite the technical challenges of improving the efficiency of machine learning processes, Guan’s research has substantially reduced the time-consuming effort of pruning CNNs.

Guan stated that the pruned CNN variants showed a 90 percent initial increase in accuracy, with pre-trained layers reaching that accuracy more quickly.

Overall, Guan reported that her research demonstrated a speedup of 186 times in CNN pruning, a result she credits with saving significant manual effort.

“[It is necessary to] look at deep learning through the lens of compiler or programming systems... and to identify reuse opportunities and translate them to performance gain via programming system support,” she said.

Another aspect of Guan’s research involves the memory reliability of CNNs. Guan emphasized that CNNs are used in safety-critical systems, such as those found in self-driving cars, and that memory faults threaten the reliability of their predictions.

Guan noted that possible causes of memory faults include the surrounding environment, temperature and manufacturing defects. 

Guan also noted that existing error-correcting codes incur a large space cost and are oblivious to the structure of CNNs.

She proposed zero-space memory protection of CNNs as a solution to these memory issues, noting that it offers the same protection guarantee without the extra space cost.

In this scheme, Guan explained, weight distribution-oriented training regularizes the weights so that no large values occupy the first seven bits of every 64 bits of data, leaving those bits available for protection information. Guan shared that the CNNs trained with this approach converged without any accuracy loss, suggesting the efficacy of the memory protection.
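One way to picture the training constraint she described is as an extra regularization term that discourages large weight magnitudes, so that the high-order bits of each stored value stay free. The sketch below is a loose illustration of that idea, not the weight distribution-oriented training procedure from Guan’s work; the threshold and penalty coefficient are made up for the example.

```python
# A rough sketch of a magnitude-limiting regularizer, assuming PyTorch.
# Illustrative only; the threshold and coefficient are hypothetical and this
# is not the weight distribution-oriented training described in the talk.
import torch
import torch.nn as nn

def magnitude_penalty(model: nn.Module, threshold: float = 1.0) -> torch.Tensor:
    """Sum of how far each weight's magnitude exceeds `threshold`."""
    return sum(torch.relu(p.abs() - threshold).sum() for p in model.parameters())

# During training, the penalty would be added to the usual task loss, e.g.:
#   loss = task_loss + 1e-3 * magnitude_penalty(model)
# so the network converges with all weights inside the allowed range, leaving
# the unused high-order bits available for error-correction information.
```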

Following the presentation of her research projects, Guan discussed her motivation for pursuing this research and her future interests, which include efficient CNN pruning, building reliable and robust machine learning algorithms, and uncovering the hidden reuses of machine learning programs.

“This is mainly because we care about performance... For machine learning problems, deep learning is the most time-consuming, and training one network is time-consuming, [so we must] target the most pressing issues that we can solve to bring a larger impact,” she said in an interview with The News-Letter.  

Guan also expressed her excitement about current machine learning applications at Hopkins, such as those in bioinformatics, natural language understanding and vision-based robotics. She hopes that her research can be used to optimize these machine learning systems and reduce their time demands.

“I would like to see how my research in reuse-optimization can actually help all those applications,” she said.

