Association for Computing Machinery at East Carolina University




Kurt Schmidt from NVIDIA

on Jan. 30, 2018, 3:30 p.m.

On Tuesday, Jan. 30 at 3:30 p.m. in Sci & Tech 144A, Kurt Schmidt from NVIDIA will discuss machine learning, deep neural networks, and GPU computing.

Data scientists in both industry and academia have been using GPUs for AI and machine learning to make groundbreaking improvements across a variety of applications, including image classification, video analytics, speech recognition, and natural language processing. In particular, Deep Learning, the use of sophisticated, multi-level "deep" neural networks to build systems that can learn features from massive amounts of unlabeled training data, is an area that has seen significant investment and research.
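As a loose illustration of what "multi-level" means in code (this is not material from the talk; the layer sizes and activation function are arbitrary choices), here is a minimal forward pass through a small deep network in Python with NumPy. Each layer applies a learned linear transform followed by a nonlinearity, so successive layers can build progressively higher-level features from raw input:

```python
import numpy as np

def relu(x):
    # Elementwise nonlinearity; without it, stacked layers
    # would collapse into a single linear transform.
    return np.maximum(0.0, x)

def forward(x, layers):
    """Run input x through a stack of (weights, bias) layers.

    Each layer re-represents its input; deeper layers can
    express higher-level features of the original data.
    """
    for W, b in layers:
        x = relu(x @ W + b)
    return x

# Hypothetical shapes: a 784-dim input (e.g. a 28x28 image),
# two hidden layers, and a 10-dim output.
rng = np.random.default_rng(0)
sizes = [784, 256, 128, 10]
layers = [(rng.normal(0.0, 0.1, (m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

x = rng.normal(size=(1, 784))      # one input example
print(forward(x, layers).shape)    # (1, 10)
```

In a real system the weights would be learned from training data rather than drawn at random; the point here is only the layered structure.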

Although AI has been around for decades, two relatively recent trends have sparked widespread use of Deep Learning within AI: the availability of massive amounts of training data, and the powerful, efficient parallel computation that GPUs provide. Early adopters of GPU accelerators for machine learning include many of the largest web and social media companies, along with top-tier research institutions in data science and machine learning. With thousands of computational cores and 10-100x the application throughput of CPUs alone, GPUs have become data scientists' processor of choice for big data.
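The throughput advantage comes from data parallelism: the GPU applies the same operation across many data elements at once. As a rough sketch (assuming a CUDA-capable GPU and the CuPy library, neither of which the announcement mentions), the same NumPy-style matrix multiply can be dispatched to the GPU with a couple of extra lines:

```python
import numpy as np
import cupy as cp   # assumption: CuPy installed, CUDA-capable GPU present

n = 4096
a_cpu = np.random.rand(n, n).astype(np.float32)

# Copy the matrix into GPU memory; the multiply then runs across
# thousands of GPU cores instead of a handful of CPU cores.
a_gpu = cp.asarray(a_cpu)
c_gpu = a_gpu @ a_gpu
cp.cuda.Stream.null.synchronize()   # wait for the GPU to finish

c_cpu = cp.asnumpy(c_gpu)           # copy the result back to host memory
print(c_cpu.shape)                  # (4096, 4096)
```

Dense linear algebra like this dominates neural network training, which is why the GPU speedups quoted above translate so directly into faster Deep Learning.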



