
Vithursan Thangarasa

Originally from Toronto, Canada, and currently based in the San Francisco Bay Area, I am deeply passionate about neural network compression, large-scale foundation models, and enhancing the efficiency of training large neural networks, with a keen interest in generative AI.


On March 20, 2018, I gave a talk to the Machine Learning Research Group (MLRG) at the University of Guelph on continual learning (also called lifelong learning), which has been a hot research topic in recent years. In the talk, I reviewed a super interesting NIPS 2017 paper from Facebook AI Research (FAIR) called Gradient Episodic Memory for Continual Learning [1].
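For readers unfamiliar with the paper, here is a minimal NumPy sketch of GEM's core idea: the gradient computed on an episodic memory of past-task examples acts as a constraint on the current update, so the update never increases loss on remembered data. This sketch handles a single constraint in closed form (the full paper solves a small quadratic program with one constraint per past task); the function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def project_gradient(g, g_mem):
    """Project the proposed update g so it does not conflict with the
    gradient on the episodic memory, g_mem.

    If <g, g_mem> < 0, applying g would increase the loss on the
    remembered examples, so g is projected onto the half-space where
    the memory loss is non-increasing. This is the simplified
    single-constraint case of GEM's quadratic program.
    """
    dot = np.dot(g, g_mem)
    if dot < 0:  # update conflicts with past-task gradient
        g = g - (dot / np.dot(g_mem, g_mem)) * g_mem
    return g

# Example: the raw update conflicts with the memory gradient.
g = np.array([1.0, -1.0])       # gradient on the current task's batch
g_mem = np.array([0.0, 1.0])    # gradient on the episodic memory
g_tilde = project_gradient(g, g_mem)
# After projection, g_tilde no longer points against g_mem,
# so stepping along it does not undo progress on past tasks.
```

In the paper, this constraint is enforced for every past task simultaneously, which is what makes the method a quadratic program rather than a one-line projection.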


You can download my Google Slides as a PDF.




More details coming soon…


References
[1] Lopez-Paz, D. and Ranzato, M. Gradient episodic memory for continual learning. In Advances in Neural Information Processing Systems (NIPS) 30, pp. 6467–6476. 2017.