Friday Hacks #215: Reinforcement Learning and Linear Complexity Transformers

Posted by Simon Julian Lauw

Date/Time: Friday, January 28, 2022, 19:00 +0800
Venue: Executive Classroom (COM2-04-02) & Online on Zoom (Hybrid)

YouTube Recording:

1) Getting Started with Reinforcement Learning - A Project-Based Approach

You may have heard of the successes of AI over humans in games such as Go, StarCraft and Dota 2. The core technology behind these successes is reinforcement learning (RL). Beyond games, RL is also used in industry for recommendation, search and dynamic pricing. RL can seem intimidating to beginners because of the vastness of the field.
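To make the idea concrete, here is a minimal sketch of tabular Q-learning, one of the simplest RL algorithms and a common starting point for beginners. The toy "chain" environment and the hyperparameters below are illustrative assumptions, not material from the talk:

```python
import random

def train_q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning on a toy chain: walk right from state 0 to the goal."""
    Q = [[0.0, 0.0] for _ in range(n_states)]  # Q[state][action]; 0 = left, 1 = right
    for _ in range(episodes):
        s = 0
        for _ in range(200):  # step cap so early (mostly random) episodes terminate
            # epsilon-greedy action selection
            a = random.randrange(2) if random.random() < eps else int(Q[s][1] > Q[s][0])
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0  # reward only at the goal state
            # Q-learning update: bootstrap from the best value of the next state
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
            if s == n_states - 1:
                break
    return Q
```

After training, the greedy policy (pick the action with the larger Q-value in each state) walks straight to the goal, which is the essence of learning purely from trial-and-error reward signals.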

In this talk, Jet will share his personal journey in RL projects, and share some resources and advice for getting started with learning RL.

Speaker Profile

Jet New is a year 3 Computer Science undergraduate at NUS, focusing on artificial intelligence. He researches reinforcement learning at the NUS Collaborative, Learning and Adaptive Robotics Lab, and leads as president of the NUS Statistics and Data Science Society, organizing technical workshops and the annual NUS Data Science Competition. He previously interned at Grab and IMDA as a machine learning engineer.

2) Linear Complexity Transformers

Transformer-based language models, such as BERT and GPT, have made significant advances in the field of natural language processing, outperforming previous deep learning models. However, they suffer from quadratic complexity in the input sequence length. This talk combines the techniques of linear complexity attention and sequence parallelism to (hopefully) enable handling longer input sequences than was previously possible with the original Transformer architecture.
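To illustrate where the quadratic cost comes from and one well-known way around it (kernelised linear attention, in the style of "Transformers are RNNs"; not necessarily the exact method covered in the talk), here is a NumPy sketch. Standard attention materialises an n×n score matrix; the linear variant reassociates (φ(Q)φ(K)ᵀ)V as φ(Q)(φ(K)ᵀV), so the n×n matrix is never formed. The feature map below is an assumption for illustration:

```python
import numpy as np

def feature_map(x):
    # Positive kernel feature map, elu(x) + 1 (an illustrative choice).
    return np.where(x > 0, x + 1.0, np.exp(x))

def softmax_attention(Q, K, V):
    # Standard attention: the scores matrix is n x n, hence O(n^2) time and memory.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V):
    # Linearised attention: compute phi(K)^T V once (d x d_v), then apply it
    # to each query, giving O(n) time in the sequence length.
    Qf, Kf = feature_map(Q), feature_map(K)
    KV = Kf.T @ V                    # (d, d_v): summary of all keys and values
    Z = Qf @ Kf.sum(axis=0)          # (n,): per-query normalisation
    return (Qf @ KV) / Z[:, None]
```

Because the implicit weights φ(q_i)·φ(k_j)/Z_i are non-negative and sum to 1 over j, each output row is still a convex combination of value rows, just like softmax attention, only cheaper to compute for long sequences.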

Speaker Profile

Chaitanya is an NUS Hackers coreteam member and a CS undergraduate who enjoys learning new things. He has yet to discover which part of technology truly interests him, but his technical interests lie in algorithms and distributed systems. He wishes to work on impactful technology that positively affects people's lives.
