
Hands on NLP with Transformers

This repository contains code for the O'Reilly Live Online Training "Hands on NLP with Transformers".

This training will provide an introduction to the transformer architecture, which is currently considered state of the art for modern NLP tasks. We will take a deep dive into what makes the transformer unique in its ability to process natural language, including its attention mechanism and encoder-decoder architectures. We will see several examples of how people and companies are using transformers to solve a wide variety of NLP tasks, including holding conversations, image captioning, reading comprehension, and more.

This training will feature several code-driven examples of transformer-derived architectures, including BERT, GPT, T5, and the Vision Transformer. Each of our case studies will be inspired by real use cases and will lean on transfer learning to expedite our process while using actionable metrics to drive results.
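
As a taste of the transfer-learning workflow the notebooks build on, here is a minimal sketch using a pretrained BERT checkpoint with the Hugging Face transformers library. This is an illustrative example, not code copied from the notebooks; the model name, label count, and sample sentence are assumptions.

```python
# Minimal transfer-learning sketch (illustrative; not taken from the notebooks).
# Requires: pip install transformers torch
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Reuse a pretrained BERT encoder and attach a fresh 2-class classification head.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# Tokenize a sample sentence and run a forward pass; fine-tuning would then
# update these weights on task-specific labeled data.
inputs = tokenizer("Transformers make transfer learning fast.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)  # torch.Size([1, 2])
```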

Notebooks

  1. BERT - the beginnings of LLMs
  2. XLNet - moving auto-encoding models further
  3. T5 - the beginnings of instructional alignment
  4. GPT - how LLMs learned to talk
  5. Multimodal LLMs

Instructor

Sinan Ozdemir is founder and CTO of LoopGenius, where he uses state-of-the-art AI to help people create and run their businesses. He has lectured in data science at Johns Hopkins University and authored multiple books, videos, and numerous online courses on data science, machine learning, and generative AI. He also founded the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. Sinan most recently published Quick Start Guide to Large Language Models and launched a podcast audio series, AI Unveiled. Ozdemir holds a master's degree in pure mathematics from Johns Hopkins University.