Book Content
22 chapters • 20h 4m total length
1. What are Transformers?
2. Getting Started with the Architecture of the Transformer Model
3. Fine-Tuning BERT Models
4. Pretraining a RoBERTa Model from Scratch
5. Downstream NLP Tasks with Transformers
6. Machine Translation with the Transformer
7. The Rise of Suprahuman Transformers with GPT-3 Engines
8. Applying Transformers to Legal and Financial Documents for AI Text Summarization
9. Matching Tokenizers and Datasets
10. Semantic Role Labeling with BERT-Based Transformers
11. Let Your Data Do the Talking: Story, Questions, and Answers
12. Detecting Customer Emotions to Make Predictions
13. Analyzing Fake News with Transformers
14. Interpreting Black Box Transformer Models
15. From NLP to Task-Agnostic Transformer Models
16. The Emergence of Transformer-Driven Copilots
17. The Consolidation of Suprahuman Transformers with OpenAI’s ChatGPT and GPT-4
18. Appendix I — Terminology of Transformer Models
19. Appendix II — Hardware Constraints for Transformer Models
20. Appendix III — Generic Text Completion with GPT-2
21. Appendix IV — Custom Text Completion with GPT-2
22. Appendix V — Answers to the Questions