Transformers for natural language processing: Build, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4. Denis Rothman.
Material type: Text. Publication details: Birmingham: Packt; c2022. Edition: 2nd ed. Description: xxxiii, 565 p.: ill.; 24 cm. ISBN: 9781803247335
- QA76.9 .N38 .R68 2022
| Item type | Current library | Collection | Call number | Copy number | Status | Date due | Barcode |
|---|---|---|---|---|---|---|---|
| Books | Zetech Library - Mang'u General Stacks | Non-fiction | QA76.9 .N38 .R68 2022 | C1 | Available | | z011537 |
| Books | Zetech Library - TRC General Stacks | Non-fiction | QA76.9 .N38 .R68 2022 | C2 | Available | | Z011538 |
What are Transformers -- Getting Started with the Architecture of the Transformer Model -- Fine-Tuning BERT Models -- Pretraining a RoBERTa Model from Scratch -- Downstream NLP Tasks with Transformers -- Machine Translation with the Transformer -- The Rise of Suprahuman Transformers with GPT-3 Engines -- Applying Transformers to Legal and Financial Documents for AI Text Summarization -- Matching Tokenizers and Datasets -- Semantic Role Labeling with BERT-Based Transformers -- Let Your Data Do the Talking: Story, Questions, and Answers -- Detecting Customer Emotions to Make Predictions -- Analyzing Fake News with Transformers -- Interpreting Black Box Transformer Models -- From NLP to Task-Agnostic Transformer Models -- The Emergence of Transformer-Driven Copilots -- The Consolidation of Suprahuman Transformers with OpenAI’s ChatGPT and GPT-4.