000 | 01503nam a22001697a 4500 | ||
---|---|---|---|
008 | 240203b |||||||| |||| 00| 0 eng d | ||
020 | _a9781803247335 | ||
040 | _cZET-ke | ||
050 | _aQA76.9.N38 _bR68 2022 | ||
100 | _aRothman, Denis. | ||
245 | _aTransformers for natural language processing : _bbuild, train, and fine-tune deep neural network architectures for NLP with Python, Hugging Face, and OpenAI's GPT-3, ChatGPT, and GPT-4 / _cDenis Rothman. | ||
250 | _a2nd ed. | ||
260 | _aBirmingham : _bPackt, _cc2022. | ||
300 | _axxxiii, 565 p. : _bill. ; _c24 cm. | ||
505 | _aWhat are Transformers -- Getting Started with the Architecture of the Transformer Model -- Fine-Tuning BERT Models -- Pretraining a RoBERTa Model from Scratch -- Downstream NLP Tasks with Transformers -- Machine Translation with the Transformer -- The Rise of Suprahuman Transformers with GPT-3 Engines -- Applying Transformers to Legal and Financial Documents for AI Text Summarization -- Matching Tokenizers and Datasets -- Semantic Role Labeling with BERT-Based Transformers -- Let Your Data Do the Talking: Story, Questions, and Answers -- Detecting Customer Emotions to Make Predictions -- Analyzing Fake News with Transformers -- Interpreting Black Box Transformer Models -- From NLP to Task-Agnostic Transformer Models -- The Emergence of Transformer-Driven Copilots -- The Consolidation of Suprahuman Transformers with OpenAI’s ChatGPT and GPT-4. | ||
942 | _2lcc _cBK _hQA76.9.N38 _kQA76.9.N38 _mR68 2022 | ||
999 | _c5919 _d5919 | ||