Auditing ChatGPT – part I
Capgemini
APRIL 3, 2024
It achieved spectacular performance on sequential-data tasks, such as text or time-series data. By using a specific technique called the 'attention mechanism', published in 2015, the Transformer model pushed past the limits of previous models, particularly the length of the texts that could be processed and generated.
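To make the idea concrete, here is a minimal sketch (not the article's own code) of scaled dot-product attention, the core computation behind the mechanism described above: each position in a sequence scores its similarity to every other position and takes a weighted average of their values. The matrix shapes and the NumPy implementation are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to all keys and returns a weighted sum of values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of every query to every key
    # softmax over keys (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # weighted combination of value vectors

# Tiny example: 3 tokens, embedding dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per token
```

Because every token can attend to every other token directly, the model is no longer constrained by the step-by-step memory of recurrent architectures, which is what allowed much longer texts to be handled.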