Prerequisites of In-Context Learning for Transformers on Queries
DOI: https://doi.org/10.13053/cys-29-3-5884

Keywords: Data compressors, sentiment neurons, in-context learning, zero-shot learning, few-shot learning

Abstract
Pre-trained generative transformers (GPT), with their 200+ billion parameters, have already demonstrated the ability to solve a wide range of text-related problems without additional task-specific training. However, it has been observed that solution quality improves significantly for queries that reflect the task formulation and conditions. This indicates that the transformer is further trained on the query context, and the aim of this study is to show why GPT transformers are able to do so. To this end, the article jointly considers elements of the transformer architecture (data compressors and sentiment neurons), elements of the user interface with transformers (zero-shot and few-shot prompts), and text-processing procedures (arithmetic coding and minimum description length). The authors attempt to provide a theoretical justification for the convergence of the sequential fine-tuning process using Hoeffding's inequality. The study presents experimental results demonstrating the in-context learning capabilities of GPT transformers, confirming their potential for the further development of natural language processing technologies.
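The convergence argument mentioned above rests on Hoeffding's inequality. The paper's specific random variables and bounds are not given on this page, but in its standard form the inequality reads:

```latex
% Hoeffding's inequality (standard form).
% For independent random variables X_1, ..., X_n with X_i \in [a_i, b_i]
% and partial sum S_n = X_1 + \cdots + X_n:
P\bigl(\lvert S_n - \mathbb{E}[S_n] \rvert \ge t\bigr)
  \le 2\exp\!\left( -\frac{2t^2}{\sum_{i=1}^{n} (b_i - a_i)^2} \right)
```

Informally, the probability that the sum of bounded independent observations deviates from its expectation by more than t decays exponentially in n, which is the kind of guarantee needed to argue that a sequential fine-tuning process concentrates around its target.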
Published: 2025-09-25

Section: Articles

License
I hereby transfer exclusively to the journal "Computación y Sistemas", published by the Computing Research Center (CIC-IPN), the copyright of the aforementioned paper. I also accept that these rights will not be transferred to any other publication, in any other format, language, or other existing medium. I certify that the paper has not been previously disclosed or simultaneously submitted to any other publication, and that it does not contain material whose publication would violate the copyright or other proprietary rights of any person, company, or institution. I certify that I have permission from the institution or company where I work or study to publish this work. The representative author accepts responsibility for the publication of this paper on behalf of each and every one of the authors.
This transfer is subject to the following conditions:
- The authors retain all ownership rights (such as patent rights) of this work, except for the publishing rights transferred to the CIC through this document.
- The authors retain the right to publish the work, in whole or in part, in any book of which they are the authors or publishers. They may also make use of this work in conferences, courses, personal web pages, and so on.
- The authors may include the work as part of their thesis, for non-profit distribution only.