ChatGPT is based on the transformer architecture, a type of neural network first introduced in the paper “Attention Is All You Need” by Vaswani et al. The transformer processes all positions of a sequence in parallel, which makes it well suited to sequential data such as text. ChatGPT is a direct descendant of GPT-3 and, in essence, is an implementation of this transformer architecture.
Transformers are a type of deep learning model commonly used in natural language processing (NLP). They are designed to process sequences of data, such as text, and to learn patterns in that data that can be applied to NLP tasks such as generating text or answering questions. When ChatGPT produces novel text in a conversation, it is doing exactly this: applying the patterns the network learned during training to the current conversational context.
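The core operation behind the transformer is scaled dot-product attention, in which every token computes a weighted combination of all other tokens at once. The following is a minimal sketch of that computation, assuming NumPy; it is an illustration of the mechanism, not the actual ChatGPT implementation:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention from "Attention Is All You Need".

    Q, K, V: (seq_len, d_k) arrays. Every position attends to every
    other position simultaneously, which is what allows transformers
    to process a whole sequence in parallel.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over each row
    return weights @ V, weights

# Toy example: 4 "tokens" with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(x, x, x)       # self-attention
print(out.shape)                                     # (4, 8)
```

Each row of the attention-weight matrix sums to 1, so every output token is a convex mixture of the value vectors; stacking many such attention layers (plus feed-forward layers) yields the full transformer.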