On KDnuggets, Denis Shipilov, Solutions Architect at DataArt, explains ChatGPT and related terms such as ML, AI, AGI, neural networks, deep learning, and large language models, and highlights the differences between GPT-3 and ChatGPT, among other topics.
“Everyone seems to have gone crazy about ChatGPT, which has become a cultural phenomenon. If you’re not on the ChatGPT train yet, this article might help you better understand the context and excitement around this innovation.”
“OpenAI experimented with transformers to build a neural probabilistic language model. The results of their experiments are called GPT (generative pre-trained transformer) models. Pre-trained means they were training the transformer NN on a large body of texts mined on the Internet and then taking its decoder part for language representation and text generation.”
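As a concrete illustration of what "pre-trained" buys you, the sketch below loads an openly available GPT-family model and generates a continuation from a prompt. The Hugging Face transformers library and the gpt2 checkpoint are our own choices for demonstration; the article does not prescribe any particular tooling.

    # Minimal sketch: text generation with a pre-trained GPT model.
    # Assumes the Hugging Face "transformers" library (with PyTorch
    # installed) and the open "gpt2" checkpoint; both are our choices.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # The decoder predicts a probability distribution over the next
    # token; generation is repeated sampling from that distribution.
    inputs = tokenizer("ChatGPT is", return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=30,
        do_sample=True,            # sample instead of greedy decoding
        temperature=0.8,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
    )
    print(tokenizer.decode(output[0], skip_special_tokens=True))

Because no task-specific training is involved, the same pre-trained weights can continue any prompt; fine-tuning (as done for ChatGPT) only adjusts how that generic ability is steered.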
“Given that LLMs are just sophisticated statistical machines, the generation process could go in an unexpected and unpleasant direction. This type of result is sometimes called an AI hallucination, but from the algorithmic perspective, it is still valid, though unexpected by human users.”
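The "statistical machine" point can be seen in a toy next-token sampler: any token with nonzero probability is a legitimate outcome of the algorithm, even when the resulting sentence is false. The vocabulary and probabilities below are invented purely for illustration and do not come from any real model.

    # Toy illustration: generation is sampling from a learned
    # distribution, so a fluent-but-false continuation is
    # algorithmically "valid". All probabilities here are made up.
    import random

    next_token_probs = {
        "1969": 0.55,    # factually correct continuation
        "1971": 0.30,    # plausible but wrong (a "hallucination")
        "banana": 0.15,  # unlikely, yet still a legal sample
    }

    prompt = "The first Moon landing happened in"
    tokens = list(next_token_probs)
    weights = list(next_token_probs.values())
    choice = random.choices(tokens, weights=weights, k=1)[0]
    print(prompt, choice)  # ~45% of runs produce a wrong "fact"

Nothing in the sampling step distinguishes true continuations from false ones; only the relative probabilities, learned from training data, make one outcome more likely than another.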
The original article can be found here.

