Artificial intelligence has taken the world by storm this year, with generative AI being hailed as a revolution for its ability to carry out complex conversations and generate original content. Now, Amazon.com Inc. is detailing its own approach.
What Happened: During Amazon's June 2023 quarter earnings call, CEO Andy Jassy elaborated on the company's AI efforts at a time when technology companies are seeing a sea change in how their workflows evolve.
To be clear, Big Tech didn't just start working on AI this year after ChatGPT burst into prominence. Jassy underlined this explicitly, as has Apple Inc.
In Amazon's case, however, the initial work on AI is directed at businesses rather than consumers, which is why consumer-facing features aren't plentiful yet.
"[We started] working several years ago on our custom AI chips for training called Trainium and inference called Inferentia that are on their second versions already," said Jassy.
Further elaborating on this, Jassy explained that there are three layers of generative AI, but most people end up focusing only on the one layer that is customer-centric.
"Generative AI has captured people's imagination, but most people are talking about the application layer, specifically what OpenAI has done with ChatGPT," Jassy explained.
Three Layers Of Generative AI
Lowest Layer: This is the compute layer developers build on when creating generative AI applications. Think of it as the computational power needed to train models and process the data sent to and from users.
Middle Layer: The middle layer is the myriad of large language models and other foundation models offered as a service; GPT-4, Stable Diffusion, and DALL-E are some examples. These models are served to developers atop the lowest layer, and some services even offer the option to choose among several of them.
Top Layer: ChatGPT is an example of the top layer, which includes applications that run on top of these large language models. These applications have attracted most of the public's attention.
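The three-layer stack described above can be sketched in a few lines of code. This is a minimal illustration with entirely hypothetical names, not any real AWS or OpenAI API: each layer wraps the one below it, which is why an end user of a top-layer app like ChatGPT never sees the compute or model layers directly.

```python
def compute_layer(prompt: str) -> str:
    """Lowest layer: the raw compute that runs training and inference.
    In Amazon's case, chips like Trainium and Inferentia sit here."""
    return f"tokens-for:{prompt}"


def model_layer(prompt: str, model: str = "example-llm") -> str:
    """Middle layer: a foundation model offered as a service.
    Callers can pick a model; the service dispatches to the compute layer."""
    tokens = compute_layer(prompt)
    return f"[{model}] response based on {tokens}"


def application_layer(user_message: str) -> str:
    """Top layer: the consumer-facing app (e.g., a chatbot) that wraps
    the model service in a product experience."""
    return model_layer(user_message)


print(application_layer("What is generative AI?"))
```

The point of the sketch is the dependency direction: applications depend on model services, and model services depend on compute, so a company can compete at one layer without owning the others.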