Free Download Newline – Responsive LLM Applications with Server-Sent Events
Released 10/2024
MP4 | Video: h264, 1920×1080 | Audio: AAC, 44.1 KHz, 2 Ch
Genre: eLearning | Language: English | Duration: 20 Lessons (1h 18m) | Size: 372 MB
Dive into Retrieval Augmented Generation and Autonomous Agents with LangChain, Chroma and FastAPI
Large Language Models are reshaping industries, yet integrating them into real-time streaming UIs presents unique challenges. In this course, we will learn how to seamlessly integrate LLM APIs into applications and build AI-powered streaming text and chat UIs with TypeScript, React, and Python. Step by step, we will build a full-stack AI application with high-quality code and a flexible implementation.
The LLM application built in this course includes:
A completion use case (English to emojis)
Chat
A Retrieval Augmented Generation use case
AI agent use cases (code execution, data-analyst agent)
This app can serve as a starting point for most projects, saving a significant amount of time, and its flexibility allows new tools to be added as needed.
By the end of this course, you will have mastered the end-to-end implementation of a flexible, high-quality LLM application and gained the knowledge and skills to create sophisticated LLM solutions of your own.
What you will learn
How to design systems for AI applications
How to stream the answer of a Large Language Model (see the sketch after this list)
Differences between Server-Sent Events and WebSockets
Why real-time streaming matters for GenAI UIs
How asynchronous programming in Python works
How to integrate LangChain with FastAPI
What problems Retrieval Augmented Generation can solve
How to create an AI agent
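To give a flavor of the streaming topic at the heart of the course, here is a minimal sketch of a FastAPI endpoint that streams tokens to the browser over Server-Sent Events. It is an illustrative assumption, not the course's actual code: the `fake_llm_stream` generator, the `/completion` route, and the `[DONE]` sentinel are placeholders standing in for a real LangChain/LLM streaming call.

```python
# Minimal sketch: streaming tokens over Server-Sent Events with FastAPI.
# The LLM call is faked; in the real app it would be a LangChain stream.
import asyncio

from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()


async def fake_llm_stream(prompt: str):
    # Placeholder for an LLM streaming call; yields a few tokens with a delay.
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0.1)
        yield token


@app.get("/completion")
async def completion(prompt: str):
    async def event_stream():
        # Each SSE message is a "data: ..." line followed by a blank line.
        async for token in fake_llm_stream(prompt):
            yield f"data: {token}\n\n"
        yield "data: [DONE]\n\n"

    # text/event-stream tells the browser to treat this as an SSE stream,
    # which an EventSource (or fetch reader) in the React UI can consume.
    return StreamingResponse(event_stream(), media_type="text/event-stream")
```

Unlike WebSockets, SSE is a one-way, HTTP-based channel, which is usually all a streaming completion or chat UI needs on the server-to-client path.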
Homepage
www.newline.co/courses/responsive-llm-applications-with-server-sent-events