LLM Engineering Course: Days 2 - 5
Open Source vs Closed Source Models
- See below; Canvas is used for coding alongside ChatGPT.

Tokens
- Note that the gap (space) between words may come up as part of a token; another example below.
- Rule of thumb: one token maps to roughly 4 characters.

Context Window
- LLMs appear to have memory, but what is really happening is that every time you talk to an LLM / ChatGPT, the entire conversation (the initial question, its response, follow-up questions, etc.) is passed in again as one long prompt, so the model can predict what is most likely to come next.
- What is a context window? https://www.ibm.com/think/topics/context-window
- Comparing different context windows: https://medium.com/@genai.works/comprehensive-comparison-of-large-language-models-llms-0da7e894e419

Comparing Models
- https://www.vellum.ai/llm-leaderboard
- https://www.youtube.com/watch?v=C8ftIfg6ROs&t=390s
- Dimensions to compare: context window, cost per million input tokens, cost per million output tokens, ...
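The "one token is about 4 characters" rule of thumb can be sketched as a tiny estimator. This is only an approximation, not a real tokenizer (real tokenizers such as OpenAI's tiktoken split text into learned subword units, and a leading space is often absorbed into the following token); the function name and rounding choice here are illustrative assumptions:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters per token rule of thumb.

    This is a heuristic only; an actual tokenizer library (e.g. tiktoken)
    gives the exact count for a specific model.
    """
    # Round to the nearest whole token, never returning less than 1
    # for non-empty text.
    return max(1, round(len(text) / 4))

# "Hello, how are you doing today?" is 31 characters -> about 8 tokens.
print(estimate_tokens("Hello, how are you doing today?"))
```

Useful for quick cost back-of-envelope math (e.g. multiplying by a per-million-token price), but always verify with the model's real tokenizer before relying on the numbers.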
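The "memory" behaviour described above can be sketched as follows: each turn, the client resends the FULL conversation history, not just the latest message. This is a minimal illustration with made-up roles and helper names (the dict shape loosely mirrors common chat APIs, but nothing here calls a real API):

```python
def build_prompt(history: list[dict]) -> str:
    """Flatten the entire conversation into one long prompt string.

    Because the whole history is sent on every turn, the model appears
    to "remember" earlier messages even though it is stateless.
    """
    return "\n".join(f"{turn['role']}: {turn['content']}" for turn in history)

# Simulate three turns of a conversation.
history = [
    {"role": "user", "content": "What is a token?"},
    {"role": "assistant", "content": "A small chunk of text the model reads."},
    {"role": "user", "content": "Give me an example."},
]

prompt = build_prompt(history)
print(prompt)
```

Note that the prompt grows with every turn, which is exactly why the context window (the maximum number of tokens the model can attend to at once) eventually becomes the limiting factor in long conversations.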