LLM Engineering course : Day 2 - 5

Open-Source vs Closed-Source Models

ChatGPT's Canvas feature can be used for coding alongside the chat.

Note that the space between words often comes through as part of a token, rather than being a separator on its own.

A rule of thumb: one token maps to roughly four characters, or about three-quarters of an English word.
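That rule of thumb can be sketched as a back-of-the-envelope estimator. This is only the approximation described above, not a real tokenizer (a library such as tiktoken would give exact counts):

```python
# Rough token estimate using the ~4-characters-per-token rule of thumb.
def estimate_tokens(text: str) -> int:
    return max(1, round(len(text) / 4))

sentence = "Large language models read text as tokens, not characters."
print(len(sentence), estimate_tokens(sentence))
```

Real tokenizers split on learned subword boundaries, so actual counts vary by model and by language; this estimate is just for quick sizing of prompts.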

Context Window 

 

LLMs appear to have memory, but what is really happening is that every time you talk to an LLM such as ChatGPT, the entire conversation so far (the initial question, its response, the follow-up questions, and so on) is passed in again as one long prompt, and the model predicts what is most likely to come next.
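That "memory" can be sketched as nothing more than a growing message list that is re-sent in full on every turn. This is a minimal sketch: the message format is modeled on the OpenAI chat API, and the replies are hard-coded stand-ins for real API responses:

```python
# The "memory" of a chat is just the full history re-sent on every turn.
history = [{"role": "system", "content": "You are a helpful assistant."}]

def chat(user_message, reply_for_demo):
    """Append the user turn, 'call' the model, append its reply.
    reply_for_demo stands in for a real API response."""
    history.append({"role": "user", "content": user_message})
    # In a real app the whole of `history` would be sent to the model here.
    history.append({"role": "assistant", "content": reply_for_demo})

chat("What is a token?", "A token is a chunk of text the model reads.")
chat("And a context window?", "The maximum number of tokens the model sees at once.")

# The second call sent the ENTIRE history, not just the new question:
print(len(history))  # 5 messages: system + 2 user + 2 assistant
```

Because the whole history is resent each turn, long conversations eventually hit the context window limit discussed below.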

What is a Context Window?

https://www.ibm.com/think/topics/context-window

 

Comparing different context windows

https://medium.com/@genai.works/comprehensive-comparison-of-large-language-models-llms-0da7e894e419

  

Compare Models

 

  •  Gemini has a context window of 1 million tokens ≈ 750k words, roughly the entire works of Shakespeare
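The arithmetic behind that bullet follows directly from the ~4-characters-per-token rule of thumb (about 0.75 words per token):

```python
# Back-of-the-envelope check of the Gemini context window figure above.
context_tokens = 1_000_000
words_per_token = 0.75  # rule-of-thumb conversion
print(int(context_tokens * words_per_token))  # 750000
```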

What is One-Shot Prompting?

In the prompt we send the model, we include one example of what we are looking for, i.e. the kind of reply we expect.

Multi-shot prompting = we include several examples, showing the model how to respond in different situations.
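The difference can be sketched as two prompt layouts. This is an illustrative sketch: the message format is modeled on the OpenAI chat API, and the classification task and example texts are made up:

```python
# One-shot vs multi-shot prompts as message lists.
system = "You classify customer messages as POSITIVE or NEGATIVE."

one_shot = [
    {"role": "system", "content": system},
    # the single worked example:
    {"role": "user", "content": "I love this product!"},
    {"role": "assistant", "content": "POSITIVE"},
    # the real question:
    {"role": "user", "content": "Terrible support, never again."},
]

# Multi-shot: the same prompt, but with examples covering more situations.
multi_shot = one_shot[:3] + [
    {"role": "user", "content": "It broke after a week."},
    {"role": "assistant", "content": "NEGATIVE"},
    {"role": "user", "content": "Terrible support, never again."},
]

print(len(one_shot), len(multi_shot))  # 4 6
```

The extra examples cost tokens from the context window, but usually make the expected output format much more reliable.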

 

 

 

 
