What are tokens?
Tokens are the basic units an LLM uses to break input into patterns it can process, so it can generate an output based on what it learned during training. For example, suppose the text "big brown fox jumps over the lazy dog" appeared in the training data, and we give the model an input instructing it to complete a sentence: "big brown fox jumps..., complete the sentence". The model will then use about 8 tokens to produce the full sentence.

So how are tokens counted? In this simple example, each word in the input counts as one token, since the model recognizes it as part of a pattern it can complete. (In practice, tokenizers split text into subword pieces, so a single word can map to one or more tokens.) "big brown fox jumps" is 4 tokens; to complete the sentence, the model generates another 4 tokens, and the result is "big brown fox jumps over the lazy dog".

During training, an LLM stores what it learns in its parameters. But having that knowledge doesn't necessarily mean the model will produce it when you ask: the answer you get depends on how you phrase the question. For example: A is the mother of B. Then...
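The word-level counting above can be sketched as a toy tokenizer. This is a simplification for illustration only: real LLM tokenizers (e.g. BPE-based ones) split text into subword pieces, so actual token counts will differ.

```python
def tokenize(text: str) -> list[str]:
    """Toy tokenizer: one word == one token, as in the example above.
    Real tokenizers split words into subword pieces, so counts differ."""
    return text.split()

prompt = "big brown fox jumps"
completion = "over the lazy dog"

prompt_tokens = tokenize(prompt)          # the 4 tokens the model reads
completion_tokens = tokenize(completion)  # the 4 tokens the model generates

print(len(prompt_tokens))                           # 4
print(len(completion_tokens))                       # 4
print(len(prompt_tokens) + len(completion_tokens))  # 8 tokens in total
```

Counting tokens this way matches the example: 4 tokens in, 4 tokens out, 8 tokens for the whole sentence.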