# Inference

Tags: ai, intermediate

Inference is a fancy term that just means using an ML model that has already been trained: you feed it new input and get an output, without updating the model's parameters. In most contexts it refers to calling the model through an API, although strictly speaking any use of a trained model, such as prompting an LLM in a chat interface, is also inference.

Related terms: Context Window, Training, ChatGPT
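The split between training and inference can be sketched in a few lines of plain Python. This is a minimal, hypothetical example: the "model" is just a linear scoring function whose weights are assumed to have been learned already, so calling it is pure inference, with no learning happening.

```python
# Parameters assumed to have been produced earlier by training.
weights = [0.8, -0.5]
bias = 0.1

def predict(features):
    """Inference: a forward pass with fixed, pre-trained parameters."""
    score = sum(w * x for w, x in zip(weights, features)) + bias
    return 1 if score > 0 else 0

# Running the trained model on new input; nothing is updated.
print(predict([1.0, 0.2]))  # → 1
```

An API-based inference call (e.g. to a hosted LLM) works the same way conceptually: the provider runs a forward pass through frozen weights on their servers and returns the output.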