How to restart the kernel in Python
import os
os._exit(00)  # note: os.exit does not exist; os._exit terminates the kernel process, and Jupyter restarts it automatically
LangChain core components
LangChain is an open-source framework that simplifies application development with LLMs. Its core components:
- documents
- chains
- agents
- language models
- chat models
- chat messages
- prompt templates
- output parsers
Language model
The language model is the foundation of LangChain for working with LLMs. It takes text input and generates text output, and helps complete tasks such as answering questions and summarizing documents.
LangChain supports providers such as IBM, OpenAI, Google, and Meta for its primary language models.
Chat model and language model
A chat model is designed for efficient conversation: it understands questions or prompts and responds like a human assistant. To generate responses, first create a language model with a provider such as IBM or OpenAI, then wrap it as a chat model. This turns the LLM into a conversational model that can engage in dialogue.
Chat messages
A chat model works with a list of chat messages, each tagged with a role: system, human, or AI.
Prompt templates
Prompt templates translate user questions or variables into clear, reusable instructions for the model.
String prompt templates - useful for formatting a single string
Chat prompt templates - useful for formatting message lists
Message-specific prompt templates (AIMessagePromptTemplate, HumanMessagePromptTemplate, SystemMessagePromptTemplate, and ChatMessagePromptTemplate) allow flexible role assignment
Few-shot prompt templates
Example selectors choose which examples to include: semantic similarity, maximal marginal relevance (MMR) for diversity, length-based selection for efficient prompts, and n-gram overlap for textual similarity.
Output parsers
Output parsers convert the model's raw text output into a desired structured format (JSON, XML, CSV, pandas).
LLM chain
Overall chain
Template string → create prompt template → create LLMChain object
from langchain.chains import LLMChain, SequentialChain
from langchain.prompts import PromptTemplate

template = """Suggest a dish from {location}."""
prompt_template = PromptTemplate(template=template, input_variables=["location"])
chain_loc = LLMChain(llm=mistral_llm, prompt=prompt_template, output_key="meal")
# Create a similar chain #2 whose input variable is the first chain's output key ("meal")
overall_chain = SequentialChain(
    chains=[chain_loc, chain_2],
    input_variables=["location"],
    output_variables=["meal", "chain2output"],
    verbose=True,
)
overall_chain.invoke(input={"location": "china"})