Prompt Templates for LLMs

chat-prompts is part of the LlamaEdge API Server project. It provides a collection of prompt templates used to turn chat messages into the model-specific prompt strings that LLMs expect (see the models at huggingface.co/second-state).
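
A typical use of the crate is to pick a template and let it render a vector of chat messages into a single prompt string. The sketch below illustrates that flow. It assumes the crate's ChatPrompt enum and BuildChatPrompt trait plus the message types from the companion endpoints crate; exact names and constructor signatures may differ between versions, so treat it as a sketch rather than a verbatim recipe.

    // Sketch only: assumes chat-prompts' ChatPrompt/BuildChatPrompt API and the
    // message types from the companion `endpoints` crate; exact names and
    // constructor signatures may differ between versions -- check docs.rs.
    use chat_prompts::{
        chat::{BuildChatPrompt, ChatPrompt},
        PromptTemplateType,
    };
    use endpoints::chat::{
        ChatCompletionRequestMessage, ChatCompletionSystemMessage,
        ChatCompletionUserMessage, ChatCompletionUserMessageContent,
    };

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        // Pick the template that matches the target model, e.g. chatml for Yi-34B-Chat.
        let template = ChatPrompt::from(PromptTemplateType::ChatML);

        // Assemble the conversation as OpenAI-style request messages.
        let mut messages = vec![
            ChatCompletionRequestMessage::System(ChatCompletionSystemMessage::new(
                "You are a helpful assistant.",
                None,
            )),
            ChatCompletionRequestMessage::User(ChatCompletionUserMessage::new(
                ChatCompletionUserMessageContent::Text("What is LlamaEdge?".into()),
                None,
            )),
        ];

        // Render the whole conversation into one prompt string for the model.
        let prompt = template.build(&mut messages)?;
        println!("{prompt}");
        Ok(())
    }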

Prompt Templates

The available prompt templates are listed below:

  • baichuan-2

  • codellama-instruct

    • Prompt string

      <s>[INST] <<SYS>>
      Write code to solve the following coding problem that obeys the constraints and passes the example test cases. Please wrap your code answer using ```:
      <</SYS>>
      
      {prompt} [/INST]
      
    • Example: second-state/CodeLlama-13B-Instruct-GGUF

  • codellama-super-instruct

    • Prompt string

      <s>Source: system\n\n {system_prompt} <step> Source: user\n\n {user_message_1} <step> Source: assistant\n\n {ai_message_1} <step> Source: user\n\n {user_message_2} <step> Source: assistant\nDestination: user\n\n
      
    • Example: second-state/CodeLlama-70b-Instruct-hf-GGUF

  • chatml

    • Prompt string

      <|im_start|>system
      {system_message}<|im_end|>
      <|im_start|>user
      {prompt}<|im_end|>
      <|im_start|>assistant
      
    • Example: second-state/Yi-34B-Chat-GGUF (a hand-rolled sketch that renders this layout appears after this list)

  • chatml-tool

    • Prompt string

      <|im_start|>system\n{system_message} Here are the available tools: <tools> [{tool_1}, {tool_2}] </tools> Use the following pydantic model json schema for each tool call you will make: {"properties": {"arguments": {"title": "Arguments", "type": "object"}, "name": {"title": "Name", "type": "string"}}, "required": ["arguments", "name"], "title": "FunctionCall", "type": "object"} For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:\n<tool_call>\n{"arguments": <args-dict>, "name": <function-name>}\n</tool_call><|im_end|>
      <|im_start|>user
      {user_message}<|im_end|>
      <|im_start|>assistant
      
      • Example

        <|im_start|>system\nYou are a function calling AI model. You are provided with function signatures within <tools></tools> XML tags. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions. Here are the available tools: <tools> [{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"format":{"type":"string","description":"The temperature unit to use. Infer this from the users location.","enum":["celsius","fahrenheit"]}},"required":["location","format"]}}},{"type":"function","function":{"name":"predict_weather","description":"Predict the weather in 24 hours","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"format":{"type":"string","description":"The temperature unit to use. Infer this from the users location.","enum":["celsius","fahrenheit"]}},"required":["location","format"]}}}] </tools> Use the following pydantic model json schema for each tool call you will make: {"properties": {"arguments": {"title": "Arguments", "type": "object"}, "name": {"title": "Name", "type": "string"}}, "required": ["arguments", "name"], "title": "FunctionCall", "type": "object"} For each function call return a json object with function name and arguments within <tool_call></tool_call> XML tags as follows:\n<tool_call>\n{"arguments": <args-dict>, "name": <function-name>}\n</tool_call><|im_end|>
        <|im_start|>user
        Hey! What is the weather like in Beijing?<|im_end|>
        <|im_start|>assistant
        
    • Example: second-state/Hermes-2-Pro-Llama-3-8B-GGUF (a sketch for parsing the resulting <tool_call> reply appears after this list)

  • deepseek-chat

  • deepseek-chat-2

  • deepseek-coder

  • embedding

  • gemma-instruct

    • Prompt string

      <bos><start_of_turn>user
      {user_message}<end_of_turn>
      <start_of_turn>model
      {model_message}<end_of_turn>
      
    • Example: second-state/gemma-2-27b-it-GGUF

  • glm-4-chat

  • human-assistant

  • intel-neural

  • llama-2-chat

    • Prompt string

      <s>[INST] <<SYS>>
      {system_message}
      <</SYS>>
      
      {user_message_1} [/INST] {assistant_message} </s><s>[INST] {user_message_2} [/INST]
      
  • llama-3-chat

    • Prompt string

      <|begin_of_text|><|start_header_id|>system<|end_header_id|>
      
      {{ system_prompt }}<|eot_id|><|start_header_id|>user<|end_header_id|>
      
      {{ user_message_1 }}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
      
      {{ model_answer_1 }}<|eot_id|><|start_header_id|>user<|end_header_id|>
      
      {{ user_message_2 }}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
      
  • mistral-instruct

  • mistrallite

  • mistral-tool

    • Prompt string

      [INST] {user_message_1} [/INST][TOOL_CALLS] [{tool_call_1}]</s>[TOOL_RESULTS]{tool_result_1}[/TOOL_RESULTS]{assistant_message_1}</s>[AVAILABLE_TOOLS] [{tool_1},{tool_2}][/AVAILABLE_TOOLS][INST] {user_message_2} [/INST]
      
      • Example

        [INST] Hey! What is the weather like in Beijing and Tokyo? [/INST][TOOL_CALLS] [{"name":"get_current_weather","arguments":{"location": "Beijing, CN", "format": "celsius"}}]</s>[TOOL_RESULTS]Fine, with a chance of showers.[/TOOL_RESULTS]Today in Beijing, the weather is expected to be partly cloudy with a high chance of showers. Be prepared for possible rain and carry an umbrella if you're venturing outside. Have a great day!</s>[AVAILABLE_TOOLS] [{"type":"function","function":{"name":"get_current_weather","description":"Get the current weather in a given location","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}},{"type":"function","function":{"name":"predict_weather","description":"Predict the weather in 24 hours","parameters":{"type":"object","properties":{"location":{"type":"string","description":"The city and state, e.g. San Francisco, CA"},"unit":{"type":"string","enum":["celsius","fahrenheit"]}},"required":["location"]}}}][/AVAILABLE_TOOLS][INST] What is the weather like in Beijing now? [/INST]
        
    • Example: second-state/Mistral-7B-Instruct-v0.3-GGUF

  • octopus

  • openchat

  • phi-2-instruct

  • phi-3-chat

    • Prompt string

      <|system|>
      {system_message}<|end|>
      <|user|>
      {user_message_1}<|end|>
      <|assistant|>
      {assistant_message_1}<|end|>
      <|user|>
      {user_message_2}<|end|>
      <|assistant|>
      
    • Example: second-state/Phi-3-medium-4k-instruct-GGUF

  • solar-instruct

  • stablelm-zephyr

  • vicuna-1.0-chat

  • vicuna-1.1-chat

  • vicuna-llava

  • wizard-coder

  • zephyr
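
To make the template mechanics concrete, here is a dependency-free sketch that renders a conversation into the chatml layout shown earlier. It is purely illustrative: the Role enum and render_chatml function are invented for this example and are not how the crate implements the template.

    // Illustrative only: a hand-rolled renderer for the chatml layout.
    // `Role` and `render_chatml` are names invented for this sketch.
    enum Role {
        System,
        User,
        Assistant,
    }

    fn render_chatml(messages: &[(Role, &str)]) -> String {
        let mut prompt = String::new();
        for (role, content) in messages {
            let tag = match role {
                Role::System => "system",
                Role::User => "user",
                Role::Assistant => "assistant",
            };
            // Each turn is wrapped in <|im_start|>{role} ... <|im_end|> markers.
            prompt.push_str(&format!("<|im_start|>{tag}\n{content}<|im_end|>\n"));
        }
        // Finish with an open assistant turn so the model continues from there.
        prompt.push_str("<|im_start|>assistant\n");
        prompt
    }

    fn main() {
        let prompt = render_chatml(&[
            (Role::System, "You are a helpful assistant."),
            (Role::User, "What is the capital of France?"),
        ]);
        print!("{prompt}");
    }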
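
For the tool-oriented templates such as chatml-tool and mistral-tool, the model is asked to reply with a JSON object describing the function call. The sketch below shows one plausible way to extract and decode a <tool_call>-wrapped reply; the ToolCall struct and the tag-splitting logic are invented for illustration and are not part of this crate.

    // Illustrative sketch: extract and decode a <tool_call> reply of the kind
    // the chatml-tool template requests. `ToolCall` is invented for this
    // example; requires the serde (with derive) and serde_json crates.
    use serde::Deserialize;

    #[derive(Debug, Deserialize)]
    struct ToolCall {
        name: String,
        arguments: serde_json::Value,
    }

    fn parse_tool_call(reply: &str) -> Option<ToolCall> {
        // Take the text between the <tool_call> ... </tool_call> tags.
        let start = reply.find("<tool_call>")? + "<tool_call>".len();
        let end = reply.find("</tool_call>")?;
        serde_json::from_str(reply[start..end].trim()).ok()
    }

    fn main() {
        let reply = concat!(
            "<tool_call>\n",
            r#"{"arguments": {"location": "Beijing, CN", "format": "celsius"}, "name": "get_current_weather"}"#,
            "\n</tool_call>",
        );

        if let Some(call) = parse_tool_call(reply) {
            // Dispatch on `call.name` and hand `call.arguments` to the tool.
            println!("call {} with {}", call.name, call.arguments);
        }
    }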
