prompt_strategies.chat_template
HF Chat Templates prompt strategy
Classes
Name | Description |
---|---|
ChatTemplatePrompter | Prompter for HF chat templates |
ChatTemplateStrategy | Tokenizing strategy for instruction-based prompts. |
MistralPrompter | Mistral prompter for chat template. |
MistralStrategy | Mistral strategy for chat template. |
StrategyLoader | Load chat template strategy based on configuration. |
ChatTemplatePrompter
prompt_strategies.chat_template.ChatTemplatePrompter(
    tokenizer,
    chat_template,
    processor=None,
    max_length=2048,
    message_property_mappings=None,
    message_field_training=None,
    message_field_training_detail=None,
    field_messages='messages',
    field_system='system',
    field_tools='tools',
    roles=None,
    chat_template_kwargs=None,
    drop_system_message=False,
)
Prompter for HF chat templates
Methods
Name | Description |
---|---|
build_prompt | Build a prompt from a conversation. |
build_prompt
prompt_strategies.chat_template.ChatTemplatePrompter.build_prompt(
    conversation,
    add_generation_prompt=False,
    images=None,
    tools=None,
)
Build a prompt from a conversation.
Parameters
Name | Type | Description | Default |
---|---|---|---|
conversation | list[dict] | A list of messages. | required |
add_generation_prompt | | Whether to add a generation prompt. | False |
images | | A list of images. (optional) | None |
tools | | A list of tools. (optional) | None |
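To illustrate what building a prompt from a conversation amounts to, here is a minimal stand-alone sketch. It uses a hand-written ChatML-style template for illustration only; the real prompter delegates rendering to the tokenizer's HF chat template, and `build_prompt_sketch` is a hypothetical name.

```python
def build_prompt_sketch(conversation, add_generation_prompt=False):
    """Render role/content messages into one prompt string (illustrative)."""
    parts = []
    for message in conversation:
        parts.append(
            f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n"
        )
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)


prompt = build_prompt_sketch(
    [
        {"role": "system", "content": "You are helpful."},
        {"role": "user", "content": "Hi!"},
    ],
    add_generation_prompt=True,
)
```

With `add_generation_prompt=True`, the rendered string ends with an open assistant turn, which is what lets a model generate the next reply in place.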
ChatTemplateStrategy
prompt_strategies.chat_template.ChatTemplateStrategy(
    prompter,
    tokenizer,
    train_on_inputs,
    sequence_len,
    roles_to_train=None,
    train_on_eos=None,
    train_on_eot=None,
    eot_tokens=None,
    split_thinking=False,
)
Tokenizing strategy for instruction-based prompts.
Methods
Name | Description |
---|---|
find_first_eot_token | Find the first EOT token in the input_ids starting from start_idx. |
find_turn | Locate the starting and ending indices of the specified turn in a conversation. |
tokenize_prompt | Public method that can handle either a single prompt or a batch of prompts. |
find_first_eot_token
prompt_strategies.chat_template.ChatTemplateStrategy.find_first_eot_token(
    input_ids,
    start_idx,
)
Find the first EOT token in the input_ids starting from start_idx.
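A minimal sketch of the scan this method performs: walk the token ids from `start_idx` and return the index of the first end-of-turn (EOT) token. The EOT id set here is an illustrative assumption; the real strategy checks against its configured `eot_tokens`.

```python
def find_first_eot_token(input_ids, start_idx, eot_token_ids=(2,)):
    """Return the index of the first EOT token at or after start_idx, else -1."""
    for idx in range(start_idx, len(input_ids)):
        if input_ids[idx] in eot_token_ids:
            return idx
    return -1  # no EOT token found after start_idx
```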
find_turn
prompt_strategies.chat_template.ChatTemplateStrategy.find_turn(
    turns,
    turn_idx,
    tools=None,
)
Locate the starting and ending indices of the specified turn in a conversation.
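One way to locate a turn's boundaries is a prefix diff: render the conversation truncated just before the target turn and just after it, and take the length difference as the turn's span. The real method works on token ids against the chat template; this character-level version, with a made-up `render` template, only illustrates the idea.

```python
def render(turns):
    """Toy renderer standing in for the chat template (illustrative)."""
    return "".join(f"<{t['role']}>{t['content']}</{t['role']}>" for t in turns)


def find_turn_span(turns, turn_idx):
    """Return (start, end) offsets of turn turn_idx in the rendered output."""
    start = len(render(turns[:turn_idx]))
    end = len(render(turns[:turn_idx + 1]))
    return start, end


turns = [
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "yo"},
]
start, end = find_turn_span(turns, 1)
```

Slicing the full rendering with the returned offsets recovers exactly the target turn, which is what lets a strategy mask or train on individual turns.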
tokenize_prompt
prompt_strategies.chat_template.ChatTemplateStrategy.tokenize_prompt(prompt)
Public method that can handle either a single prompt or a batch of prompts.
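The single-vs-batch dispatch can be sketched as a shape check: a list input is treated as a batch and mapped over, anything else as one prompt. The helper name and the callable-injection style are assumptions for illustration, not the library's internals.

```python
def tokenize_prompt_sketch(prompt, tokenize_single):
    """Tokenize one prompt, or map over a batch when given a list."""
    if isinstance(prompt, list):
        return [tokenize_single(p) for p in prompt]
    return tokenize_single(prompt)
```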
MistralPrompter
prompt_strategies.chat_template.MistralPrompter(*args, **kwargs)
Mistral prompter for chat template.
MistralStrategy
prompt_strategies.chat_template.MistralStrategy(
    prompter,
    tokenizer,
    train_on_inputs,
    sequence_len,
    roles_to_train=None,
    train_on_eos=None,
    train_on_eot=None,
    eot_tokens=None,
    split_thinking=False,
)
Mistral strategy for chat template.
Attributes
Name | Description |
---|---|
supports_multiprocessing | Whether this tokenizing strategy supports multiprocessing. |
Methods
Name | Description |
---|---|
find_first_eot_token | Find the first EOT token in the input_ids starting from start_idx. |
find_first_eot_token
prompt_strategies.chat_template.MistralStrategy.find_first_eot_token(
    input_ids,
    start_idx,
)
Find the first EOT token in the input_ids starting from start_idx.
StrategyLoader
prompt_strategies.chat_template.StrategyLoader()
Load chat template strategy based on configuration.
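Configuration-driven strategy selection can be sketched as a registry lookup: the loader keys into a mapping from template names to strategy classes based on a config field. The registry contents and the config shape below are illustrative assumptions, not the loader's actual interface.

```python
class StrategyLoaderSketch:
    """Pick a strategy from a name -> strategy registry (illustrative)."""

    def __init__(self, registry):
        self.registry = registry

    def load(self, cfg):
        name = cfg.get("chat_template", "default")
        try:
            return self.registry[name]
        except KeyError:
            raise ValueError(f"unknown chat template: {name}")


loader = StrategyLoaderSketch(
    {"default": "ChatTemplateStrategy", "mistral": "MistralStrategy"}
)
```

Keeping selection in one registry means adding a new template-specific strategy is a one-line registration rather than a change to the loading logic.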