ChatGPT for Tweet replies / base tweets / DMs
Posted: Wed Apr 05, 2023 10:32 pm
TOOLS > SETTINGS > CHATGPT > API
Set your API key
TOOLS > SETTINGS > CHATGPT > MODEL
Set your model, otherwise default is: text-davinci-003
TOOLS > SETTINGS > CHATGPT > Prompt
Set your Prompt, otherwise default is: reply to this comment
TOOLS > SETTINGS > CHATGPT > Max Tokens
Set your Max Tokens, otherwise default is: 64
TOOLS > SETTINGS > CHATGPT > top_p
Set your top_p, otherwise default is: 1
TOOLS > SETTINGS > CHATGPT > temperature
Set your temperature, otherwise default is: 1
TOOLS > SETTINGS > CHATGPT > Custom URL
Set your custom URL otherwise default is: https://api.openai.com/v1/completions
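To make the settings above concrete, here is an illustration only (not TwitterDub's actual code) of the kind of POST request the completions endpoint expects, built from the default settings; the API key below is a placeholder:

```python
# Illustration only: build the request body and headers that the
# completions endpoint expects. NOT TwitterDub's internals; the key
# is a placeholder for the one you set under TOOLS > SETTINGS > CHATGPT > API.
import json

API_KEY = "YOUR_API_KEY_HERE"  # placeholder
URL = "https://api.openai.com/v1/completions"

headers = {
    "Authorization": f"Bearer {API_KEY}",  # OpenAI expects a Bearer token
    "Content-Type": "application/json",
}
payload = {
    "model": "text-davinci-003",        # default model
    "prompt": "reply to this comment",  # default prompt
    "max_tokens": 64,                   # default max tokens
    "top_p": 1,                         # default top_p
    "temperature": 1,                   # default temperature
}
body = json.dumps(payload)  # this body is what the JSON payload template below controls
print(body)
```

The JSON payload setting described next lets you change the shape of this body without waiting for a software update.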
TOOLS > SETTINGS > CHATGPT > JSON payload
Set your custom JSON payload with tokens:
Code: Select all
#maxtokens#
#top_p#
#temperature#
#model#
#body#
Code: Select all
{"top_p":#top_p#,"temperature":#temperature#,"max_tokens":#maxtokens#,"model":"#model#","prompt":"#body#"}
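To see how the tokens and the template fit together, here is an illustrative sketch (not TwitterDub's internals) of substituting example settings into the template and checking the result is valid JSON:

```python
# Illustration only: fill the #...# tokens in the payload template with
# example settings and verify the result parses as JSON.
import json

template = ('{"top_p":#top_p#,"temperature":#temperature#,'
            '"max_tokens":#maxtokens#,"model":"#model#","prompt":"#body#"}')

values = {
    "#top_p#": "1",
    "#temperature#": "1",
    "#maxtokens#": "64",
    "#model#": "text-davinci-003",
    "#body#": "reply to this comment",
}

payload = template
for token, value in values.items():
    payload = payload.replace(token, value)

parsed = json.loads(payload)  # raises ValueError if the template is malformed
print(parsed)
```

Note that #top_p#, #temperature# and #maxtokens# are substituted as bare numbers, so they must not be wrapped in quotes in the template.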
Then in your reply text use the token:
Code: Select all
#chatgpt#
The text of the tweet you are replying to is sent to ChatGPT, and the response replaces the #chatgpt# token in your reply text. If you want to use a custom model, API key, URL, or prompt per reply, set these via tokens:
Code: Select all
#chatgpttemperature_<temperature>#
#chatgpturl_<url>#
#chatgpttop_p_<top_p>#
#chatgptapi_<apikey>#
#chatgptmaxtokens_<max_tokens>#
#chatgptmodel_<model>#
#chatgptprompt_reply to this comment like a teenage girl who uses lots of emojis and hashtags#
#chatgptjson_<path_to_template_file>#
where:
<max_tokens> is a number - the default is 64
Note: #chatgptjson_<path_to_template_file># specifies the template file that will contain the JSON payload, it is not the payload itself
You will still need the #chatgpt# token, which marks the location where the response is inserted.
So an example TwitterDub reply using custom data would be:
#chatgpttemperature_0.5# #chatgptmodel_rootjazzmodel# #chatgptprompt_reply to this comment like a grumpy programmer# #chatgpt# lol {:-)|:-D|;-)} #date#
#chatgpttop_p_0.5# #chatgptprompt_reply to this comment like a teenage girl with lots of emojis# #chatgpt# lol {:-)|:-D|;-)} #date#
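The example replies above also use spin syntax: {:-)|:-D|;-)} picks one of the options at random. TwitterDub's own spin handling is not shown here; as an illustration only, a minimal resolver for that {a|b|c} syntax might look like:

```python
# Illustration only: a minimal resolver for {option|option|option} spin
# syntax, as used in the example replies above. Not TwitterDub's code.
import random
import re

def spin(text: str) -> str:
    """Replace each {a|b|c} group with one randomly chosen option."""
    pattern = re.compile(r"\{([^{}]*)\}")
    while pattern.search(text):
        # Resolve one group at a time until no braces remain
        text = pattern.sub(lambda m: random.choice(m.group(1).split("|")),
                           text, count=1)
    return text

print(spin("#chatgpt# lol {:-)|:-D|;-)}"))
```

Each run can produce a different smiley, which keeps repeated replies from looking identical.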
If you are using ChatGPT for base tweets or DMs, it relies on you having a good quality global or custom prompt so ChatGPT can come up with the text.
Custom JSON template file
#chatgptjson_c:\path\to\templates.txt# using a custom payload, where c:\path\to\templates.txt is an actual file on your machine containing a valid JSON POST payload template.
example file
Code: Select all
{"top_p":#top_p#,"temperature":#temperature#,"max_tokens":#maxtokens#,"model":"#model#","prompt":"#body#"}
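As an illustration only (not TwitterDub's internals) of how such a template file would be read and filled in - a temporary file stands in for the real path on your machine:

```python
# Illustration only: write a template file like the example above, read
# it back, substitute tokens, and confirm the result is valid JSON.
import json
import tempfile

template = ('{"top_p":#top_p#,"temperature":#temperature#,'
            '"max_tokens":#maxtokens#,"model":"#model#","prompt":"#body#"}')

# Stand-in for a real path such as c:\path\to\templates.txt
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write(template)
    path = f.name

with open(path) as f:
    payload = f.read()

for token, value in [("#top_p#", "0.5"), ("#temperature#", "1"),
                     ("#maxtokens#", "64"), ("#model#", "text-davinci-003"),
                     ("#body#", "reply to this comment like a grumpy programmer")]:
    payload = payload.replace(token, value)

parsed = json.loads(payload)  # fails here if the file's template is malformed
print(parsed)
```

This is a quick way to sanity-check your own template file before pointing #chatgptjson_...# at it.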
Regards,
Martin