Hi,
I like Twitterdub, but as it has a lot of options and no explanation of each function, it's really a pain, and it also prevents you from using its full power since there is no info on most of the options.
Anyway, I hope someone can answer some of my questions.
1. Settings > max scrape results
Is that for each account or for all accounts combined? Is that limit applied to scraping in the scrape tab, or to other tabs as well?
2. Is there a way to have the Like tab watch a folder for new tweets to like?
3. Any way to execute a command before running a process, or after it's done? For example, restarting a 4G connection to get a new IP.
4. When selecting multiple accounts to share across accounts in the Like tab, is it possible to assign the scraping part to only one account? So, for example, account 1 scrapes everything needed, and the other accounts start liking once account 1 has finished getting all the tweets that need to be liked?
5. Once you reactivate an account on the Twitter website after it gets an IsBannedFromAction error, it no longer gets included in processes that are waiting. How do I tell the process that it can use this account again?
thanks!
couple of questions...
Re: couple of questions...
Just to be clear for new people reading this: don't get me wrong, it's an amazing program and very, very powerful! If you are in this market, this is the best software you can buy. You do need to spend some time figuring everything out, but it's worth it.
- martin@rootjazz
- Site Admin
- Posts: 34390
- Joined: Fri Jan 25, 2013 10:06 pm
- Location: The Funk
- Contact:
Re: couple of questions...
Fairly sure I replied to this already??

Babaksam wrote: ↑Sun May 22, 2022 7:26 pm
1. Settings > max scrape results
Is that for each account or for all accounts combined? Is that limit applied to scraping in the scrape tab, or to other tabs as well?

It's per action. If you set 0 (unlimited) for a search result, obviously the program cannot scrape unlimited, so this is a max cap, unless you specify more specifically how much to scrape.

2. Is there a way to have the Like tab watch a folder for new tweets to like?

Only the MASS <action> tabs have WATCH FOLDER functionality.

3. Any way to execute a command before running a process, or after it's done? For example, restarting a 4G connection to get a new IP.

Per action: right-click the action row > EDIT ACTION > PRE RUN CMDS option.
Or TOOLS > GLOBAL PRE RUN to run some code before each action.

4. When selecting multiple accounts to share across accounts in the Like tab, is it possible to assign the scraping part to only one account?

Above the accounts list, you should see SEARCH ACCS / FILTER ACCS.

5. Once you reactivate an account on the Twitter website after it gets an IsBannedFromAction error, it no longer gets included in processes that are waiting. How do I tell the process that it can use this account again?

Right-click > remove blocks, or something like this. If you cannot find it, let me know and I can get a screenshot.
Re: couple of questions...
Hi Martin, thanks. Yes, you did already answer them by email; sorry, I was just impatient.
3 more and I am good
1)
=====================
I'm trying to use the watch folder for mass likes. This is the format I use in my txt file that gets imported by the watch folder cron. I assume it's not correct, as I get an error when the process tries to run it. I have checked the forum, but the example is for Insta and I don't understand exactly what the format should be.
num:4
pause:10:90
threads:1
accs:myaccount
https://twitter.com/BlossomMay11/status ... 0283863041
https://twitter.com/BlossomMay11/status ... 4168854529
https://twitter.com/BlossomMay11/status ... 7318153216
........
........
This is the error:
Scraping Daily stats for: myaccount
Stats: Followers / followings: 102:0
Saving: 2022-05-25 20:06 0 102 91 0 3 0 0 0 8 0 0 25 to C:\Users\xxx\AppData\Roaming\rootjazz\Twitterdub\stats\bb1b0ac8e3d1424881f87114d3912f42.csv
Check not processed before: with: myaccount
Likes: with: myaccount Proxy: 192.168.1.9xxx:xxx
isid input is null or empty!: myaccount
Accounts Failed:
myaccount
0 successful : failed: 1 Likes
Started: 25-05-2022 20:06
Finished: 25-05-2022 20:06
ID: 600f9641-bd54-461b-bbdf-cb72e390254a
Action ran for: 0hr:0min:10s
I have also tried it like this, but no luck either:
num:4
pause:10:90
threads:1
accs:hot_selexted
repeat_mins: 1
repeat_hours: 6
1529428460283863041
1529427374168854529
1529424747318153216
2)
=============================
I have also tried the multi_input.txt method, but in that case it generates an insane number of processes, and I want everything in one process. Any way to join all of them into one process?
https://twitter.com/CandyGirl69x/status ... 4974339072, 1 ,30:90, myaccount
https://twitter.com/RenataA80105951/sta ... 4727263233, 1 ,30:90, myaccount
https://twitter.com/cosplayers_love/sta ... 1000461312, 1 ,30:90, myaccount
........
........
3)
=============================
When using the search tab, the output file saves all the tweet IDs, like "1529424747318153216".
So to use them later in the watch folder tab, I need to add "https://twitter.com/CandyGirl69x/status/" to the beginning of every line. Not a very big deal, as I can create a batch or shell script to do the job, but is there a better way? I know I can use the Like tab + search function, but I prefer to keep them separate; that's why I use the search tab first and then mass like to like the tweets.
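For reference, the prefixing step the batch/shell script would do can be sketched in a few lines (the input is assumed to be one tweet ID per line, as the search tab outputs; the username in the prefix is the one from the example above):

```python
# Prefix taken from the example in this post; any status URL prefix works the same way.
PREFIX = "https://twitter.com/CandyGirl69x/status/"

def prefix_ids(lines):
    """Prepend the status URL prefix to every non-empty tweet ID line."""
    return [PREFIX + ln.strip() for ln in lines if ln.strip()]

# Example: turn search-tab output lines into watch-folder URLs.
urls = prefix_ids(["1529424747318153216", ""])
```

As Martin points out below in the thread, changing the search tab's OUTPUT TYPE to URLs avoids needing this step at all.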
Thanks again.
Re: couple of questions...
lol
3 more and I am good
num:4
pause:10:90
threads:1
accs:myaccount
https://twitter.com/BlossomMay11/status ... 0283863041
https://twitter.com/BlossomMay11/status ... 4168854529
https://twitter.com/BlossomMay11/status ... 7318153216
........
........
This is the error:
Scraping Daily stats for: myaccount
Stats: Followers / followings: 102:0
Saving: 2022-05-25 20:06 0 102 91 0 3 0 0 0 8 0 0 25 to C:\Users\xxx\AppData\Roaming\rootjazz\Twitterdub\stats\bb1b0ac8e3d1424881f87114d3912f42.csv
Check not processed before: with: myaccount
Likes: with: myaccount Proxy: 192.168.1.9xxx:xxx
isid input is null or empty!: myaccount
Fairly sure the file is formatted wrongly. Please submit logs; they should give more information:
HELP > LOGS > SUBMIT
Then send your logs ID; the numbers are sufficient (displayed after the logs upload successfully).
I'll also post up an example file after this reply so you can see what it should be.
I have also tried it like this, but no luck either:
num:4
pause:10:90
threads:1
accs:hot_selexted
repeat_mins: 1
repeat_hours: 6
1529428460283863041
1529427374168854529
1529424747318153216

Not sure why you're posting the IDs at the end; you need to post a line with the ID / URL, as above. I'll post an example after this reply.

I have also tried the multi_input.txt method, but in that case it generates an insane number of processes, and I want everything in one process. Any way to join all of them into one process?
https://twitter.com/CandyGirl69x/status ... 4974339072, 1 ,30:90, myaccount
https://twitter.com/RenataA80105951/sta ... 4727263233, 1 ,30:90, myaccount
https://twitter.com/cosplayers_love/sta ... 1000461312, 1 ,30:90, myaccount
........

TBH I have no memory of how multi_input works; I'll have to check.

When using the search tab, the output file saves all the tweet IDs, like "1529424747318153216". So to use them later in the watch folder tab, I need to add "https://twitter.com/CandyGirl69x/status/" to the beginning of every line. Is there a better way?

On the SEARCH tab there is a section for OUTPUT TYPE; you must have selected IDs. Change it to URLs.
Re: couple of questions...
viewtopic.php?f=28&t=5692
Here is an example file:

Code: Select all
url:https://www.instagram.com/p/Be0f15ml6pO
num:10
pause:1:9
threads:10
accs:username1,username2,username3....usernameN
repeat_mins: 3
repeat_hours: 1

You are missing the line:

Code: Select all
url: <input>

So for this, where you just listed the URLs at the end of the file (why did you do this? Did you read something telling you to do it that way? Maybe the instructions / example are wrong somewhere; let me know, because if you did it, others will as well):

Code: Select all
num:4
pause:10:90
threads:1
accs:myaccount
https://twitter.com/BlossomMay11/status ... 0283863041
https://twitter.com/BlossomMay11/status ... 4168854529
https://twitter.com/BlossomMay11/status ... 7318153216

should be:

Code: Select all
num:4
pause:10:90
threads:1
accs:myaccount
url:https://twitter.com/BlossomMay11/statuses/0283863041

then one file per URL.

It works the same with an ID:

Code: Select all
num:4
pause:10:90
threads:1
accs:hot_selexted
repeat_mins: 1
repeat_hours: 6
url:1529428460283863041

It doesn't matter whether you use a URL or an ID.
Re: couple of questions...
Back to the HELP URL (linked above, and linked from the app):
multi_input
Also available: multi-line input format (does not apply to mass comments).
The special file can be added to your watch folder with the filename
multi_input.txt
This allows you to add mass actions with the format:
Code: Select all
url, num ,min_pause:max_pause, accs
Code: Select all
https://www.instagram.com/234234234,10,1:9,all
https://www.instagram.com/erwerwerw,10,1:9,user1,user2,user3,user4
https://www.instagram.com/4365345345,10,1:9,rand_20
https://www.instagram.com/hrhjtjtjtj,10,1:9,acctag_adult
https://www.instagram.com/dfkdfjdkfjdkfj,10,1:9,user1
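For reference, each multi_input line decomposes as in this sketch (the field handling is an assumption based only on the format line above, not Twitterdub's actual parser; anything after the pause field is treated as the account list):

```python
def parse_multi_input(line):
    """Split one multi_input.txt line: url, num, min:max pause, then accounts."""
    url, num, pause, *accs = [p.strip() for p in line.split(",")]
    lo, hi = pause.split(":")
    return {"url": url, "num": int(num),
            "min_pause": int(lo), "max_pause": int(hi), "accs": accs}
```

This also makes visible why each line is its own mass action, as Martin explains below: every line carries its own url, count, pause range, and account list.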
Re-reading what you put:

it generates an insane amount of processes and I want everything in one process. Any way to join all of them into one process?

It's a MASS ACTION, so it is one action per input. If you have 100 lines, it's one action per line.
If you are just trying to process a lot of items with a single account, that is not a MASS ACTION (a mass action is: like this one tweet with 100 accounts, or follow this one profile with 200 profiles, i.e. to boost stats).
Just perform a normal SEARCH ACTION
go to SEARCH LIKE tab
enter your file of inputs,
select from dropdown FROM FILE
select your account
run the action.
Now your 1 account will like all tweets in the file in a single action
Regards,
Martin