Hey,
Every time I try to scrape my stats with TumblingJazz, it gives me the following error.
The black lines are the account names (all the accounts I wanted to scrape)
I encountered this problem once before, but back then my internet connection wasn't working properly.
At the moment my connection is very stable (I can download at 5 MB/s and browse the web), but it still gives me the error.
I updated to the latest (stable) version of TumblingJazz.
submitted logs: 31311
Kind regards,
Nathan
Failing to scrape stats?
- martin@rootjazz
- Site Admin
- Posts: 35129
- Joined: Fri Jan 25, 2013 10:06 pm
- Location: The Funk
- Contact:
Re: Failing to scrape stats?
how many accounts were selected?
Does the same happen with
1 account?
2 accounts?
Re: Failing to scrape stats?
martin@rootjazz wrote:how many accounts were selected?
Does the same happen with
1 account?
2 accounts?
Even happens when I have 1 or 2 accounts selected.
Re: Failing to scrape stats?
Please perform the following:
Open the program. Pause or delete all existing actions. If actions are already running due to autorun on startup, disable autorun at startup and restart the app. All actions must be waiting; then pause (or delete) them all.
Delete existing logs
HELP > LOGS > VIEW
delete all files in the folder that opens
Perform the action:
***
attempt to scrape the stats from one account (that fails)
***
PLEASE NOTE: Do not perform other actions at the same time. Only use one thread. Do not run any actions after this action.
Then submit logs:
HELP > LOGS > SUBMIT
and let me know the LOGS ID (just the number part is sufficient)
Regards,
Martin
Re: Failing to scrape stats?
LOG ID: 17451
I took a look at the logs, and it's probably my proxies that don't work.
I don't understand why, though, because on the proxy provider's website they appear to be active and working.
Re: Failing to scrape stats?
Not an issue on my end. If your proxies are not working, or the accounts cannot log in, then you need to make sure your proxies work and your accounts can log in:
12:44:17: loginbot
Exception: WebException
Unable to connect to the remote server
12:44:17: * ERROR LOGIN: Unable to connect to the remote server
12:44:17: * FAILED login - please try to generate login cookies manually via the ACCOUNTS tab
inner exception: SocketException
inner: An attempt was made to perform a socket operation on an unreachable
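Those log lines point at a plain TCP connection failure: the machine cannot even open a socket to the proxy, so every login through it will fail. A quick standalone check (a minimal sketch, assuming your proxies are listed in a hypothetical `host:port` format; Python standard library only) can confirm whether a proxy endpoint accepts connections at all, independently of the program:

```python
import socket


def parse_proxy(entry: str) -> tuple[str, int]:
    """Split a 'host:port' entry (hypothetical list format) into its parts."""
    host, _, port = entry.rpartition(":")
    return host, int(port)


def proxy_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a plain TCP connection to the proxy endpoint succeeds.

    A False here matches the SocketException in the log above: the
    connection attempt never reaches the proxy at all.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # unreachable, refused, or timed out
        return False
```

Run it against each entry in your proxy list; anything that comes back unreachable here will also fail inside the program, regardless of what the provider's dashboard says.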
Re: Failing to scrape stats?
Ugh, so apparently my IP address changed to an IPv6 address, and I had to authorise that address again on my proxy provider's site.
Hope it will work now; it takes a while to authorise the IP.
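For anyone hitting the same thing: many proxy providers authorise by source IP, and if your connection starts presenting an IPv6 address, an old IPv4 whitelist entry no longer matches. A minimal sketch (Python standard library) for classifying which family an address belongs to:

```python
import ipaddress


def ip_family(addr: str) -> str:
    """Classify an address string as 'IPv4' or 'IPv6'."""
    return f"IPv{ipaddress.ip_address(addr).version}"
```

Fetch your current public address from an IP-echo service (for example `curl https://api64.ipify.org`, which returns an IPv6 address when one is available) and feed the result in; if it comes back `IPv6` while your provider only has your IPv4 on file, the proxy will reject you exactly as in the logs above.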
Re: Failing to scrape stats?
It worked out.
Sorry for the inconvenience, and thanks for your time.
Re: Failing to scrape stats?
no worries, glad it is sorted now