Scrape Twitter - suggestion

Discussions to do with Soundcloud Manager. Do not use for support; use the dedicated support forum for help requests.
User avatar
martin@rootjazz
Site Admin
Posts: 34359
Joined: Fri Jan 25, 2013 10:06 pm
Location: The Funk
Contact:

Re: Scrape Twitter - suggestion

Post by martin@rootjazz »

Hmm, *maybe* the user did enter that as their URL, as your file shows only 2. Not sure why they would, but...

That would imply everything is working correctly.

The program scraped links, and you extracted just the links containing twitter (with some of those invalid because the user entered them that way).
Bartekef
Posts: 677
Joined: Thu Sep 22, 2016 12:24 pm

Re: Scrape Twitter - suggestion

Post by Bartekef »

martin@rootjazz wrote:Hmm, *maybe* the user did enter that as their URL, as your file shows only 2. Not sure why they would, but...

That would imply everything is working correctly.

The program scraped links, and you extracted just the links containing twitter (with some of those invalid because the user entered them that way).
Yes correct. Now it works fine. Lol

I don't get that '* SKIP: Exception of type 'LibUtil.ExSkipItem' was thrown.' error anymore. I have no idea why it stopped giving me that error.

Another question, Martin: now that I have the URLs, how can I convert them to IDs?
TwitterDub has an option to extract URLs from IDs, but I can't find any way to do it the other way around.

Could you please provide a solution for this? I would like to convert every Twitter URL extracted using SCM to a Twitter ID.

Thanks!

EDIT: And maybe also a feature to export only usernames? :)
User avatar
martin@rootjazz
Site Admin
Posts: 34359
Joined: Fri Jan 25, 2013 10:06 pm
Location: The Funk
Contact:

Re: Scrape Twitter - suggestion

Post by martin@rootjazz »

Bartekef wrote:
martin@rootjazz wrote:Hmm, *maybe* the user did enter that as their URL, as your file shows only 2. Not sure why they would, but...

That would imply everything is working correctly.

The program scraped links, and you extracted just the links containing twitter (with some of those invalid because the user entered them that way).
Yes correct. Now it works fine. Lol

I don't get that '* SKIP: Exception of type 'LibUtil.ExSkipItem' was thrown.' error anymore. I have no idea why it stopped giving me that error.
It isn't an error, just a log line that shouldn't be logged. The program is stating IGNORE THIS ITEM and jumping back to a certain place in the code. However, something between that line of code and where it should go catches it and logs it as an encompassing log.

Another question, Martin: now that I have the URLs, how can I convert them to IDs?
You cannot. Well, you can, just not in the program. But why do you want to? It requires one API call per URL to get the ID.

The program can work with URLs, so there is no need. Basically it is extra work for me for zero benefit to you or me (from what I can tell).
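For anyone who does need the IDs, here is a rough sketch of what that lookup would involve. This assumes Twitter's legacy v1.1 users/lookup endpoint, which accepted up to 100 comma-separated screen names per authenticated request; none of this is part of SCM or TwitterDub, and actually sending the requests would need OAuth credentials:

```python
from urllib.parse import urlparse

def screen_names_from_urls(urls):
    """Extract the screen name (first path segment) from each twitter.com profile URL."""
    names = []
    for url in urls:
        path = urlparse(url.strip()).path.strip("/")
        if path:
            names.append(path.split("/")[0])
    return names

def lookup_batches(names, batch_size=100):
    """Yield one request URL per batch of screen names.

    The legacy v1.1 users/lookup endpoint took up to 100 names per call,
    so a list of thousands of URLs still means dozens of API calls.
    """
    base = "https://api.twitter.com/1.1/users/lookup.json?screen_name="
    for i in range(0, len(names), batch_size):
        yield base + ",".join(names[i:i + batch_size])
```

Each JSON object in the response would carry an `id_str` field alongside the screen name, which is the ID a tailored-audience upload wants.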
Bartekef
Posts: 677
Joined: Thu Sep 22, 2016 12:24 pm

Re: Scrape Twitter - suggestion

Post by Bartekef »

martin@rootjazz wrote:The program can work with URLs, so there is no need. Basically it is extra work for me for zero benefit to you or me (from what I can tell).

I was looking to create a promoted tweet targeting a tailored audience, and I need IDs for that. Do you know any other way I could get IDs from URLs?
The alternative would be to have usernames instead of IDs.
User avatar
martin@rootjazz
Site Admin
Posts: 34359
Joined: Fri Jan 25, 2013 10:06 pm
Location: The Funk
Contact:

Re: Scrape Twitter - suggestion

Post by martin@rootjazz »

Try searching Google; there may be a service to convert URLs to IDs.

If you want usernames, just search and replace.
Bartekef
Posts: 677
Joined: Thu Sep 22, 2016 12:24 pm

Re: Scrape Twitter - suggestion

Post by Bartekef »

martin@rootjazz wrote:Try searching Google; there may be a service to convert URLs to IDs.

If you want usernames, just search and replace.
Unfortunately, I didn't find such a service :(

What do you mean by search and replace? I have .txt files with dozens of URLs. Do you have any idea of the quickest way to change them to usernames?
User avatar
martin@rootjazz
Site Admin
Posts: 34359
Joined: Fri Jan 25, 2013 10:06 pm
Location: The Funk
Contact:

Re: Scrape Twitter - suggestion

Post by martin@rootjazz »

If you have a list of

https://twitter.com/rootjazz
...



you can search: https://twitter.com/
replace with: <empty>


Not actually the string <empty> but replace it with nothing.

Notepad++ will do it for you if you don't have a text editor
Bartekef
Posts: 677
Joined: Thu Sep 22, 2016 12:24 pm

Re: Scrape Twitter - suggestion

Post by Bartekef »

martin@rootjazz wrote:If you have a list of

https://twitter.com/rootjazz
...



you can search: https://twitter.com/
replace with: <empty>


Not actually the string <empty> but replace it with nothing.

Notepad++ will do it for you if you don't have a text editor
Thank you, Martin. I'm wondering, could you send me that Soundcloud Manager guide as a PDF?
User avatar
martin@rootjazz
Site Admin
Posts: 34359
Joined: Fri Jan 25, 2013 10:06 pm
Location: The Funk
Contact:

Re: Scrape Twitter - suggestion

Post by martin@rootjazz »

No, I don't have it in PDF format. You might be able to find website-to-PDF converters online, though... maybe.
Bartekef
Posts: 677
Joined: Thu Sep 22, 2016 12:24 pm

Re: Scrape Twitter - suggestion

Post by Bartekef »

martin@rootjazz wrote:No, I don't have it in PDF format. You might be able to find website-to-PDF converters online, though... maybe.
I mean the PDF document that was supposed to promote your product, SCM, the ebook that actually brought me here.

Anyways.

I don't want to start a new post, as you've continuously helped me here. Could you give me some advice? Maybe you will know how to manage this.

I have a lot of Twitter URLs scraped (thanks to your software) and now I'm following them. Some people follow me back, some don't. Some I can message directly, some not. However, I can tweet at anyone.

Do you have an idea of how I can:
1) Send a direct message to those I can (this step is already easy with TwitterDUB - I will just send messages to my list in the txt)
2) Then send a tweet to the rest of the people from the same txt file, excluding those I already sent a DM (to avoid duplicates)

And all that without using the White list function, because I have thousands of URLs, so putting them all into the White list manually wouldn't be the most efficient way.
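One way to build the "everyone else" list outside the program is a simple set difference between the full URL list and the list of accounts already DM'd (this assumes you export the DM'd addresses to their own txt file; the function name is just for illustration):

```python
def split_remaining(all_urls, dm_sent):
    """Return the entries from the full list that have NOT received a DM yet.

    Comparison is on the stripped line, so trailing newlines from the
    txt files don't cause false mismatches; order is preserved and
    duplicates of already-messaged entries are dropped.
    """
    sent = {u.strip() for u in dm_sent}
    return [u.strip() for u in all_urls if u.strip() and u.strip() not in sent]
```

Feeding the returned list back into a tweet task as its own txt file would cover step 2 without touching the White list.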

Thanks!