[SOLVED] Stops writing collected URLs

ZeroCouponBond

Newbie
Hi, I'm trying to scrape search result pages. When I run the template in ZennoPoster it works for the first 45-60 minutes, but then it stops writing to the text file, even though it keeps going through each page.

It works like this (a rough Python sketch of the same logic follows the list):
Start -->
Set counter to 0 -->
Clear cookies -->
Get line from list (a text file with a lot of URLs) -->
Go to that URL -->
Scrape the new URLs on the page (there are 50 URLs on each page that I want) and save them to the list "List1", which is bound to a txt file -->
Counter +1 -->
Turn-off switch on maximum count -->
Random pause of 5-10 sec -->
Clear cookies, and so on...
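
For clarity, here is a standalone Python sketch of that logic (this is not the ZennoPoster template itself; the file names and the href pattern are placeholders I made up):

```python
# Standalone sketch of the scraping loop described above (not ZennoPoster code);
# "search_pages.txt", "List1.txt" and the href pattern are placeholder assumptions.
import random
import re
import time
import urllib.request

HREF_RE = re.compile(r'href="(https?://[^"]+)"')

def scrape_page(url: str) -> list[str]:
    """Fetch one search page and return the URLs found on it."""
    with urllib.request.urlopen(url, timeout=30) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return HREF_RE.findall(html)

def main() -> None:
    with open("search_pages.txt", encoding="utf-8") as f:
        pages = [line.strip() for line in f if line.strip()]

    for counter, page_url in enumerate(pages, start=1):
        urls = scrape_page(page_url)            # ~50 result URLs per page
        with open("List1.txt", "a", encoding="utf-8") as out:
            out.write("\n".join(urls) + "\n")   # flush results to disk every page
        print(f"{counter}: {page_url} -> {len(urls)} urls")
        time.sleep(random.uniform(5, 10))       # random 5-10 sec pause

if __name__ == "__main__":
    main()
```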

 

Hungry Bulldozer

Moderator
You should remove the loop from the project and execute it in ZennoPoster with as many iterations as you want your loop to have. This way it will have the opportunity to free up memory.
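
A rough Python analogy of that restructuring, just to illustrate the idea (not ZennoPoster code; the names are made up for the example):

```python
# Rough analogy of the advice above: keep the per-page work, move the repetition
# out of the project and into ZennoPoster's execution count.

def process(page_url: str) -> None:
    """Placeholder for the per-page work: navigate, scrape, append to List1."""
    ...

# Before: the loop lives inside a single execution, so state from every page
# accumulates in that one run.
def run_with_internal_loop(page_urls: list[str]) -> None:
    for page_url in page_urls:
        process(page_url)

# After: one execution handles exactly one page; the project is scheduled to run
# as many times as there are pages, and memory is released between executions.
def run_single_iteration(page_url: str) -> None:
    process(page_url)
```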
 

ZeroCouponBond

Newbie
You should remove the loop from the project and execute it in ZennoPoster with as many iterations as you want your loop to have. This way it will have the opportunity to free up memory.
Thank you very much, I will try that.

I have been running my setup in debug mode in ProjectMaker for 25 hours now (it will be finished in 3 hours), but I would prefer to do this in ZennoPoster MP instead. I'll try your setup without the loop tomorrow and update you on the outcome :-)
 

Hungry Bulldozer

Moderator
I'm sure it is gonna do the trick
 

ZeroCouponBond

Newbie
Hi again, I was trying this now, but I have a little problem.

Is it possible to link the counter to the execution count in ZennoPoster MP, so that the counter increases by one on each execution?

I want it to go through every single one of the URLs in the list, and not just a lot of random ones.
 

ZeroCouponBond

Newbie
Solution: just take the first line and then delete that line from the list of URLs (this is set in the "Get line" action block). I didn't see that option earlier. It works just as I wanted now. Thanks :-)
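
In case it helps anyone later, here is a minimal Python sketch of that "take the first line, then delete it" idea (the "Get line" action does this for you when the delete option is selected; the file name below is just an example):

```python
# Minimal sketch of the "take the first line and delete it" pattern, assuming
# the pending page URLs live in a plain text file (file name is an example).
from typing import Optional


def pop_first_url(path: str = "search_pages.txt") -> Optional[str]:
    """Return the first URL in the file and rewrite the file without it."""
    with open(path, encoding="utf-8") as f:
        lines = [line.strip() for line in f if line.strip()]
    if not lines:
        return None                      # list exhausted, nothing left to do
    first, rest = lines[0], lines[1:]
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(rest) + ("\n" if rest else ""))
    return first
```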
 
