
How to crawl multiple URLs in PHP

Here is the piece of code that I wrote. It works for a single URL like this:

    $startUrl = 'http://www.whatever.com';
    Links($startUrl);
    ?>

It also works with multiple URLs written directly in the code:

    $startUrl = 'http://www.whatever1.com';
    'http://www.whatever2.com';
    'http://www.whatever3.com';
    'http://www.whatever4.com';
    Links($startUrl);
    ?>

But I am trying to keep all the URLs in a text file instead of in the code:

    $startUrl = $myfile = $file = fopen("assets/images/pages.txt","r");
    while(! feof($file)) {
        echo fgets($file). "<br />";
    }
    fclose($file);
    followLinks($startUrl);
    ?>

This is what I've done. It is almost there, but it doesn't work. What am I missing? What should I add or change so that the code can crawl through all the URLs in the text file? The way it is set up now, it just shows the list of URLs in the browser. Thank you for your time!
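[Editor's note: a minimal sketch of what the intended loop might look like, assuming pages.txt holds one URL per line and that followLinks() is the crawler function referenced in the question but not shown here:]

    <?php
    // Sketch only: crawl each URL from the file instead of echoing it.
    $file = fopen("assets/images/pages.txt", "r");
    if ($file === false) {
        die("Could not open pages.txt");
    }
    while (($line = fgets($file)) !== false) {
        $url = trim($line);       // strip the trailing newline
        if ($url !== '') {
            followLinks($url);    // assumed crawler from the question
        }
    }
    fclose($file);
    ?>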

9th Jun 2020, 2:11 AM
victor batch
1 Answer
Well, if your file is being read as one big string, you could use explode() in PHP to split it into an array of strings by specifying a separator (in this case ';'), then loop over the array elements (the URLs) and fetch their content one by one.
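[Editor's note: a rough sketch of that idea, assuming the URLs in pages.txt are separated by ';' and that the followLinks($url) function from the question does the actual crawling:]

    <?php
    // Sketch only: read the whole file as one string, split on ';',
    // then crawl each URL in turn.
    $contents = file_get_contents("assets/images/pages.txt");
    $urls = explode(";", $contents);
    foreach ($urls as $url) {
        $url = trim($url);        // drop stray whitespace/newlines
        if ($url !== '') {
            followLinks($url);    // assumed crawler from the question
        }
    }
    ?>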
27th Jun 2020, 10:43 AM
Dominique Abou Samah