Tutorials: step by step guides and how-to's with screengrabs.
#1 | saint825xtc (Junior Member)
http://online.pubhtml5.com/vfof/guzu/#p=1

I've tried all the methods I'm aware of for bulk downloading, but none of them have worked. I had to use the Chrome developer tools to find the source files and then click and save them one at a time. A bulk image downloader extension didn't work either; it wasn't able to see any of the image files. Any ideas?
#2 | halvar (Journeyman)
In cases like this I use wget, a command line tool available for practically every operating system.

Using bash it can be invoked in a for loop:

Code:
for x in $(seq 1 148); do wget http://online.pubhtml5.com/vfof/guzu/files/large/$x.jpg; done

This is the same as running:

Code:
wget http://online.pubhtml5.com/vfof/guzu/files/large/1.jpg
wget http://online.pubhtml5.com/vfof/guzu/files/large/2.jpg
wget http://online.pubhtml5.com/vfof/guzu/files/large/3.jpg
....
wget http://online.pubhtml5.com/vfof/guzu/files/large/148.jpg

Maybe there are also tools with a nice GUI out there to do this.

Addendum: I forgot the -i param. You can create a text file containing URLs and download them with

Code:
wget -i myurls.txt
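A minimal sketch of that -i variant (the file name myurls.txt is just an example): the same seq loop can write all 148 URLs into a file first, and wget then works through the list in one go.

Code:
# build a list of all 148 image URLs, one per line
for x in $(seq 1 148); do echo "http://online.pubhtml5.com/vfof/guzu/files/large/$x.jpg"; done > myurls.txt
# download everything listed in the file
wget -i myurls.txt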
#4 | halvar (Journeyman)
On a Mac it is even easier, since you have a bash terminal and curl!

Run Terminal and type

Code:
curl --version

If it is installed then execute

Code:
for x in $(seq 1 148); do curl -o $x.jpg http://online.pubhtml5.com/vfof/guzu/files/large/$x.jpg; done
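As a side note, curl can also expand a numeric range on its own, without a shell loop: a square-bracket range in the URL is globbed, and "#1" in the -o option is replaced by the current value of that range. A minimal sketch of that variant:

Code:
# curl's built-in URL globbing: [1-148] expands to 1, 2, ..., 148
# "#1" in the -o option stands for the current glob value, so files are saved as 1.jpg, 2.jpg, ...
curl -o "#1.jpg" "http://online.pubhtml5.com/vfof/guzu/files/large/[1-148].jpg"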
#5 | saint825xtc (Junior Member)
That worked great, Halvar!

Just so I can educate myself, would you mind translating that code a bit? It looks like you are telling it, "whenever you see x after $, write a sequential number starting at 1 and ending at 148." Is "do curl -o" one command, or is it two different parts (i.e. "do curl" and "-o")? Then you use the URL but substitute the page number with $x.

I know some basic HTML/CSS, but that's about the extent of my code knowledge. Would there be a way to do this if each image had a name and not a number?
#6 | halvar (Journeyman)
Code:
for x in $(seq 1 148); do

The loop runs once for every number from 1 to 148. This is because

Code:
seq 1 148

simply prints the numbers 1 to 148, and the loop steps through them one by one. $x is a placeholder for the current value.

Code:
curl -o $x.jpg http://online.pubhtml5.com/vfof/guzu/files/large/$x.jpg;

This is the loop body; "do" and "done" mark its beginning and end, so "do" and "curl -o" are not one command. curl is the command being run, and "-o $x.jpg" is a curl option meaning save this as 1.jpg, 2.jpg and so on.

A more simple example printing the numbers 3 to 5:

Code:
for foo in $(seq 3 5); do echo $foo; done;

With -w, seq pads the numbers with leading zeros (03, 04, ... 10):

Code:
for foo in $(seq -w 3 10); do echo $foo; done;

The loop does not have to run over numbers at all; it can also step through a plain list of words*:

Code:
for v in foo bar "foo bar"; do echo ${v}; done;

* Sometimes the placeholder has to be in curly braces ${v}

Just toy around a bit to get the hang of it. I am not a bash scripting expert myself, but I often find it rather useful. Here is a very good documentation: http://tldp.org/LDP/Bash-Beginners-Guide/html/
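Picking up that word-list form: it also covers the question from #5 about names instead of numbers; you simply list the names in the loop. A minimal sketch (the names and URL below are made up, purely to show the pattern):

Code:
# iterate over a hand-written list of file names instead of seq output
for name in cover intro chapter_one; do
    # save each download under the same name it has on the server
    curl -o "$name.jpg" "http://example.com/files/large/$name.jpg"
done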
#7 | saint825xtc (Junior Member)
That is super helpful, Halvar. I appreciate it.

Do you have any ideas on how to pull the image files from these two sites? I couldn't find the source location of the images.

http://magzus.com/read/penthouse_let...pril_2017_usa/

This one has an option for "reading online" which opens a frame and lets you flip through the pages.

On this other site I was able to find the source, but the files don't appear to be in sequential order and the file name structure seems to change, so I'm not sure how to handle it either.

http://openbook.hbgusa.com/openbook/9781455531356
#8 | halvar (Journeyman)
On the second site only the first couple of pages are available without buying, and what is visible is not images but text. This does not surprise me; sites usually know how to protect their stuff.