- Add abort on error option for batch download as requested by Xemnarth.
- Fix batch download stop button.
- Add Pause/Resume batch download button.
- Change Target Framework to .NET Framework 4.
- Update DanbooruProviderList.xml.
17 thoughts on “danbooru downloader 20120229”
Please add a button to clear all batch jobs in the next version. Thanks.
Just dropping by to say thanks! This program really helps me out. Appreciate the application and the work you put into it!
The E621 downloader seems to be busted. It's only picking up the last half of the URLs, i.e. /data/blahblahblah as opposed to http://e621.net/data/blahblah
Found the problem, will be updated in next release.
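For reference, the relative-link problem described above can be handled by resolving each file path against the provider's base URL before downloading. The downloader itself is C#, so this is just a minimal Python sketch of the idea, not the actual fix:

```python
from urllib.parse import urljoin

def resolve_file_url(base_url: str, file_url: str) -> str:
    """Return an absolute URL; leaves already-absolute URLs untouched."""
    # urljoin resolves "/data/x.jpg" against the base, but passes
    # "http://..." through unchanged.
    return urljoin(base_url, file_url)

print(resolve_file_url("http://e621.net/", "/data/blahblahblah.jpg"))
```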
I have downloaded the latest version of your danbooru downloader. I entered my username and password in DanbooruProviderList.xml for the danbooru and 3dbooru logins, but I can't download from 3dbooru in full batch mode. Can you help me with this? Thanks in advance.
What is the error? If it's 403 Forbidden, have you checked the Pad User Agent option in the Settings tab? Mine is working just fine.
When I use full batch mode for downloading from 3dbooru, it always shows the same picture with a 'no' watermark.
Noted, I will update it on the next version.
What is error "403"? It happens when I put certain tags on danbooru; the other sites ("Sankaku Complex", "oreno.imouto.org") work fine. This only happens on danbooru, why?
403 means Forbidden. Danbooru requires login information, see their forum. You can add the login info by editing the danbooru section of the DanbooruProviderList.xml file with Notepad. See the readme for more detail.
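The edit described above amounts to filling in the login fields of the danbooru provider entry. The element names below are assumptions for illustration only; check the real DanbooruProviderList.xml that ships with the program for the exact schema:

```xml
<!-- hypothetical fragment; the actual element names may differ -->
<Provider>
  <Name>Danbooru</Name>
  <Url>http://danbooru.donmai.us</Url>
  <UserName>YOUR_USERNAME</UserName>
  <PasswordHash>YOUR_PASSWORD_HASH</PasswordHash>
</Provider>
```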
I don’t know if it’s intentional or I’m doing something wrong (-much- more likely) but –
I extended the default image limit download count to 1000 or more to make sure all images are downloaded. However, it seems that if the default image limit exceeds the actual number of images with that tag (e.g. the download limit is set to 1000, but there are only 425 images tagged “red hair”) no images will be downloaded at all.
In the end I was able to deal with this by changing the download limit to match the actual number of images with that tag (like changing it to 425 in the previous example), but I was wondering if I missed an obvious way to make sure to download all files without having to change limits.
Thank you very much for these programs, I’ve only recently tried the danbooru downloader, but have been using the pixiv downloader for quite some time.
Which provider do you use? I tried with danbooru: the 'red_hair' query returns 397 images on the website, but using the API it found 57490 pictures.
Anyway, it still downloads until the limit is reached. You can check the returned XML file by checking the log page; try to find this line:
[DoBatchJob] Downloading list: http://danbooru.donmai.us/post/index.xml?tags=red_hair&limit=100&page=3&login=YOUR_USERNAME&password_hash=PASSWORD_HASH
If you are using a different provider it should be similar, but without the login info.
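The log line above shows the shape of danbooru's post/index.xml query: tags, limit, page, and optional login parameters. Building it per page can be sketched like this (a Python illustration, not the actual C# code):

```python
from urllib.parse import urlencode

def build_list_url(base, tags, limit, page, login=None, password_hash=None):
    """Build a post/index.xml query URL like the one shown in the log."""
    params = {"tags": tags, "limit": limit, "page": page}
    if login and password_hash:  # other providers omit the login info
        params["login"] = login
        params["password_hash"] = password_hash
    return base + "/post/index.xml?" + urlencode(params)

url = build_list_url("http://danbooru.donmai.us", "red_hair", 100, 3)
```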
Sorry, I should have been more specific. I'm using danbooru, and the red hair was just a rather poor example; I know there are far more than just a few hundred images with that tag, haha, sorry for the confusion.

What I was trying to do is use a default limit well in excess of the number of images belonging to any tag, so I can be sure that all images with that tag will be downloaded without having to adjust the download limit. But it seems that if I set the download limit to more images than actually exist with that tag, it will download a single image and finish.

A real example: I set the image limit to 950, but there are only 220 images with this tag. I figured that even if I set the download limit well over the number of images that actually exist, all 220 images would be downloaded anyway, but only a single image is downloaded. If I go back and reduce the limit, everything works great and all 220 images are downloaded. I have no problem at all with downloading up to a certain limit unless the limit is higher than the number of images with a specific tag.

Like I said, I was able to get them all by just reducing the download limit, and I feel like an idiot for even bringing this up because it isn't even a problem; I was just wondering if I could leave it really high so I wouldn't need to adjust the limit anymore.
Found the problem; I will fix it in the next version. You can always get the latest source code on GitHub if you want to compile it yourself 🙂
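One robust way to handle the over-large-limit case described above is to keep requesting pages until the API returns an empty page, instead of trusting the limit to match the number of matching posts. A hypothetical sketch; `fetch_page` stands in for the actual HTTP/XML call and is not part of the real program:

```python
def download_all(fetch_page, limit, per_page=100):
    """Collect posts page by page, stopping at `limit` or when a page is empty."""
    posts = []
    page = 1
    while len(posts) < limit:
        batch = fetch_page(page, per_page)
        if not batch:  # fewer posts exist than the limit: stop cleanly
            break
        posts.extend(batch[: limit - len(posts)])
        page += 1
    return posts

# e.g. only 220 posts exist but the limit is 950: all 220 are
# collected, then the empty page 4 ends the loop.
```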
Yes, this time I've successfully compiled your source on my Mac. The .exe works fine on my Windows 7 64-bit virtual machine.
By the way, is there a solution for making an .app (Mac executable) with C#?
Very nifty tool! Great job from The Netherlands. ^__^
Excellent work. I will test it some more and report any issues (if any). Thanks again!