Change log:
- Fix SubString error in MakeFilename if there is no ‘/’ in the filename format or save folder.
- Add blacklist tag support.
- The blacklisted post is still shown in the Main Tab, but with a different background color; the color can be configured in the Settings Tab.
- Blacklisted posts are skipped in Full Batch mode; see the Log tab for more details.
- Separate each blacklisted tag with a space.
- Add option to replace empty filename format.
- Add check if the file_url is empty.
- Fix query url generator.
- Add application icon.
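The SubString fix in the first item can be sketched as follows. This is a minimal illustration in Python (DD3 itself is a .NET application) with hypothetical names; the point is that taking everything after the last '/' must not assume the separator exists:

```python
def make_filename(save_folder: str, name_format: str) -> str:
    # Hypothetical sketch of the MakeFilename fix: str.rfind returns -1
    # when '/' is absent, and -1 + 1 == 0 keeps the whole string, so the
    # slice is safe either way. A raw Substring(IndexOf('/')) in .NET
    # would throw when the separator is missing.
    basename = name_format[name_format.rfind("/") + 1:]
    return save_folder.rstrip("/") + "/" + basename
```

With no '/' anywhere, `make_filename("dl", "%id%")` still yields `dl/%id%` instead of raising.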
Download link here, tags.xml (Sankaku Complex – 20120429) here, source code on GitHub, donation link in the sidebar :).
Why isn’t the picture preview working?
I tried to search from yande.re …
The thumbnails? Check the Load Preview option in the Danbooru Listing group box.
Thanks, didn’t see the checkbox there XD
I’m not sure why, but the application keeps crashing when trying to batch download 25k+ posts from sankaku/danbooru. I’m currently using %id% %character% [%artist%] (%copyright%) as my filename format with a 240-250 character length.
What does the crash log say? Or you could try downloading in smaller batches.
It’s the “program has stopped working” window, which doesn’t have an option to view the error log. I’m downloading with a 230 character length and so far it’s working fine.
I’m having similar problems
http://i.imgur.com/BCaTS.png
Defaults all around mostly.
Login auth only for Danbooru provider.
%searchtag%%artist%%copyright%%rating%-%character%-%md5%
rename .jpeg to .jpg
Doing a batch job for search query soejima_shigenori with a 1000 limit and all providers selected.
Sometimes it crashes on Danbooru, sometimes on Sankaku, sometimes on Konachan.
The same crash log as in my last comment. This time it crashed while downloading the list from fairygarden, having had no problems with any of the other providers before that (including danbooru, sankaku, and konachan). However, I am now excluding thedoujin.com and 3dbooru because they seem to take a long time and produce no results.
This time I ran the batch query with DD3 set to high priority and kept it in focus.
I just went through a batch of about a dozen different queries for different artists without any problems.
Same settings as previous comments and searching all providers excluding: TheDoujin.com, 3dbooru, and fairygarden.
Danbooru, Sankaku, and Konachan don’t seem to cause crashes anymore.
>> http://fairygarden.no-ip.org/shimmie2, http://behoimi.org
Unable to connect; the website is probably down.
>> http://thedoujin.com
That one has malformed XML, which I don’t know how to fix on my end; the workaround is to download the list XML, modify it, and load it manually.
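If you do edit the list XML by hand, one common cause of "malformed XML" from booru APIs is unescaped ampersands in tag names or URLs. This is only an assumption about the breakage, not DD3's own code; a small Python sketch that escapes bare `&` characters before loading:

```python
import re

def escape_bare_ampersands(xml_text: str) -> str:
    # Replace any '&' that does not already start an entity such as
    # &amp; or &#39;, so a strict XML parser can accept the document.
    return re.sub(r"&(?!#?\w+;)", "&amp;", xml_text)
```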
Nope. It crashed again.
Not even a dialog box is shown? Usually, if there is an unhandled exception, a dialog box from the .NET runtime will be shown containing the log.
I suspect there might be a memory problem. Can you check the memory usage? Also, try ticking the Abort on Error box.
If you’re talking about the paging file, I have it set at 500MB.
No, the actual memory usage for the application.
It’s consuming around 100MB with 100k batch job running.
The memory consumption is still OK. May I know what the query is, and after which file/post the application fails?
Try this version: http://www.mediafire.com/?to3fb0fg1nlcy7d
I have added proper logging.
These were my provider settings when the application kept on crashing:
Provider: sankaku complex
Hard limit: 1000
Default limit: 100000
Tag query: order:id
Filename format: %id% %character% [%artist%] (%copyright%)
Filename length: 250
UserAuth: tried with and without (got more crashes with auth enabled)
Now I’m trying the version you made for me with 100 as the hard limit and 20 as default limit. Thanks.
I forgot to mention also:
Abort on error is on
Empty tag replacement: tagme
Reduce the default limit to match the hard limit. I got a 400 Request Header Or Cookie Too Large error when doing a batch job.
Also, there is a bug involving the default limit and the hard limit. The default limit is used if the user did not supply a limit in Add Batch Download, and the hard limit caps any limit that exceeds it (including the default limit), so the query will be broken into multiple requests.
There is also a null reference bug triggered by the 400 error; for now, try reducing the limit.
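The limit interaction described here can be sketched roughly like this (illustrative Python with hypothetical names, not the actual DD3 logic):

```python
def plan_requests(user_limit, default_limit, hard_limit):
    # The default limit fills in when the user gives no limit; the hard
    # limit caps each request, so larger totals split into multiple pages.
    total = default_limit if user_limit is None else user_limit
    pages = []
    while total > 0:
        take = min(total, hard_limit)
        pages.append(take)
        total -= take
    return pages
```

For example, a default limit of 250 with a hard limit of 100 yields three requests of 100, 100, and 50.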
After trying all that, I still got the same error:
http://img834.imageshack.us/img834/1701/dderror.png
I posted the above comment before reading your latest comment, my bad. Going to try your suggestions now.
I reduced the limit to 10000 per job and I’m still getting the same crash.
Any chance of seeing zerochan.net support in the future? 😀
Maybe not? I don’t think their board software is used anywhere else besides their own site.
In the future… will it include deviantart?
nope
How do I add a blacklist? I can’t type in the blank next to “Tag Blacklist”.
Can I use the blacklist in normal mode?
Also, what does “Empty Tag Replacement” mean?
THX.
You can leave it empty if you don’t want to use it (no blacklist).
As for the empty tag replacement: let’s say you are using %artist% in the filename format, but the downloaded post doesn’t have any artist tag; the token will then be replaced with the text given in the empty tag replacement. The default value is an empty string.
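In other words, the substitution behaves roughly like this sketch (Python for illustration; the token names follow the formats shown in this thread, but the helper itself is hypothetical):

```python
def fill_format(fmt: str, tags: dict, empty_replacement: str = "") -> str:
    # Each %token% is replaced by the post's tag value; a missing or
    # empty tag falls back to the configured empty tag replacement.
    for token in ("id", "artist", "character", "copyright", "rating", "md5"):
        value = tags.get(token) or empty_replacement
        fmt = fmt.replace(f"%{token}%", value)
    return fmt
```

With Empty Tag Replacement set to tagme, a post with no artist tag renders %id% [%artist%] as, e.g., 123 [tagme].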
Oh… THX.
And I want to use the blacklist: I often download many pictures, then search for monochrome and delete those pictures.
I also have a question about Full Batch mode:
First, I start Full Batch mode and download a picture.
The next time I continue this task, the picture’s tags have changed (a tag was added or deleted).
Will the program download two copies of the same picture?
>> Will the program download two copies of the same picture?
Probably; it depends on the filename format. If you use tags in the filename, it will be downloaded again because the filename is different.
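The reason tag changes cause re-downloads is that a simple duplicate check only looks at the target filename, roughly as in this sketch (an illustration, not DD3’s code); a stable token such as %md5% in the filename format avoids the problem:

```python
import os

def already_downloaded(folder: str, filename: str) -> bool:
    # A filename-only duplicate check: if tags (and hence the rendered
    # filename) change between runs, the same post looks "new" again.
    return os.path.exists(os.path.join(folder, filename))
```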