Hi there. When downloading in Full batch Mode, I only seem to be downloading 20 pages in total. Is there a way to increase the cap?
Have you keyed in the limit?
When doing a full batch job how do i set for unlimited on pages/images?
Currently there is no unlimited option; you can put a large enough limit in the batch job (e.g. 100000+).
Ok, but I got this error:
—————————
DataGridView Default Error Dialog
—————————
The following exception occurred in the DataGridView:
System.Reflection.TargetInvocationException: Property accessor 'ProviderListString' on object 'DanbooruDownloader3.Entity.DanbooruBatchJob' threw the following exception:'Object reference not set to an instance of an object.' ---> System.NullReferenceException: Object reference not set to an instance of an object.
at DanbooruDownloader3.Entity.DanbooruBatchJob.get_ProviderListString()
--- End of inner exception stack trace ---
at System.ComponentModel.ReflectPropertyDescriptor.GetValue(Object component)
at System.Windows.Forms.DataGridView.DataGridViewDataConnection.GetValue(Int32 boundColumnIndex, Int32 columnIndex, Int32 rowIndex)
To replace this default dialog please handle the DataError event.
—————————
OK
—————————
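As the dialog itself notes, the default error box only appears because the grid's DataError event is unhandled; handling it replaces the dialog, although it does not fix the underlying null ProviderListString. A minimal WinForms sketch of such a handler, illustrative only and not the downloader's actual code:

```csharp
using System;
using System.Windows.Forms;

class DataErrorDemo
{
    [STAThread]
    static void Main()
    {
        var grid = new DataGridView { Dock = DockStyle.Fill };

        // Handling DataError suppresses the "DataGridView Default Error Dialog";
        // here the binding exception is just logged and swallowed.
        grid.DataError += (sender, e) =>
        {
            Console.WriteLine("DataGridView error at row {0}, column {1}: {2}",
                              e.RowIndex, e.ColumnIndex, e.Exception.Message);
            e.ThrowException = false;
        };

        var form = new Form { Text = "DataError demo" };
        form.Controls.Add(grid);
        Application.Run(form);
    }
}
```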
Reduce the max limit? Can you give me the detailed query/screenshot of the Add Batch dialog?
This is what happens when I try to do a full batch:
* link edited, better remove the hash from the screenshot *
I have checked the xml: You can only search up to page 1000.
I think you mixed up the page with the number of pages to download; that option is for the starting page.
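To illustrate the distinction: the batch job's page option is where the list download starts, while the limit caps how much is fetched in total. A minimal sketch of how such a request loop might look, using the list URL pattern visible in the chan.sankaku log later in this thread; the names (startPage, postsPerPage, maxPosts) are hypothetical, not the downloader's actual code:

```csharp
using System;
using System.Collections.Generic;

class ListRequestSketch
{
    // Sketch only: a starting page plus a total limit translated into successive list requests.
    static IEnumerable<string> BuildListUrls(string tags, int startPage, int postsPerPage, int maxPosts)
    {
        int requested = 0;
        int page = startPage;                 // "page" = starting page, not number of pages
        while (requested < maxPosts)
        {
            yield return string.Format(
                "http://chan.sankakucomplex.com/post/index.json?tags={0}&limit={1}&page={2}",
                Uri.EscapeDataString(tags), postsPerPage, page);
            requested += postsPerPage;
            ++page;                           // move to the next page each request
        }
    }

    static void Main()
    {
        // Start at page 1, 100 posts per page, stop after roughly 100000 posts.
        foreach (var url in BuildListUrls("order:id", 1, 100, 100000))
        {
            Console.WriteLine(url);
            break;                            // print just the first URL for the demo
        }
    }
}
```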
I’m getting this on danbooru.
—————————
Download List
—————————
Cannot load tags.xml
—————————
OK
—————————
Download the tags.xml from the previous post: http://nandaka.wordpress.com/2012/04/05/danbooru-downloader-20120405/
I’m trying to batch download using this version but I’m always getting this error:
http://img24.imageshack.us/img24/7222/errorpyq.png
Any idea what it means?
Never got that error; try updating your .NET Framework 4.
I updated, but I’m getting the same error. Are you referring to a specific update?
Nope. Can you give the details, including the search query and the provider? Does it happen for every query/provider? Does it also happen in the previous version? It is probably caused by a bad xml message (e.g. containing an illegal character).
I’m using Windows XP and Windows 7 (all 32-bit) and it is working ok.
I get the same error for all queries/providers. Error: The type initializer for 'DanbooruDownloader3.DAO.DanbooruPostDao' threw an exception.
Could it be because I’m using 64-bit Windows 7?
Try this version: http://www.mediafire.com/download.php?2j0haqiuyo1b9ad
I changed the target platform to AnyCPU.
I’m using WIN7 64-bit and I have no such error. Program runs @ 100% for me.
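Side note: on .NET 4 you can verify at runtime whether a build actually runs as a 32-bit or 64-bit process, which helps when checking whether the AnyCPU change took effect on 64-bit Windows 7. A minimal sketch using the standard Environment properties:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // Environment.Is64BitOperatingSystem and Is64BitProcess are part of .NET 4.
        Console.WriteLine("64-bit OS:      " + Environment.Is64BitOperatingSystem);
        Console.WriteLine("64-bit process: " + Environment.Is64BitProcess);
    }
}
```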
I tried the AnyCPU version, but I’m still getting the same error…
Older version (2012.03.30) is working fine though.
Try this one: http://www.mediafire.com/?7q2lpxqv2ik02hr
This one is based on the diff from the latest version back to the 20120330 version, plus more bug fixes.
[DoBatchJob] Downloading list: http://chan.sankakucomplex.com/post/index.json?tags=order:id&limit=100&page=1087
[DoBatchJob] Error: Cannot load tags.xml
Stack Trace:
at DanbooruDownloader3.DAO.DanbooruTagsDao..ctor(String xmlTagFile)
at DanbooruDownloader3.DAO.DanbooruTagsDao.get_Instance()
at DanbooruDownloader3.DAO.DanbooruPostDao.ProcessJson(DanbooruPost& post, String& json, StreamReader reader, String& tmp)
at DanbooruDownloader3.DAO.DanbooruPostDao.ReadJSON(Stream input)
at DanbooruDownloader3.FormMain.DoBatchJob(BindingList`1 batchJob)
Query: order:id
And make sure tags.xml is in the same folder as DanbooruDownloader3.exe.
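A minimal sketch of that check, assuming tags.xml is simply expected next to the executable; the helper here is illustrative, not the downloader's actual code:

```csharp
using System;
using System.IO;

class TagsXmlCheck
{
    // Looks for tags.xml in the application folder, i.e. next to DanbooruDownloader3.exe.
    // A missing file would explain the "Cannot load tags.xml" error above.
    static string FindTagsXml()
    {
        string path = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "tags.xml");
        return File.Exists(path) ? path : null;
    }

    static void Main()
    {
        Console.WriteLine(FindTagsXml() ?? "tags.xml not found next to the executable");
    }
}
```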
where can I get tags.xml from?
Nevermind, I found it. Now this version is working! Thanks for your help.
<UserName>abcde</UserName>
<Password>abcde</Password>
<UseAuth>false</UseAuth>
<PasswordSalt>choujin-steiner--%PASSWORD%--</PasswordSalt> <<< I don't know this line
For Danbooru:
<UserName>your username</UserName>
<Password>your password</Password>
<UseAuth>true</UseAuth>
<PasswordSalt>choujin-steiner--%PASSWORD%--</PasswordSalt>
You might want to read this for the 503: http://danbooru.donmai.us/forum/show/24011
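For context, Danbooru's classic API logs in with a password hash rather than the raw password: the %PASSWORD% placeholder in the salt is presumably replaced with your plain-text password and the whole salted string is SHA-1 hashed. A minimal sketch of that computation, illustrative only and not the downloader's own code:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class DanbooruPasswordHash
{
    // Danbooru's classic API expects password_hash = SHA1("choujin-steiner--<password>--").
    static string ComputeHash(string salt, string password)
    {
        string salted = salt.Replace("%PASSWORD%", password);
        using (var sha1 = SHA1.Create())
        {
            byte[] digest = sha1.ComputeHash(Encoding.UTF8.GetBytes(salted));
            var sb = new StringBuilder();
            foreach (byte b in digest)
                sb.Append(b.ToString("x2"));   // lowercase hex digest
            return sb.ToString();
        }
    }

    static void Main()
    {
        Console.WriteLine(ComputeHash("choujin-steiner--%PASSWORD%--", "your password"));
    }
}
```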
I have a question about batch download: why did it skip so many? If a tag has 15000 results, it only downloads 7500 and skips 7500. Maybe it is due to the connection? Thanks for the reply.
Which provider did you use and what is the query? Sometimes the result count from the danbooru API differs from the web search.
It is chan.sankaku.
And I don't know why the old version of Danbooru Downloader could download from danbooru easily; now, with the newest version, I can't download from danbooru (503).
For the Danbooru 503 and this one, you may want to check the password and the password-salt in the provider xml with a text editor.
Sankaku Complex doesn't allow you to fetch xml, only json. With a JSON request the total count is not given by the server, so I can only keep fetching the next page until it returns an empty or identical list, then stop. As for the different count, I don't know; this program only parses the xml/json returned by the server and uses that as the list.
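A minimal sketch of that stopping rule, assuming a hypothetical fetchPage delegate that returns the post IDs of one JSON page; the names are illustrative, not the downloader's actual code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class JsonPagingSketch
{
    // Keep requesting the next page until the server returns an empty list
    // or the same list as the previous page, since JSON responses carry no total count.
    static List<int> DownloadAll(Func<int, List<int>> fetchPage, int startPage)
    {
        var all = new List<int>();
        List<int> previous = null;

        for (int page = startPage; ; ++page)
        {
            List<int> current = fetchPage(page);

            if (current.Count == 0)
                break;                                   // empty page: no more posts
            if (previous != null && current.SequenceEqual(previous))
                break;                                   // identical page: server stopped advancing

            all.AddRange(current);
            previous = current;
        }
        return all;
    }

    static void Main()
    {
        // Fake three pages of post IDs to exercise the loop.
        var pages = new[]
        {
            new List<int> { 1, 2, 3 },
            new List<int> { 4, 5, 6 },
            new List<int>(),
        };
        var result = DownloadAll(p => p <= pages.Length ? pages[p - 1] : new List<int>(), 1);
        Console.WriteLine(string.Join(", ", result));    // 1, 2, 3, 4, 5, 6
    }
}
```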