I've seen this question posted before... however, the earlier answers didn't cover all of my questions.
To set the stage: our enterprise has an online product which, to Qualys, has many, many links. However, the vast majority of these are auto-generated (e.g., mysite.com/mypage.aspx?ID=826582&anotherValue=report).
I currently have my scanning broken out into two web applications, using whitelists and blacklists to divide the site between them. The problem is that one of those .aspx pages alone is listed as having over 8,000 links, and the entire site at approximately 30,000.
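One way to keep those auto-generated pages from eating the link budget is a regex-style exclusion entry rather than enumerating URLs. The snippet below is a minimal sketch of the idea, not the Qualys syntax itself: it assumes a pattern that matches any request to `mypage.aspx` carrying a numeric `ID` parameter (the example URL from above), and shows which links such a rule would exclude from the crawl scope.

```python
import re

# Assumed pattern: exclude mypage.aspx requests whose query string
# carries an auto-generated numeric ID. The page/parameter names are
# taken from the example URL in this post.
AUTOGEN_LINK = re.compile(r"/mypage\.aspx\?.*\bID=\d+", re.IGNORECASE)

urls = [
    "https://mysite.com/mypage.aspx?ID=826582&anotherValue=report",
    "https://mysite.com/mypage.aspx?ID=13&anotherValue=summary",
    "https://mysite.com/about.aspx",
]

# Split the link list the way a blacklist rule would: matching links
# are excluded, everything else stays in scope.
excluded = [u for u in urls if AUTOGEN_LINK.search(u)]
included = [u for u in urls if not AUTOGEN_LINK.search(u)]
print(excluded)  # the two auto-generated ID links
print(included)  # ['https://mysite.com/about.aspx']
```

If the WAS blacklist accepts regular expressions in your subscription, a single entry like this can stand in for the 8,000+ generated links, leaving the budget for pages that actually differ.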
I understand that Qualys WAS scans use randomization to ensure the entire site is eventually covered. However, my scans keep reporting vulnerabilities as "New" even though they were found before; they simply weren't in the immediately preceding scan because the link limit was reached and that section of the site was skipped.
<personal-opinion>My feeling is that the 8,000-link maximum and the 24-hour time limit are the two most limiting factors of the entire QualysGuard platform. Why should I not be able to run a scan that takes three days? In-house, I can schedule as big a backup job as I want, and if that job happens to still be running when its next run comes due, the scheduled run is simply skipped. Nothing blows up.</personal-opinion>
Anyhow, feel free to ignore my opinion; I'm expressing it because I am a paying customer. That said, I would very much appreciate guidance on how best to set up my web application scans. Aside from my small beef above, I very much enjoy working within the QualysGuard platform, and it is a fantastic product.