Why do some websites use TLS_RSA ciphers but are not flagged as vulnerable to the ROBOT attack?
Please refer to this: https://discussions.qualys.com/thread/17842-requirements-for-being-labeled-as-vulnerable-to-robot#comment-39243
Can you please provide some additional information on how QID 38695 detects the ROBOT vulnerability? The results returned include 'ROBOT vulnerability found with a weak oracle' even after we patched our infrastructure according to the robotattack.org recommendations.
We have patched our NetScaler and confirmed the version is not vulnerable, but the scan still shows the website as vulnerable to ROBOT. How does SSL Labs test for the RSA implementation bug?
Our test is similar to the original robotattack.org script: GitHub - robotattackorg/robot-detect (detection script for the ROBOT vulnerability). You can PM the hostname to me for investigation.
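For anyone curious what such a check actually sends: the core of a ROBOT probe is a Bleichenbacher-style padding oracle test. The client encrypts several deliberately malformed PKCS#1 v1.5 pre-master secrets in the ClientKeyExchange and compares how the server reacts to each. Below is a minimal sketch of the kind of test vectors involved; the byte layout follows PKCS#1 v1.5 and TLS, but the exact vector set in robot-detect or the SSL Labs scanner may differ.

```python
import os

def pkcs1_v15_vectors(modulus_bytes: int, tls_version: bytes = b"\x03\x03"):
    """Build one well-formed and several malformed PKCS#1 v1.5
    pre-master-secret encodings, as a ROBOT-style probe would.
    `modulus_bytes` is the RSA modulus length in bytes (e.g. 256 for 2048-bit).
    Vector names/choices here are illustrative, not the scanner's exact set."""
    pms_len = 48                               # TLS pre-master secret is 48 bytes
    pad_len = modulus_bytes - 3 - pms_len      # room for 00 02 <pad> 00
    # PKCS#1 v1.5 requires non-zero random padding bytes
    pad = bytes(b or 1 for b in os.urandom(pad_len))
    pms = tls_version + os.urandom(46)         # 2-byte version + 46 random bytes

    return {
        # correctly formatted: 00 02 <non-zero pad> 00 <48-byte PMS>
        "correct": b"\x00\x02" + pad + b"\x00" + pms,
        # wrong leading block-type bytes
        "wrong_first_bytes": b"\x41\x17" + pad + b"\x00" + pms,
        # no 0x00 separator after the padding
        "no_null_separator": b"\x00\x02" + pad + b"\x11" + pms,
        # wrong TLS version inside the pre-master secret
        "wrong_version": b"\x00\x02" + pad + b"\x00\x02\x00" + os.urandom(46),
        # 0x00 separator too early: "message" after it is longer than 48 bytes
        "early_null": b"\x00\x02\x00" + pad + pms,
    }
```

Each vector is RSA-encrypted and sent in a separate handshake; if the server's alerts or timing differ between the correct and malformed cases, it is acting as a padding oracle and gets flagged.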
I downloaded the script from GitHub and tested it myself. The results show our servers are not vulnerable even though we have TLS_RSA ciphers, but SSL Labs still says our sites are vulnerable to the attack.
I also want to share that when SNI is used, the test reports the site as vulnerable.
Has anyone else seen similar results?
The robotattack.org test doesn't support SNI, whereas the SSL Labs test does.
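That difference matters because SNI is just an extension in the ClientHello: a scanner that omits it is routed to the server's default virtual host, which may sit on a different (possibly patched or unpatched) TLS stack than the named site. A small sketch of what "supporting SNI" means on the wire, encoding the server_name extension per RFC 6066:

```python
import struct

def sni_extension(hostname: str) -> bytes:
    """Encode a TLS server_name extension (RFC 6066) for a ClientHello.
    A tester that never emits these bytes is what 'no SNI support' means:
    the server then answers with its default virtual host/certificate."""
    name = hostname.encode("ascii")
    # ServerName entry: type 0 (host_name) + 2-byte name length + name
    entry = b"\x00" + struct.pack(">H", len(name)) + name
    # ServerNameList: 2-byte list length + entries
    server_name_list = struct.pack(">H", len(entry)) + entry
    # Extension header: type 0x0000 (server_name) + 2-byte body length
    return struct.pack(">HH", 0, len(server_name_list)) + server_name_list
```

In practice you rarely build this by hand: in Python's `ssl` module, passing `server_hostname=` to `SSLContext.wrap_socket()` sends the extension for you, and comparing a handshake with and without it can reproduce the discrepancy between the two test tools.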