Google and Beyond: Advanced Search Engine Hacking

Published 2017-03-13 in Hubei

GOOLAG SCANNER
- Goolag Scanner enables everyone to audit his or her own web site via Google.
- It uses a single XML-based configuration file for its settings.

[Screenshot of Goolag Scanner]

SITEDIGGER
- An automated Google hacking tool from Foundstone.
- Uses the Google API and the Google Hacking Database.
- SiteDigger searches Google's cache to look for vulnerabilities, errors, configuration issues, proprietary information, and interesting security nuggets on websites.

[Screenshot of SiteDigger]

Countermeasures
- Keep sensitive data off the web!
- Do not display detailed error messages.
- Do not allow directory browsing.
- Perform periodic Google assessments.
- Update robots.txt. (For examples and suggestions on using a robots.txt file, see )
- Use meta-tags: NOARCHIVE.
- Beware of advertising sensitive paths: a line such as "Disallow: /remove.html" tells attackers exactly where to look. This is bad!

How To Protect Your Websites From Google Hackers
- Use a robots.txt file to prevent Google and other search engines from crawling parts of your site that should not be crawled.

ROBOTS.TXT Examples
This example allows all robots to visit all files, because the wildcard * matches all robots and nothing is disallowed:

    User-agent: *
    Disallow:

This example keeps all robots out:

    User-agent: *
    Disallow: /

This example tells all crawlers not to enter four directories of a website:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /images/
    Disallow: /tmp/
    Disallow: /private/

Robots.txt (cont.)
This example tells a specific crawler not to enter one specific directory:

    User-agent: BadBot  # replace BadBot with the actual user-agent of the bot
    Disallow: /private/

This example tells all crawlers not to enter one specific file; note that all other files in the specified directory will still be processed:

    User-agent: *
    Disallow: /directory/file.html

This example demonstrates how comments can be used; they appear after a # symbol at the start of a line or after a directive:

    User-agent: *  # match all bots
    Disallow: /    # keep them out

Few Interesting Websites
- Archive of websites (the Wayback Machine).
- Find out when your email gets read, retract, Certi
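The SiteDigger workflow described above (take Google Hacking Database signatures, scope them to a single site with the site: operator, submit them via the Google API) can be sketched in a few lines. This is an illustrative reconstruction, not Foundstone's actual code: the signature strings and the build_queries helper are hypothetical examples, and the real GHDB contains thousands of entries.

```python
# Hypothetical sketch of a SiteDigger-style query builder: each raw
# GHDB signature is restricted to one domain with the site: operator.
# Signatures and names here are illustrative, not SiteDigger's real data.
GHDB_SIGNATURES = [
    'intitle:"Index of" passwd',
    'filetype:log inurl:password',
    'intitle:"Apache Status" "Apache Server Status for"',
]

def build_queries(domain, signatures):
    """Prefix each raw GHDB signature with a site: restriction."""
    return [f"site:{domain} {sig}" for sig in signatures]

for query in build_queries("example.com", GHDB_SIGNATURES):
    print(query)  # e.g. site:example.com intitle:"Index of" passwd
```

A real assessment tool would then submit each query through a search API and flag any hits for manual review; that submission step is omitted here.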
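The robots.txt records above can be checked programmatically — a minimal sketch using Python's standard urllib.robotparser to show how a compliant crawler interprets them. The paths and the "BadBot" user-agent come from the examples above; "GoodBot" is a hypothetical well-behaved crawler added for contrast.

```python
# Sketch: how a compliant crawler interprets the robots.txt records above,
# using Python's standard urllib.robotparser.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: BadBot
Disallow: /private/

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /tmp/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())
rp.modified()  # mark the rules as loaded so can_fetch() does not default to False

# The wildcard record blocks generic crawlers from the four listed directories:
print(rp.can_fetch("GoodBot", "/cgi-bin/config.pl"))  # False
print(rp.can_fetch("GoodBot", "/index.html"))         # True

# BadBot matches its own record, so only /private/ applies to it:
print(rp.can_fetch("BadBot", "/images/logo.png"))     # True
print(rp.can_fetch("BadBot", "/private/data.html"))   # False
```

Note the security implication visible even in this sketch: robots.txt is advisory. A malicious crawler simply ignores it, and the Disallow lines double as a map of the directories you least want indexed.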
