The hacker's pathfinder dog: ReconDog, a website information detection and collection tool

Download and run Recon Dog. If you get a permission error when you try to run it, you can make the script executable first with chmod +x dog.py.
Enter the following command in the terminal to download it:
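A minimal sketch, assuming the tool still lives in the s0md3v/ReconDog repository on GitHub (substitute the actual repository URL if it has moved):

    git clone https://github.com/s0md3v/ReconDog.git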

After downloading the program, enter the following commands to change into the Recon Dog directory and list its contents:
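Something along these lines:

    cd ReconDog
    ls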

Now run the script with the following command:
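Depending on how Python is installed on your system, one of the following should work:

    python dog.py
    # or, on systems where only Python 3 is available
    python3 dog.py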

Software features

  • 1. Whois Lookup
  • 2. DNS Lookup + Cloudflare Detector
  • 3. Zone Transfer
  • 4. Port Scan
  • 5. HTTP Header Grabber
  • 6. Honeypot Detector
  • 7. Robots.txt Scanner
  • 8. Link Grabber
  • 9. IP Location Finder
  • 10. Traceroute
1. Whois information:
Let's first look at the first option, Whois. Whois is a database used to check whether a domain name has been registered and to look up the details of a registered domain, such as the domain owner, the registrar, and the registration and expiration dates. By querying the domain's Whois server, you can obtain the contact information of the domain owner. This step is not hugely significant on its own, but it is part of the groundwork before an engagement, so that you are not caught off guard if your lead suddenly asks about it.
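For comparison, the same lookup can be done manually from a Kali terminal; example.com below is only a placeholder domain:

    whois example.com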
2. DNS resolution:
The second option is DNS resolution. You can query a specified domain name to see which DNS servers it is resolved by; most domain names registered in China are resolved through providers such as Wanwang and Alibaba Cloud. DNS resolution is the process of converting a registered domain name into a public IP address, which is exactly what we want to find out; the corresponding reverse resolution maps an IP address back to a domain name. As for DDoS, you could try a denial of service against DNS; the results against the target itself are really not good, but it will also leave a large area unable to reach the Internet.
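A quick manual illustration of forward and reverse resolution with dig (the domain and address below are just placeholders):

    # forward resolution: domain name -> public IP address
    dig example.com A +short
    # reverse resolution: IP address -> domain name
    dig -x 8.8.8.8 +short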
3. DNS zone transfer:
DNS zone transfer is something most people have had little contact with and rarely hear about. Generally speaking, it means that a misconfigured DNS server can let the zone's resolution records be pulled by servers other than the legitimate DNS, effectively allowing a fake DNS to serve them. It is commonly used for hijacking and similar operations in public network environments, and the impact is usually fairly large. It is similar in spirit to the ARP hijacking performed with ettercap.
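One manual way to test for this misconfiguration is to request a full zone transfer (AXFR) from one of the domain's name servers; the names below are placeholders:

    # list the domain's name servers
    dig example.com NS +short
    # ask one of them for a full zone transfer (only succeeds if the server is misconfigured)
    dig axfr example.com @ns1.example.com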
4. Port Scan:
Since the tool calls a remote API to scan the specified domain, our own IP address is never exposed to the target, which amounts to borrowing a knife to kill. I tested this myself: scanning the target with nmap from Kali did not produce detailed results, but the port scan in this tool did. The output is not extremely detailed either, but it still lays the groundwork for the next step.
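For reference, the local nmap scan mentioned above would look roughly like this (unlike the API-based scan, this does expose your own IP to the target; the domain is a placeholder):

    # scan the 100 most common TCP ports and try to identify the services behind them
    nmap -sV --top-ports 100 example.com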
5. Web Fingerprint:
When it comes to fingerprints, think of a person's fingerprints: the service fingerprints we talk about are just as unique. Even if you use tools to clone a specified site, the fingerprint will not be identical. This module is mainly used to confirm the target of the attack, so that we do not go after the wrong host by mistake.
 
Kali Linux also has similar tools, and whatweb is one of them. It does almost the same job as this module, but don't forget that our protagonist works through an API, which further reduces the chance of exposing ourselves.
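The equivalent manual checks from Kali, grabbing the HTTP response headers with curl and fingerprinting the web stack with whatweb (the target is a placeholder):

    # fetch only the HTTP response headers
    curl -I https://example.com
    # identify the server, CMS, frameworks and other components
    whatweb example.com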
6. Honeypot detection:
To be honest, I did not study this module in detail, so I will not pass judgment on it; I only ran a simple test and realized I was being naive: with my current experience I cannot tell whether a target is a honeypot, so I won't go into detail here. Security-conscious readers may as well run this check on every penetration test, and if you do find a honeypot, don't forget to buy a lottery ticket.
7. Robots file enumeration:
robots.txt is the first file a search engine looks at when it visits a website. When a search spider visits a site, it first checks whether robots.txt exists in the site's root directory. If it exists, the crawler determines the scope of its visit according to the contents of the file; if it does not exist, every search spider can access all pages on the site that are not password protected. You can use the robots file to block search engines, and you can also use it to improve a site's ranking. By the same token, a default robots file is likely to expose the site's directories; if permissions on them are too loose, this can lead to unauthorized access, directory traversal, and similar vulnerabilities.
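You can also fetch and inspect a site's robots.txt manually (placeholder domain):

    curl https://example.com/robots.txt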
8. Link detection:
I think this module is more useful for developers: when the target site is in the middle of development, it is very likely that attention goes only to how the front end looks, while the links in between are neglected and end up returning a 404 status. This tool can help us find those broken links.
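A minimal manual check of a single link, printing only its HTTP status code (the URL is a placeholder):

    # prints 404 for a broken link, 200 for a live one
    curl -s -o /dev/null -w "%{http_code}\n" https://example.com/some/old/page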
9. IP location:
This converts an IP address into latitude and longitude. Usually we query the physical location directly from the IP address, but that only gives the approximate city (some databases are accurate down to a specific residential block), whereas this tool can be more precise. Alternatively, you can use the Chunzhen ("pure") IP database; you might as well try it on your own IP address first.
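A rough manual alternative is to query a public geolocation service; ipinfo.io is used here only as an example, and the coordinates it returns are approximate (8.8.8.8 is a placeholder address):

    curl https://ipinfo.io/8.8.8.8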
10. Traceroute:
Kali Linux also has a related route-tracing tool, traceroute, which is mainly used to see which routers the current host passes through on the way to a specified site, which firewalls, bridges and other devices are traversed, and which hop in the path takes the longest. It helps us pick the fastest route and reduce the latency of data in transit.
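The manual equivalent in Kali (placeholder host):

    # print each hop on the way to the target along with its round-trip times
    traceroute example.com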

postscript:

Having read this far, you probably think this tool is great, and that borrowing a knife to kill is cool. But I can only tell you that it is good if used properly. Think about it this way: if the server behind the API is compromised by a third party, then the information you collect through its API can be read by that third party, and what if the operator quietly keeps a copy of it himself? So it is worth reflecting on. The same goes for shared services such as HTTP proxy servers, shared FTP servers and the like: how do you know the owner of the server is not silently collecting your information in the background, or slipping you a Trojan under the banner of an update? That is a question left for you to ponder.
