Unusual findings from front-end monitoring: a network security story

Foreword

  Recently, our front-end exception monitoring system surfaced some unusual reports. We analyzed them, learned a few things, and wrote this post to share the experience.

The anomalies

  One morning we opened the monitoring system and found that at about 1:00 AM the front-end test environment had reported two exceptions. The error was caused by failing to read the URL parameters: the normal address should be www.xx.com?a=2&b=1, but the address actually accessed was www.xx.com%3Fa%3D2%26b%3D1. The path had clearly been percent-encoded, so the program could not read the parameters.
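  The effect is easy to reproduce with Python's urllib (the address below reuses the example values): because the ?, = and & are percent-encoded, the query-string parser finds no parameters until the URL is decoded first.

```python
from urllib.parse import parse_qs, unquote, urlparse

# The percent-encoded address our monitoring reported (example values).
raw = "https://www.xx.com%3Fa%3D2%26b%3D1"

# The '?' is encoded away, so the parser sees no query string at all.
parsed = parse_qs(urlparse(raw).query)            # {}

# After decoding, the parameters come back.
decoded = parse_qs(urlparse(unquote(raw)).query)  # {'a': ['2'], 'b': ['1']}
```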

  The two abnormal access paths were not identical. After decoding, both addresses could be visited again and opened normally, with all parameters correct. These URLs are entry points we configure for redirects, and our own logic never encodes them, so it was puzzling why such requests existed; our first guess was that someone had manually modified the addresses and revisited them. We then extracted the parameters and queried the associated records, and found that the requests came from a colleague's test account. But when we asked him privately, he had done nothing at all at 1:00 AM, and we also ruled out someone else using his phone. Next we looked up the data behind the parameters and found it corresponded to records created more than two months earlier, during the project's testing phase, which was even stranger: that version had not been tested again since, and the colleague had not visited those entry pages in a long time. It looked like someone was hitting our pages, but then why were the few parameters so accurate and correct?

  Someone else must have made these requests, because everything about them looked unusual:

    1. The visits happened at 1:00 AM.

    2. The access paths were percent-encoded.

    3. The data belonged to a test account of a colleague who had not visited the pages.

    4. The data behind the addresses was generated more than two months ago; if a third party had captured the access records back then, a delayed replay of the most recent batch would fit that behavior.

 

  The other reported information looked even stranger: the browser version was unknown, and the User-Agent was just Go-http-client/1.1 — it looked like a crawler script making the requests. Speaking of crawlers, a crawler usually pulls data from known interfaces to harvest other people's information, or keeps traversing different paths looking for accessible routes (hidden back doors). Path traversal is easy to spot: if it is happening, the logs show a large number of 404s and accesses to strange paths. What we encountered was different: someone had intercepted our web requests earlier, collected them, and then visited them again at a later time. This kind of access is called a replay attack.
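  As a rough illustration of the log signal just described: a path-traversal scanner leaves a trail of 404s from one IP, which a replay attacker does not. A minimal sketch for spotting that pattern (the (ip, status) entry format and the threshold are assumptions, not our real log pipeline):

```python
from collections import Counter

def suspected_scanners(entries, min_404s=10):
    """Return IPs whose 404 count suggests path-traversal scanning.

    `entries` is a list of (client_ip, status_code) tuples; in practice
    these would be parsed from nginx/Apache access-log lines.
    """
    counts = Counter(ip for ip, status in entries if status == 404)
    return {ip for ip, n in counts.items() if n >= min_404s}
```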

  We queried the server logs for the IPs and other related information, and found several IPs performing this kind of access attack from time to time. The other side was clearly cautious and never sent requests in large batches. We also found more abnormal behavior: for example, after first visiting a page path such as www.x.com/page, they would request the same path again using the POST method.

  All of the abnormal accesses targeted test-environment addresses over HTTP, while our production environment uses HTTPS; the other party did not seem able to capture HTTPS requests. Querying those IPs against the production environment did turn up one record, but that single visit was not over HTTPS (sometimes our own developers manually change https to http when accessing a page). The conclusion: the eavesdropping party is not specifically tied to our project environment; they simply captured nothing from production because it uses HTTPS.

  Initially I thought only one colleague's phone had been eavesdropped, but once requests tied to a second colleague appeared, it became more likely that the interception happened at the router layer. One of the two colleagues uses Android and the other uses an iPhone, so it looks like all devices could be affected. But if the problem were the corporate router, why have we so far only found replayed visits from these two colleagues, when other colleagues also visit these pages, and not infrequently? What exactly the interception strategy is, and where the interception happens, I am at a bit of a loss to say. We later asked the second colleague and learned that his visits were probably more than 20 days ago, while the crawler's accesses were all concentrated in the last few days. Everything indicates the attacker did indeed collect traffic for some time beforehand, and only in these two days began replaying it in a concentrated burst.

  By querying the IPs we found that the crawler's servers are inside the country, though the other party may be hiding their real IP. If their behavior had harmed the company, we might have been able to investigate further and identify who owns those servers. But judging from the visits and the pattern of the attacks, they did us almost no harm; it does not even feel like a targeted attack against us. It looks more like one of those free programs floating around the Internet, collecting whatever it can.

 

Security issues

  Now let's talk about security. When we visit a URL we often carry parameters; for example, an H5 page embedded in an app may need to carry an identifying token. If we do not use HTTPS, a hacker listening in the middle layer can capture our requests and read the data; and once the other party has the token, they can obtain even more private information. This is the security hole that makes replay attacks possible.

  So what can we do to defend against replay attacks? Two methods are commonly used:

  (1) add a random number (nonce) that is used only once; this gives protection, but requires extra storage for the history of values already used;

  (2) add a timestamp; no extra information needs to be stored, but the clocks must be synchronized, and exact synchronization cannot be achieved in every case. In short, the common idea is to attach something to the request that lets the server decide whether it has expired.
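  A minimal sketch of the two defenses combined (the in-memory store and the 60-second window are assumptions; in production the nonce history would live in Redis or a database with a TTL):

```python
import time

seen_nonces = set()  # history of nonces already used (assumed in-memory store)
MAX_SKEW = 60        # seconds a request may differ from server time (assumed window)

def is_replay(nonce, timestamp, now=None):
    """Return True if the request should be rejected as a replay."""
    now = time.time() if now is None else now
    # Timestamp check: anything outside the allowed window has "expired".
    if abs(now - timestamp) > MAX_SKEW:
        return True
    # Nonce check: a value seen before means the request was replayed.
    if nonce in seen_nonces:
        return True
    seen_nonces.add(nonce)
    return False
```

  Note the two checks cover for each other: the timestamp bounds how long nonces must be remembered, and the nonce catches replays that arrive inside the time window.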

 

Why HTTPS guarantees safety

  Many articles online already explain this, so I won't go through everything; simply put, there are two points. HTTPS uses asymmetric encryption together with symmetric encryption to keep the data secure, and it uses digital certificates for two-way identity verification, which guarantees you will not be phished.

 

Why packet-capture tools can capture HTTPS

  I already knew that a packet-capture tool, given a certain configuration, can capture HTTPS requests, so I used to wonder: doesn't that make HTTPS insecure, since capture tools can intercept it? First we need to understand how this works before we can analyze further.

 

  The principle behind capturing HTTPS is that the packet-capture tool acts as a man-in-the-middle proxy. It establishes a trusted connection with the client, then sends and receives requests to the destination server on the client's behalf. Playing the middleman this way, the capture tool sees all the data while the client can still be used normally.

The following is the process by which a packet-capture tool (Charles, in this example) captures HTTPS. (This summary is copied from others; too many identical copies exist online to know who wrote it first, so apologies to the unnamed original author.)

  1. The client sends an HTTPS request to the server.

  2. Charles intercepts the client's request and, disguised as the client, sends the request on to the server.

  3. The server returns its CA certificate to the "client" (actually Charles).

  4. Charles intercepts the server's response, extracts the public key from the server certificate, then makes a certificate of its own, substitutes it for the server's, and sends it to the client. (In this step, Charles obtains the server certificate's public key.)

  5. After receiving the "server" certificate (actually Charles's), the client generates a symmetric key, encrypts it with Charles's public key, and sends it to the "server" (Charles).

  6. Charles intercepts the client's response, decrypts the symmetric key with its own private key, re-encrypts it with the server certificate's public key, and sends it on to the server. (In this step, Charles obtains the symmetric key.)

  7. The server decrypts the symmetric key with its own private key and sends a response to the "client" (Charles).

  8. Charles intercepts the server's response, substitutes its own certificate, and forwards the response to the client.

  9. At this point the connection is established. Charles holds the server certificate's public key and the symmetric key negotiated between client and server, so from here on it can decrypt or modify the encrypted messages.

 

  Therefore, the precondition is that the client must trust and choose to install Charles's certificate; without that, the packet-capture tool cannot intercept HTTPS. Most malicious scripts that want to grab user data on the Internet work the same way a capture tool does, so HTTPS is still relatively safe.
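  The whole scheme above hinges on the client accepting the substituted certificate. One common client-side mitigation (not discussed in the original steps, added here as a sketch) is certificate pinning: the app ships a fingerprint of the real server certificate and refuses any connection presenting a different one, even if the device trusts Charles's root. The DER byte strings below are placeholders.

```python
import hashlib

def sha256_fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a certificate's DER bytes."""
    return hashlib.sha256(der_cert).hexdigest()

def connection_is_trusted(der_cert: bytes, pinned_fingerprint: str) -> bool:
    # A substituted certificate (e.g. Charles's) hashes to a different
    # value, so the client can refuse the connection outright.
    return sha256_fingerprint(der_cert) == pinned_fingerprint
```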

 

Still not using HTTPS?

  I remember that before the company adopted HTTPS, we experienced some disgusting traffic hijacking. In our own online environment everything looked normal, but colleagues in other regions kept telling us that a button floated over the page, and clicking it took them off to some other site. Some even said injected code broke our interface. After all sorts of investigation, the final fix was to move to HTTPS, and since then the problem has never come back. Nowadays some browsers mark non-HTTPS pages as insecure, and I still see some other people's sites not using HTTPS and showing the warning. As our replay-attack discovery shows, these security problems really do exist on the network and nobody can avoid them entirely — so if your site still isn't on HTTPS, upgrade soon!


Origin: www.cnblogs.com/1wen/p/11836952.html