Some notes on forging HTTP requests in C# (problems encountered, experience, notes)

A note up front: this kind of thing seems to be done more often in Python, but I haven't learned it yet, so I implemented it in C#. I'm not an expert at this, so please forgive me if I've got something wrong.

Preparation: a packet capture tool

Experience using the Fiddler packet capture tool

Official site download: Fiddler | Web Debugging Proxy and Troubleshooting Solutions

Capturing HTTPS requests: export the CA certificate via Tools > Options > HTTPS (check all the options, then under Actions choose the second item, Export Root Certificate to Desktop), install the CA certificate in the browser, and restart the software. PS: different browsers import certificates differently; in Chrome it is Settings > Privacy and security > Security > Manage certificates > Import.

Set Filters to filter requests: the most commonly used option is, under Request Headers, "Show only if URL contains", which shows only requests whose URL contains the given string.

Add a Method (request type) column: right-click the column header and choose Customize Columns; in Collection choose the last item, Miscellaneous, then in Field Name choose the first item, RequestMethod, and add it.

Fiddler trap: on a computer that already has a proxy configured, if Fiddler exits abnormally (for example, force-closed from Task Manager), the original proxy settings get wiped out and the proxy has to be set up again!

The password in the form is encrypted by JS: open the website's login page in Chrome and press F12, refresh the page, click Search and look for password-related fields (e.g. pwdencrypt); you will then see which JS file and method are called. Copy the content of that JS file, create a new .js file in VS and paste it in, comment out the closure (//(function() {...})), set the file's Build Action to "Embedded Resource", and then write a method that calls the JS.

        /// <summary>
        /// Execute JS (needs a COM reference to Microsoft Script Control, a 32-bit component, so the project usually has to target x86)
        /// Usage example: this.ExecuteScript("get('{0}')".FormatWith(token0), File.ReadAllText(Server.MapPath("./encodejs.js"))).toUrlEncode();
        /// </summary>
        /// <param name="sExpression">the expression (method call) to evaluate</param>
        /// <param name="sCode">the JavaScript source code as a string</param>
        /// <returns>the result of the expression, or null on failure</returns>
        public static string ExecuteScript(string sExpression, string sCode)
        {
            MSScriptControl.ScriptControl scriptControl = new MSScriptControl.ScriptControl();
            scriptControl.UseSafeSubset = true;
            scriptControl.Language = "JScript";
            scriptControl.AddCode(sCode); // if "window is not defined" is thrown, add var window = {}; at the top of the js file
            try
            {
                string str = scriptControl.Eval(sExpression).ToString();
                return str;
            }
            catch (Exception ex)
            {
                // swallow the error and fall through to return null; ex.Message can be logged here if needed
                string str = ex.Message;
            }
            return null;
        }
        /// <summary>
        /// Call the JS encryption for the password
        /// </summary>
        /// <param name="password">the plain-text password</param>
        /// <returns>the encrypted password string</returns>
        public static string GetYYPwd(string password)
        {
            // read the js file that was embedded as a resource (Build Action = Embedded Resource)
            System.Reflection.Assembly asm = System.Reflection.Assembly.GetExecutingAssembly();
            StreamReader txtStream = new StreamReader(asm.GetManifestResourceStream("Demo20220530.JS.pwd.js"));
            string str = txtStream.ReadToEnd();
            string fun = string.Format(@"enc('{0}')", password); // the encryption entry point; PS: this method does not exist in the site's js, create it yourself and have it call the real encryption method
            string result = ExecuteScript(fun, str); // evaluate the expression against the js code
            return result;
        }


        // the enc method created by hand inside the js file
        //function enc(pwd)
        //{
        //    return RSAUtils.encryptedString(pwd);
        //}

Commonly used parts of Chrome F12 (the Network panel):

Fetch/XHR: asynchronous HTTP data requests made by the page (AJAX)

Document: You can see which web pages were requested

For both of these, click a request's name to see its details: Headers (request headers, response headers, etc.), Payload (the postData of a POST request), and Response (what was returned, shown as text). Based on these you can forge the required parameters and requests.
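
To actually replay such a request from C#, a small helper that sends the forged POST with the captured headers and body is handy. Below is a minimal sketch of my own (not code from the original notes) using HttpWebRequest; it assumes using System.IO, System.Net and System.Text, and the header values are placeholders to be copied from whatever the browser actually sent:

        /// <summary>
        /// Send a forged POST with the headers and payload observed in F12 / Fiddler.
        /// </summary>
        public static string PostForm(string url, string postData, CookieContainer cookies)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "POST";
            request.ContentType = "application/x-www-form-urlencoded"; // copy from the captured request headers
            request.UserAgent = "Mozilla/5.0";                         // copy the User-Agent the browser sent
            request.Referer = url;                                     // some sites also check the Referer
            request.CookieContainer = cookies;                         // keep cookies so the session survives between requests

            byte[] body = Encoding.UTF8.GetBytes(postData);            // the "Payload" / postData part
            request.ContentLength = body.Length;
            using (Stream s = request.GetRequestStream())
            {
                s.Write(body, 0, body.Length);
            }

            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
            {
                return reader.ReadToEnd();                             // the "Response" tab content as text
            }
        }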

Problems encountered: when forging an ASP.NET POST request there are these parameters: __EVENTTARGET, __EVENTARGUMENT, __VIEWSTATE, __EVENTVALIDATION. Their values come from the first visit to the page: looking at that response page, you will find hidden fields that store these values.

We extract these values with a regular expression:

            // match the hidden input fields and capture their name and value
            regex = new Regex("input type=\"hidden\" name=\"(?<Name>.*?)\".*?value=\"(?<Value>.*?)\"");
            MatchCollection mc = regex.Matches(result);
            string postData = string.Empty;
            foreach (Match m in mc)
            {
                // the value must be URL-encoded before being appended to the post body
                postData += m.Groups["Name"].Value.Trim() + "=" + HttpUtility.UrlEncode(m.Groups["Value"].Value.Trim()) + "&";
            }

Notice that I URL-encode the value here (HttpUtility.UrlEncode). I was stuck in this pit for a long time, and it finally turned out to be an encoding problem: just encode the value before adding it to postData.

ASP.NET sites basically all have this kind of __VIEWSTATE validation. The process is basically: visit the site once first, take the content of the hidden field (__VIEWSTATE) out of the response, and then carry that parameter along in the subsequent POST.
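
Putting the pieces together, the whole __VIEWSTATE round trip looks roughly like the sketch below. This is my own summary rather than code from the article; the login URL and the txtUser/txtPwd field names are hypothetical, GetPage is a simple GET counterpart of the PostForm helper sketched earlier, and it assumes using System.IO, System.Net, System.Text, System.Text.RegularExpressions and System.Web:

        /// <summary>A plain GET helper, counterpart of PostForm above.</summary>
        public static string GetPage(string url, CookieContainer cookies)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.CookieContainer = cookies;                          // keep the ASP.NET session cookie
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream(), Encoding.UTF8))
            {
                return reader.ReadToEnd();
            }
        }

        public static string Login()
        {
            CookieContainer cookies = new CookieContainer();
            string loginUrl = "https://example.com/login.aspx";         // hypothetical address

            // 1. first visit: the response HTML carries the hidden fields
            string html = GetPage(loginUrl, cookies);

            // 2. pull out __VIEWSTATE / __EVENTVALIDATION etc. with the same regex as above
            Regex regex = new Regex("input type=\"hidden\" name=\"(?<Name>.*?)\".*?value=\"(?<Value>.*?)\"");
            string postData = string.Empty;
            foreach (Match m in regex.Matches(html))
            {
                postData += m.Groups["Name"].Value.Trim() + "=" + HttpUtility.UrlEncode(m.Groups["Value"].Value.Trim()) + "&";
            }

            // 3. append your own form fields (names are hypothetical) and post everything back
            postData += "txtUser=" + HttpUtility.UrlEncode("user")
                      + "&txtPwd=" + HttpUtility.UrlEncode(GetYYPwd("password"));
            return PostForm(loginUrl, postData, cookies);
        }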

Problems encountered: when constructing a request for one website I found an encoding problem. Normally a Chinese string URL-encodes to something like %E6%98%9F%E7%A9%BA%E6%B8%AC%E8%A9%A67, but what gets passed to the backend is actually encoded twice, i.e. HttpUtility.UrlEncode(HttpUtility.UrlEncode("text")).ToUpper(): encoded twice and then converted to upper case (ToUpper), otherwise the validation still fails.

                string name = string.Empty;
                regex = new Regex("[A-Za-z0-9_]");
                foreach (char c in textBox1.Text.Trim())
                {
                    if (!regex.IsMatch(c.ToString())) // characters that don't match the regex (letters/digits don't need transcoding)
                    {
                        name += HttpUtility.UrlEncode(HttpUtility.UrlEncode(c.ToString())).ToUpper(); // Chinese characters need to be double-encoded and upper-cased
                        continue;
                    }
                    name += c;
                }

Problems encountered (src spoofing): to crawl images from a website, you usually inspect the page, find the img element you want, take its network source address, then request and save it. But here I ran into a problem: I found <img src='//c.....' data-original='aHR0cHM6Ly9zMS5jaHUwLmNvbS9zcmMvaW1nL3BuZy9....'>, and the address in this img's src is fake, so of course I took the bait. After asking an expert and searching Baidu, I found that this data-original attribute is base64-encoded; after decoding it and comparing it with the image address shown when inspecting the element in F12, they are exactly the same!
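
So the crawl step becomes: read data-original from the img tag, base64-decode it to get the real address, then download from that. A small sketch of my own (the attribute value and the Referer are placeholders, not the site's real data; it assumes using System, System.Net and System.Text):

            string dataOriginal = "aHR0cHM6Ly9leGFtcGxlLmNvbS9hLnBuZw==";                     // value read from the img tag's data-original attribute
            string realUrl = Encoding.UTF8.GetString(Convert.FromBase64String(dataOriginal)); // decodes to the real image URL
            using (WebClient client = new WebClient())
            {
                client.Headers[HttpRequestHeader.Referer] = "https://example.com/";           // some image hosts check the Referer
                client.DownloadFile(realUrl, "image.png");                                    // save the picture locally
            }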

This article is mainly a record for myself, but of course everyone is welcome to learn from it and exchange ideas.

Origin: blog.csdn.net/qq_51502150/article/details/125160334