The Chinese character length problem after json_encode

Problem Description:

After JSON-encoding a string with json_encode($str) and storing the result in the database, the Chinese characters end up stored as Unicode escape sequences. Even though the number of Chinese characters is clearly below the column's maximum length, the insert fails because the field value is too long.
Looking at the field in the database, each Chinese character has been escaped into a special alphanumeric string (not a Unicode character).

Problem cause:

MySQL's utf8 charset only supports characters in the Basic Multilingual Plane (0x0000-0xFFFF); try storing a character outside that range and see for yourself :)
MySQL 5.5.3 and above (not yet GA when this was first written) support supplementary characters if you use the utf8mb4 encoding.
When json_encode() processes Chinese text, it turns every Chinese character into the six-character escape "\uxxxx", which is why the column length is exceeded. Worse, when the value is written to the database the "\" is treated as an escape character and stripped, leaving just "uxxxx", so the original character cannot be recovered.
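
A minimal sketch of the length blow-up, runnable as a stand-alone PHP script (the sample string and the printed values are my own illustration):

    <?php
    // json_encode() turns every Chinese character into the 6-character
    // ASCII sequence \uXXXX, so the stored value is much longer than
    // the original text.
    $str = '中文测试';                       // 4 Chinese characters

    echo mb_strlen($str, 'UTF-8') . "\n";   // 4  (characters)
    echo strlen($str) . "\n";               // 12 (UTF-8 bytes)

    $json = json_encode($str);
    echo $json . "\n";                      // "\u4e2d\u6587\u6d4b\u8bd5"
    echo strlen($json) . "\n";              // 26 -- a column sized for a
                                            // handful of characters overflows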

How to solve it:

  • Method one: on PHP 5.4 and later you can call json_encode($str, JSON_UNESCAPED_UNICODE) so that Chinese characters are not converted to Unicode escapes. However, if the string contains a "\t" character, json_decode($str) fails when the data is read back;
  • Method two: urlencode() the string before json_encode(); after reading the data back, urldecode() it.
    That is, store json_encode(urlencode($str)), and when taking it out of the database apply urldecode($str);

  • Method three: escape the JSON string before inserting it, so that the backslashes in "\uxxxx" survive (all three methods are sketched in code after this list):

    $str  = json_encode($str);

    $test = addslashes($str);    // or

    $test = mysql_escape_string($str);
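
A minimal sketch of the three work-arounds as a stand-alone PHP script; the sample string and the printed values are illustrative assumptions, and no actual database connection is shown:

    <?php
    $str = '中文测试';

    // Method one (PHP >= 5.4): keep the characters as-is in the JSON.
    $json1 = json_encode($str, JSON_UNESCAPED_UNICODE);
    echo $json1 . "\n";                      // "中文测试" -- no \uXXXX expansion
    echo json_decode($json1) . "\n";         // 中文测试

    // Method two: urlencode() before json_encode(); urldecode() after reading.
    $json2 = json_encode(urlencode($str));   // value to store
    echo $json2 . "\n";                      // "%E4%B8%AD%E6%96%87..." (ASCII only)
    $back  = urldecode(json_decode($json2)); // after reading it back
    echo $back . "\n";                       // 中文测试

    // Method three: escape the backslashes so the stored "\uXXXX" escapes
    // survive the INSERT and can still be json_decode()d later.
    $json3   = json_encode($str);            // "\u4e2d\u6587\u6d4b\u8bd5"
    $escaped = addslashes($json3);           // backslashes doubled for a raw SQL string
    echo $escaped . "\n";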


Source: www.cnblogs.com/xinxinmifan/p/11672158.html