A summary of several problems encountered when getting started with Doris

1. Access denied for user 'anonym@null' (using password: NO)

This error appears when querying the Doris database over JDBC. The cause is a wrong driver class setting, and the fix is to change the driver class:

From JDBC_DRIVER = "com.mysql.cj.jdbc.Driver";
to JDBC_DRIVER = "com.mysql.jdbc.Driver";
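
For reference, a minimal connectivity sketch under this setting. It assumes the mysql-connector-java jar is on the classpath; the host, database, and credentials are placeholders taken from the examples later in this post, and 9030 is the default FE query port.

    // Minimal JDBC connectivity check against the Doris FE MySQL port.
    // Host, database, and credentials are placeholders -- adjust to your cluster.
    import java.sql.Connection;
    import java.sql.DriverManager;

    public class DorisConnect {
        static final String JDBC_DRIVER = "com.mysql.jdbc.Driver";
        static final String DB_URL = "jdbc:mysql://10.0.102.110:9030/DORISDB";

        public static void main(String[] args) throws Exception {
            Class.forName(JDBC_DRIVER);  // load the non-cj driver class
            try (Connection conn = DriverManager.getConnection(DB_URL, "root", "")) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }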

2. timeout when waiting for send fragments RPC. Wait(sec): 5, host: xxx(ip)

java.sql.BatchUpdateException: timeout when waiting for send fragments RPC. Wait(sec): 5, host: 10.0.102.153
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)

This timeout error occurs when inserting data in batches. The default query timeout is 300 s. You can log in with the MySQL client and adjust the query_timeout variable; after I raised it to 3000 s, the error no longer appeared.

SHOW VARIABLES LIKE '%query_timeout%';
SET query_timeout = 3000;
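
Note that SET changes the variable for the current session only; SET GLOBAL query_timeout = 3000; changes the default for new sessions. The same change can also be issued over JDBC right before the batch insert. A minimal sketch, where the table and its columns are made-up placeholders:

    // Raise query_timeout for this session, then run the batch insert.
    // demo_table and its columns are hypothetical placeholders.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.Statement;

    public class BatchInsertWithTimeout {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:mysql://10.0.102.110:9030/DORISDB";
            try (Connection conn = DriverManager.getConnection(url, "root", "")) {
                try (Statement stmt = conn.createStatement()) {
                    stmt.execute("SET query_timeout = 3000"); // session scope only
                }
                String sql = "INSERT INTO demo_table (id, name) VALUES (?, ?)";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    for (int i = 0; i < 100000; i++) {
                        ps.setInt(1, i);
                        ps.setString(2, "row-" + i);
                        ps.addBatch();
                    }
                    ps.executeBatch();
                }
            }
        }
    }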

3. Failed to initialize JNI: Failed to find the library libjvm.so.

After installing the new Doris 1.2.0 release, the BE node fails to start with the following error:

W1212 05:39:14.999809 117327 doris_main.cpp:382] Failed to initialize JNI: Failed to find the library libjvm.so.

The reason: besides the BE and FE packages, the latest release ships an additional archive, apache-doris-java-udf-jar-with-dependencies-1.2.0-bin-x86_64.tar.xz. The jar package inside it must be extracted and placed in the bin path of BE before the BE node can start.
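
The steps, sketched under the assumption that everything was unpacked under /opt/doris; the paths and the extracted directory name are placeholders to adjust:

    # Paths and the extracted directory name are assumptions -- adjust them.
    cd /opt/doris
    tar -xJf apache-doris-java-udf-jar-with-dependencies-1.2.0-bin-x86_64.tar.xz
    # Place the jar package in the bin path of BE, then restart BE.
    cp java-udf-jar-with-dependencies/*.jar be/bin/
    be/bin/start_be.sh --daemon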

4. An error is reported when a JSON file exported from MySQL is larger than 100 MB

The size of this batch exceed the max size [104857600] of json type data data [ 149275596 ]. Split the file, or use 'read_json_by_line'

Setting the suggested read_json_by_line parameter and restarting BE did not help, and modifying 104857600 (the value corresponding to 100 MB) did not take effect either. What finally worked was adding the limit to the import command itself, via the streaming_load_json_max_mb header. My JSON import command, for reference:

curl --location-trusted -u root: -H "format: json" -H "fuzzy_parse" -H "exec_mem_limit: 5368709120" -H "strip_outer_array: true" -H "streaming_load_json_max_mb:5000" -T 202212131453.json http://10.0.102.110:8030/api/DORISDB/wh_ab_base_assetinfo/_stream_load

5. Importing a CSV file into the database reports an error

Following the link in the error message, open the query error details in a browser. It is best to import a small amount of data first as a test, and to check whether the error is caused by an inconsistent number of fields or by field lengths in the CSV file.
In my case the error was caused by newline characters in the source data. These newlines follow no consistent rule; they are easy to replace manually, but at the million-row scale all kinds of problems appear, such as lines containing extra English commas. So I do not recommend this import method: although the CSV format is smaller, I personally think it is better suited to export. (If you have a better solution for importing large CSV files, please share it; rewards offered.)
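
For reference, a CSV stream load mirrors the JSON command above. column_separator and max_filter_ratio are standard stream load headers; the file, host, database, and table below reuse the placeholders from problem 4. Note that this alone does not fix newlines embedded inside field values:

    # CSV stream load sketch; host/db/table reuse the earlier placeholders.
    # This does not by itself handle newlines inside field values.
    curl --location-trusted -u root: \
         -H "column_separator:," \
         -H "max_filter_ratio:0.01" \
         -T 202212131453.csv \
         http://10.0.102.110:8030/api/DORISDB/wh_ab_base_assetinfo/_stream_load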
These are some problems I ran into when I first started working with Doris; my research has only just begun. Anyone interested in learning together?

Origin blog.csdn.net/qq_42887496/article/details/128381672