I wrote a stored procedure in Rust using the pgxr library inside the database;
and a Go program outside the database that connects with pgx and runs queries;
The test logic queries the column list of a table, executed 10,000 times in a loop;
The test results are as follows:
Rust stored procedure:
test_sql_speed: 26.810285862s
Go connection database query:
32.746561715s
The Go program establishes its connection only once.
Reusing the connection adds surprisingly little overhead: roughly (32.75 s - 26.81 s) / 10,000, or about 0.6 ms per query.
Then I tested the simplest possible SQL query, SELECT 1, again 10,000 times.
This time, the Rust stored procedure:
test_sql_speed: 67.651917ms
Go connection database query:
1.261617769s
The query itself is nearly free inside the database, so almost all of the Go time is the network round trip: roughly (1.26 s - 0.07 s) / 10,000, or about 0.12 ms per query.
The source code is as follows:
Rust
use std::time::SystemTime; // needed for the timing below

#[no_mangle]
pub extern "C" fn test_sql_speed(_fcinfo: FunctionCallInfo) -> Datum
{
    let sys_time = SystemTime::now();
    // 0..10000 runs exactly 10,000 iterations (1..10000 would only run 9,999)
    for _ in 0..10000 {
        let _i = query_for_int("select 1");
    }
    let difference = SystemTime::now().duration_since(sys_time)
        .expect("SystemTime::duration_since failed");
    eprintln!("test_sql_speed: {:?}", difference);
    PG_RETURN_I32(1)
}
Go
package main

import (
    "fmt"
    "time"
)

func main() {
    db := openDbConnection() // the connection is established once, then reused
    start := time.Now()
    for i := 0; i < 10000; i++ {
        rows, err := db.Query(`SELECT 1`)
        if err != nil {
            panic(err)
        }
        rows.Close() // release the result; otherwise connections leak
    }
    fmt.Printf("%v\n", time.Since(start))
}
Later I found that the column-list query itself was inefficient: it read from information_schema, the ANSI-standard catalog views. After reading some documentation, I changed it to read from pg_catalog, PostgreSQL's native system catalog, and performance improved greatly.
With that change, 10,000 queries took only about 1 second in Rust and about 3 seconds in Go.