Reading a CSV file with Python and storing it in MySQL

When testing the backend of a project, we often need to add data to the database. How can we add data quickly and improve efficiency?

A working approach is summarized below:
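The script below expects a CSV file named export.csv whose first line is a header row of column names. For testing, a small sample file can be generated with Python's standard csv module (the column names here are only illustrative, not from the original project):

```python
import csv

# Write a small sample CSV: a header row followed by data rows.
# The file name matches what the script below expects; the column
# names are hypothetical.
rows = [
    ["id", "title", "price"],
    ["1", "hello world", "9.9"],
    ["2", "foo", "1.5"],
]
with open("export.csv", "w", newline="", encoding="utf8") as f:
    csv.writer(f).writerows(rows)

# Read back the header line to confirm the layout
with open("export.csv", "r", encoding="utf8") as f:
    print(f.readline().strip())
```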

Source Content (csv_to_mysql.py):

# coding=utf-8
import pymysql

file_path = "export.csv"
table_name = "export"
try:
    con = pymysql.connect(user="root",
                          passwd="123456",
                          db="test01",
                          host="localhost",
                          local_infile=1)
    con.set_charset('utf8')
    cur = con.cursor()
    cur.execute("set names utf8")
    cur.execute("SET character_set_connection=utf8;")

    # read only the header line of the CSV
    with open(file_path, 'r', encoding='utf8') as f:
        header = f.readline()
    # split the header into a list of column names
    devide = header.split(',')
    # remove the trailing newline from the last column name
    devide[-1] = devide[-1].rstrip('\n')
    print(devide)

    column = ''
    for dd in devide:
        # if a column holds long values, it can be saved as TEXT
        if dd == "title":
            column = column + dd + ' TEXT,'
        else:
            column = column + dd + ' VARCHAR(255),'

    # remove the trailing comma
    col = column.rstrip(',')

    create_table_sql = 'CREATE TABLE IF NOT EXISTS {} ({}) DEFAULT CHARSET=utf8'.format(table_name, col)
    print(create_table_sql)
    data = 'LOAD DATA LOCAL INFILE \'' + file_path + '\' REPLACE INTO TABLE ' + table_name + \
           ' CHARACTER SET UTF8 FIELDS TERMINATED BY \',\' ENCLOSED BY \'"\' LINES TERMINATED BY \'\\n\' IGNORE 1 LINES;'
    cur.execute(create_table_sql)
    cur.execute(data)
    print(cur.rowcount)
    con.commit()
except Exception:
    print("An error occurred")
    con.rollback()

finally:
    cur.close()
    con.close()

Operation result:

The example above runs against a local database; in real work, modify the connection settings and SQL statements to suit your needs.
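The core of the script is turning the CSV header row into a CREATE TABLE statement. That step can be isolated into a small helper for reuse and testing (the function name and sample header are hypothetical, not from the original script):

```python
def build_create_table_sql(header, table_name, text_columns=("title",)):
    """Build a CREATE TABLE statement from a list of CSV column names.

    Columns listed in text_columns become TEXT (for long values);
    everything else defaults to VARCHAR(255), as in the script above.
    """
    parts = []
    for name in header:
        col_type = "TEXT" if name in text_columns else "VARCHAR(255)"
        parts.append("{} {}".format(name, col_type))
    return "CREATE TABLE IF NOT EXISTS {} ({}) DEFAULT CHARSET=utf8".format(
        table_name, ", ".join(parts))

# hypothetical header row
print(build_create_table_sql(["id", "title", "price"], "export"))
# → CREATE TABLE IF NOT EXISTS export (id VARCHAR(255), title TEXT, price VARCHAR(255)) DEFAULT CHARSET=utf8
```

Separating the SQL-building logic from the database connection makes it easy to verify the generated statement before running it against a live server.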


Origin www.cnblogs.com/wanyuan/p/11783112.html