[Python web crawler] 150-Lecture Python Web Crawler Course Notes, Chapter 14, Data Storage: CSV File Read/Write

1. Read csv file

  • reader: returns each row as a list; access a column value by index
  • DictReader: returns each row as a dictionary; access a column value by key
import csv

# stock.csv is GBK-encoded, so pass the encoding explicitly when opening it
with open('stock.csv', 'r', encoding='gbk') as fp:
    reader = csv.reader(fp)
    for x in reader:
        print(x)     # each row is a list
        print(x[3])  # access the fourth column by index


with open('stock.csv', 'r', encoding='gbk') as fp:
    reader = csv.DictReader(fp)
    for x in reader:
        # print(x)  # each row is a dict keyed by the header row
        print(x['secShortName'])

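Since stock.csv may not be at hand, the reader-vs-DictReader difference can be sketched with an in-memory file via io.StringIO (the column names and values below are made up for illustration; only secShortName comes from the snippet above):

```python
import csv
import io

# A tiny in-memory CSV sample (hypothetical data, for illustration only)
data = "ticker,secShortName,price\n600000,浦发银行,7.5\n"

# csv.reader yields each row as a list; columns are accessed by index
rows = list(csv.reader(io.StringIO(data)))
print(rows[1][1])   # second column of the first data row

# csv.DictReader yields each row as a dict keyed by the header row
dict_rows = list(csv.DictReader(io.StringIO(data)))
print(dict_rows[0]['secShortName'])
```

Both prints show the same cell; the only difference is whether you address it by position or by column name.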

2. Write to csv file

  • writerow: write a single row from a list or tuple
  • writerows: write multiple rows from a list of lists/tuples
  • DictWriter: write rows from dictionaries
import csv

headers = ('name', 'age', 'height')
student = [
    ('张三', 12, 160),
    ('张三', 13, 170),
    ('张三', 14, 180),
]

students = [
    {"name":'张三', "age": 12, "height": 160},
    {"name":'张三', "age": 13, "height": 170},
    {"name":'张三', "age": 14, "height": 180}
]

# newline='' prevents an extra blank line after each written row; without it there is a blank line between rows
with open('student.csv', 'w', encoding='utf-8', newline='') as fp:
    writer = csv.writer(fp)
    writer.writerow(headers)
    # for i in student:
    #     writer.writerow(i)
    writer.writerows(student)


with open('student.csv', 'w', encoding='utf-8', newline='') as fp:
    writer = csv.DictWriter(fp, headers)
    # Although DictWriter is created with headers, the header row is only actually
    # written when writer.writeheader() is called
    writer.writeheader()
    writer.writerows(students)
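The same writing logic can be checked without touching the filesystem by writing into an io.StringIO buffer. A minimal sketch, reusing the headers/students data from above (lineterminator is set explicitly so the output is predictable across platforms):

```python
import csv
import io

headers = ('name', 'age', 'height')
students = [
    {"name": '张三', "age": 12, "height": 160},
    {"name": '张三', "age": 13, "height": 170},
]

# StringIO needs no newline='' flag; instead, fix the writer's line terminator
buf = io.StringIO()
writer = csv.DictWriter(buf, headers, lineterminator='\n')
writer.writeheader()        # writes the "name,age,height" header row
writer.writerows(students)  # each dict's values are written in fieldnames order

print(buf.getvalue())
```

DictWriter looks up each dict by the fieldnames given at construction, so the column order in the output always matches headers, regardless of key order in the dictionaries.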


Origin blog.csdn.net/weixin_44566432/article/details/108723867