Downloading files with net/http
In Go, the simplest way to download a file is to issue the request with http.Get() and then write the response body straight to a file with ioutil.WriteFile().
func DownFile() {
	url := "http://wx.qlogo.cn/Vaz7vE1/64"
	resp, err := http.Get(url)
	if err != nil {
		fmt.Fprintln(os.Stderr, "get url error:", err)
		return // resp is nil on error, so we must not touch resp.Body
	}
	defer resp.Body.Close()

	// Read the ENTIRE body into memory, then write it out.
	data, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	_ = ioutil.WriteFile("/tmp/icon_wx.png", data, 0644)
}
The code above works, but it has a subtle problem. It is fine for small files, but for a large file it may run out of memory, because ioutil.ReadAll reads the entire response body into memory before anything is written to disk.
So how should you download, or copy, a large file? Go provides io.Copy, which streams data between a Reader and a Writer through a small buffer instead of loading the whole content into memory, and that solves the problem.
io.Copy
Let's look at its declaration in the standard library:
func Copy(dst Writer, src Reader) (written int64, err error) {
	return copyBuffer(dst, src, nil)
}

func copyBuffer(dst Writer, src Reader, buf []byte) (written int64, err error) {
	// ...
	if buf == nil {
		size := 32 * 1024
		if l, ok := src.(*LimitedReader); ok && int64(size) > l.N {
			if l.N < 1 {
				size = 1
			} else {
				size = int(l.N)
			}
		}
		buf = make([]byte, size)
	}
	// ...
}
io.Copy copies from the source to the destination in a loop, using a 32 KB buffer by default, so the full content is never held in memory at once; that is what makes it safe for large files.
Now let's rewrite the download using io.Copy:
func DownFile() {
	url := "http://wx.qlogo.cn/Vaz7vE1/64"
	resp, err := http.Get(url)
	if err != nil {
		fmt.Fprintln(os.Stderr, "get url error:", err)
		return
	}
	defer resp.Body.Close()

	out, err := os.Create("/tmp/icon_wx_2.png")
	if err != nil {
		panic(err)
	}
	defer out.Close()

	// Stream the body to the file through a buffered writer;
	// only a small buffer is ever held in memory.
	wt := bufio.NewWriter(out)
	n, err := io.Copy(wt, resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println("write", n)
	wt.Flush()
}
Likewise, if you want to copy a large file from one place to another, io.Copy is the tool to reach for, and it prevents the same out-of-memory problem.