Uploading large files on the web has always been a pain: upload size limits and request timeouts are problems web development has to face head-on.
The approach in this article: the front end slices the file data into chunks and uploads them one by one, and the server merges the chunks back into the original file after receiving them all.
Folder upload requirements: the server preserves the directory hierarchy and supports folders containing on the order of 100,000 files.
Large file upload and resumable (breakpoint) upload requirements: support single files up to 50 GB, with resuming. Resuming must survive a browser refresh, a browser restart (close and reopen the browser), and a computer restart.
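The resume requirement above implies the server must remember upload progress somewhere that survives restarts. As a minimal sketch (not the article's actual implementation), progress can be persisted to disk keyed by the file's MD5, so a returning client can ask how many bytes the server already has; the class and file names here are assumptions.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

// Hypothetical sketch: persist upload progress keyed by the file's MD5 so a
// client can query how many bytes the server already holds and resume there.
// Using a properties file on disk means progress survives server restarts.
public class UploadProgressStore {
    private final Path store;

    public UploadProgressStore(Path store) { this.store = store; }

    private Properties load() throws IOException {
        Properties p = new Properties();
        if (Files.exists(store)) {
            try (Reader r = Files.newBufferedReader(store)) { p.load(r); }
        }
        return p;
    }

    // Record that `bytes` of the file identified by `fileMd5` are stored.
    public void save(String fileMd5, long bytes) throws IOException {
        Properties p = load();
        p.setProperty(fileMd5, Long.toString(bytes));
        try (Writer w = Files.newBufferedWriter(store)) { p.store(w, null); }
    }

    // How many bytes were already uploaded; 0 for an unknown file.
    public long uploadedBytes(String fileMd5) throws IOException {
        return Long.parseLong(load().getProperty(fileMd5, "0"));
    }
}
```

A real implementation would likely keep this in the database instead (see the database support noted at the end of the article), but the query-then-resume handshake is the same idea.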
PC platform support: Windows, Mac, Linux.
Browser support must cover all major browsers, including IE6, IE7, IE8, IE9, Chrome, and Firefox.
Below is a simple demo with its source code:
Front-end page:
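The front-end page's source is not reproduced here, but the core slicing step it performs can be sketched as follows. This is an assumed illustration, not the demo's actual code: the chunk size, field names, and the `/upload/chunk` endpoint are all made up for the example.

```javascript
// Hypothetical sketch of front-end chunking: cut a File/Blob into fixed-size
// slices and POST each one with its index so the server can reassemble them.
const CHUNK_SIZE = 4 * 1024 * 1024; // 4 MB per slice (assumed value)

// Pure helper: compute [start, end) byte ranges for a file of `size` bytes.
function chunkRanges(size, chunkSize = CHUNK_SIZE) {
  const ranges = [];
  for (let start = 0; start < size; start += chunkSize) {
    ranges.push({ index: ranges.length, start, end: Math.min(start + chunkSize, size) });
  }
  return ranges;
}

// Browser-only part of the sketch: slice and upload each range in order.
async function uploadFile(file) {
  const ranges = chunkRanges(file.size);
  for (const { index, start, end } of ranges) {
    const form = new FormData();
    form.append("chunkIndex", String(index));
    form.append("chunkCount", String(ranges.length));
    form.append("data", file.slice(start, end)); // Blob.slice does the cutting
    await fetch("/upload/chunk", { method: "POST", body: form }); // assumed endpoint
  }
}
```

Note that `Blob.slice` and `fetch` are not available in the old IE versions listed above; supporting IE6–9 requires a different transport (e.g. a Flash/ActiveX control or form-based fallback), which is presumably what the demo's control provides.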
Back-end code (this demo is based on an MVC architecture):
Unlike many examples online that cram all the logic into a single class, which would be a disaster for future maintenance and upgrades, I split the back-end code into modules.
File block processing logic:
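The demo's own processing code is not shown here, but the essential trick can be sketched: because each request carries its chunk index and the chunk size is fixed, the block can be written directly into the target file at its byte offset. Writing in place means chunks may arrive in any order, which is what makes resuming possible. The class and method names below are assumptions for illustration.

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.RandomAccessFile;

// Hypothetical sketch of chunk processing: write the incoming block straight
// into the target file at offset chunkIndex * chunkSize.
public class ChunkWriter {
    // `chunkSize` is assumed fixed for every chunk except possibly the last.
    public static void writeChunk(java.io.File target, int chunkIndex,
                                  long chunkSize, InputStream body) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(target, "rw")) {
            raf.seek(chunkIndex * chunkSize);   // jump to this chunk's slot
            byte[] buf = new byte[8192];
            int n;
            while ((n = body.read(buf)) != -1) {
                raf.write(buf, 0, n);
            }
        }
    }
}
```

In a servlet, `body` would come from the multipart request part for the chunk, and `chunkIndex` from a form field sent by the front end.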
The file block saving logic is as follows:
The web.xml configuration is as follows:
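The project's actual web.xml is not reproduced here; a minimal fragment of the kind such a project needs would map the upload servlet to the URL the front end posts to. The servlet class name and URL pattern below are assumptions for illustration only.

```xml
<!-- Hypothetical web.xml fragment; names and paths are assumptions. -->
<web-app xmlns="http://java.sun.com/xml/ns/javaee" version="3.0">
  <servlet>
    <servlet-name>uploadServlet</servlet-name>
    <servlet-class>com.example.upload.UploadServlet</servlet-class>
  </servlet>
  <servlet-mapping>
    <servlet-name>uploadServlet</servlet-name>
    <url-pattern>/upload/chunk</url-pattern>
  </servlet-mapping>
</web-app>
```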
Screenshot of the entire project:
The dependent JAR packages are as follows:
The running effect is as follows:
That's all for this share. Corrections and feedback are very welcome!
The back-end logic is mostly the same across databases; MySQL, Oracle, and SQL Server are currently supported. You need to configure the database before use; for reference, see this article I wrote: http://blog.ncmem.com/wordpress/2019/08/07/java huge file upload and download /
You are welcome to join the discussion group: 374992201