Python requests module: uploading large files in chunks

While doing API testing recently, I was handed a chunked file-upload endpoint whose HTTP Content-Type is multipart/form-data. The client must split a large file into chunks and send them to the server one by one, and the server reassembles them into the original file. To make sure the transferred data arrives intact, the client also computes an MD5 digest of the original file before splitting and carries it in every HTTP request; after reassembling the file, the server verifies it against that digest.

How can the requests module satisfy these testing requirements? First, break the problem down:

  1. How does requests send data with Content-Type multipart/form-data?
  2. How do we compute the MD5 digest of the original file?
  3. How do we split a large file into chunks?

This post answers each question in turn.

Sending a multipart/form-data request

This relies on the helper library requests_toolbelt: its MultipartEncoder class builds a multipart/form-data body to use as the request payload. Besides multipart/form-data, the Content-Type header must also carry the boundary that the encoder generates, as shown below:

import requests
from requests_toolbelt import MultipartEncoder
import os


def upload_multipart(url, file_path):
    filename = os.path.basename(file_path)  # robust to both / and \ separators
    total_size = os.path.getsize(file_path)
    with open(file_path, 'rb') as f:  # ensure the handle is closed after the upload
        data = MultipartEncoder(
            fields={
                "filename": filename,
                "totalSize": str(total_size),
                "file": (filename, f, 'application/octet-stream')
            }
        )
        headers = {
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.107 Safari/537.36",
            "Accept": "application/json",
            "Accept-Encoding": "gzip, deflate",
            "Connection": "keep-alive",
            "Content-Type": data.content_type  # multipart/form-data plus the generated boundary
        }
        with requests.post(url, headers=headers, data=data) as response:
            assert response.status_code == 200
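For intuition about what MultipartEncoder produces, the boundary-delimited body can be sketched by hand with only the standard library. `build_multipart_body` below is a hypothetical helper that handles text fields only; real encoders also handle binary file parts and stream them instead of building the body in memory.

```python
import uuid


def build_multipart_body(fields, boundary=None):
    # Render text-only form fields the way a multipart/form-data body
    # lays them out: each part is introduced by --boundary, and the
    # body is terminated by --boundary--.
    boundary = boundary or uuid.uuid4().hex
    parts = []
    for name, value in fields.items():
        parts.append(f"--{boundary}")
        parts.append(f'Content-Disposition: form-data; name="{name}"')
        parts.append("")
        parts.append(value)
    parts.append(f"--{boundary}--")
    body = "\r\n".join(parts) + "\r\n"
    content_type = f"multipart/form-data; boundary={boundary}"
    return body, content_type


body, content_type = build_multipart_body({"filename": "demo.bin", "totalSize": "1024"})
print(content_type)  # multipart/form-data; boundary=<random hex>
```

The boundary is an arbitrary string that must not occur inside any part, which is why encoders generate a fresh random one per request.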

Computing the MD5 digest of the original file

Use the hashlib library, as shown below:

import hashlib


def get_md5(path):
    m = hashlib.md5()
    with open(path, 'rb') as f:
        # read fixed-size blocks: iterating over "lines" of a binary file
        # can pull in huge amounts at once if the data contains few newlines
        for block in iter(lambda: f.read(1024 * 1024), b""):
            m.update(block)
    return m.hexdigest()
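As a quick sanity check, the streamed digest can be compared against hashlib's one-shot digest of the same bytes. The snippet below is self-contained (it carries its own copy of the digest function) and uses a throwaway temporary file; the sizes are arbitrary.

```python
import hashlib
import os
import tempfile


def get_md5(path, block_size=1024 * 1024):
    # Stream the file in fixed-size blocks so memory use stays flat
    m = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(block_size), b""):
            m.update(block)
    return m.hexdigest()


# Write ~3 MB of random bytes, then verify the streamed digest matches
# hashlib's one-shot digest of the same data.
data = os.urandom(3 * 1024 * 1024 + 17)
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(data)
assert get_md5(tmp.name) == hashlib.md5(data).hexdigest()
os.unlink(tmp.name)
```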

Splitting a large file into chunks

In the example below, the chunk size is fixed at 2 MB and the total number of chunks follows from the file size. fileObject.seek() moves the file pointer to the current chunk's offset; each chunk is read and sent in turn, with the MD5 digest of the whole file attached to every request.

import requests
from requests_toolbelt import MultipartEncoder
import os
import math


def upload_slice_file(url, file_path):
    chunk_size = 1024 * 1024 * 2  # 2 MB per chunk
    filename = os.path.basename(file_path)
    total_size = os.path.getsize(file_path)
    total_chunk = math.ceil(total_size / chunk_size)
    file_md5 = get_md5(file_path)  # digest of the whole file, computed once

    with open(file_path, 'rb') as f:  # open once instead of once per chunk
        for current_chunk in range(1, total_chunk + 1):
            f.seek((current_chunk - 1) * chunk_size)
            # read() stops at EOF, so the final chunk may be shorter
            file_chunk_data = f.read(chunk_size)
            data = MultipartEncoder(
                fields={
                    "filename": filename,
                    "totalSize": str(total_size),
                    "currentChunk": str(current_chunk),
                    "totalChunk": str(total_chunk),
                    "md5": file_md5,
                    "file": (filename, file_chunk_data, 'application/octet-stream')
                }
            )
            headers = {
                "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.107 Safari/537.36",
                "Accept": "application/json",
                "Accept-Encoding": "gzip, deflate",
                "Connection": "keep-alive",
                "Content-Type": data.content_type
            }
            with requests.post(url, headers=headers, data=data) as response:
                assert response.status_code == 200
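To convince ourselves the seek/read arithmetic is right, the chunking can be exercised locally without a server: split a throwaway file, concatenate the chunks, and confirm the MD5 matches, just as the server-side verification described above would. `split_into_chunks` is an illustrative helper written for this check, not part of the upload API.

```python
import hashlib
import math
import os
import tempfile


def split_into_chunks(path, chunk_size=2 * 1024 * 1024):
    # Yield (chunk_number, bytes) pairs using the same seek/read
    # arithmetic as the upload loop above.
    total_size = os.path.getsize(path)
    total_chunk = math.ceil(total_size / chunk_size)
    with open(path, "rb") as f:
        for current_chunk in range(1, total_chunk + 1):
            f.seek((current_chunk - 1) * chunk_size)
            yield current_chunk, f.read(chunk_size)


# Deliberately pick a size that is not a multiple of the chunk size,
# so the last chunk comes out short.
data = os.urandom(5 * 1024 * 1024 + 123)
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(data)

chunks = list(split_into_chunks(tmp.name))
reassembled = b"".join(part for _, part in chunks)
# The reassembled bytes carry the same MD5 the server would verify.
assert hashlib.md5(reassembled).hexdigest() == hashlib.md5(data).hexdigest()
os.unlink(tmp.name)
```

If the reassembled digest did not match, either a chunk boundary was computed wrong or a chunk was dropped, which is exactly the failure mode the server-side MD5 check exists to catch.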