paramiko error: Garbage packet received

Overview

Today I wanted my Python upload script to push code to several servers at once using multiple processes, so I first tested it against a local virtual machine. It kept failing with the following error:

Traceback (most recent call last):
  File "D:\python3.6.7\lib\multiprocessing\process.py", line 258, in _bootstrap
    self.run()
  File "D:\python3.6.7\lib\multiprocessing\process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "D:\Documents\education-server\fabfile.py", line 88, in upload
    sftp.put(local_path, target_path, confirm=True)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 759, in put
    return self.putfo(fl, remotepath, file_size, callback, confirm)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 717, in putfo
    reader=fl, writer=fr, file_size=file_size, callback=callback
  File "D:\python3.6.7\lib\site-packages\paramiko\util.py", line 301, in __exit__
    self.close()
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_file.py", line 82, in close
    self._close(async_=False)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_file.py", line 104, in _close
    self.sftp._request(CMD_CLOSE, self.handle)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 813, in _request
    return self._read_response(num)
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp_client.py", line 843, in _read_response
    t, data = self._read_packet()
  File "D:\python3.6.7\lib\site-packages\paramiko\sftp.py", line 205, in _read_packet
    raise SFTPError("Garbage packet received")
paramiko.sftp.SFTPError: Garbage packet received
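
Stripped down, the call that fails is just a plain sftp.put over an ordinary SSH connection. Roughly like this (a minimal sketch, not the actual fabfile; host, user and paths are placeholders and key-based authentication is assumed):

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("192.168.129.10", username="admin")  # key-based auth assumed; add password=... if needed
sftp = client.open_sftp()
# confirm=True makes paramiko stat the remote file after the transfer;
# the traceback above is raised while reading the server's responses
sftp.put("aa.jar", "/root/aa.jar", confirm=True)
sftp.close()
client.close()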

I searched online for a long time without finding an answer, until a post reminded me of something: my Linux virtual machine has a time-synchronization command configured in ~/.bashrc, so every shell that opens prints output while syncing the time once. That extra output ends up on the SSH channel that the SFTP subsystem uses, and paramiko reads it as corrupted packets.
After commenting out that line in ~/.bashrc and running again, everything was OK.
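
A quick way to check whether the remote shell prints anything extra is to run a trivial command over SSH and inspect the raw output; if ~/.bashrc produces output for non-interactive SSH sessions (it does on many bash builds), it shows up here. A rough sketch, with the same placeholder host and key-based authentication assumed:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("192.168.129.10", username="admin")  # key-based auth assumed
stdin, stdout, stderr = client.exec_command("true")
# anything other than b'' here (echo lines, banners, time-sync messages, ...)
# is stray shell output that can corrupt the SFTP packet stream
print(repr(stdout.read()))
print(repr(stderr.read()))
client.close()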

can't pickle _thread.lock objects
There is another problem: the parameters of a multiprocessing Process cannot be custom objects, otherwise you get the following error:

... ... ...
TypeError: can't pickle _thread.lock objects

The cause is that I passed a bound method of a custom object as the Process target, so the whole object, which holds the paramiko connection and its internal thread locks, has to be pickled when the process starts. The fix is to rewrite the target as a plain module-level function and create the object inside that function.

Before the change:

p1 = Process(target=ssh_obj.upload, args=("192.168.129.10", "admin", "aa.jar", "/root/aa.jar"))

After the change:

p1 = Process(target=upload, args=("192.168.129.10", "admin", "aa.jar", "/root/aa.jar"))    # rewrite as a plain function and create the object inside it
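
Putting the two pieces together, a minimal sketch of the working pattern (same placeholder host and paths as above, key-based authentication assumed): only plain strings cross the process boundary, and the paramiko client, whose transport contains sockets and thread locks, is created inside the child process.

from multiprocessing import Process

import paramiko

def upload(host, user, local_path, target_path):
    # the SSH connection is created here, inside the child process,
    # so nothing unpicklable has to be sent to it
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user)  # key-based auth assumed
    sftp = client.open_sftp()
    sftp.put(local_path, target_path, confirm=True)
    sftp.close()
    client.close()

if __name__ == "__main__":  # required on Windows, where multiprocessing spawns a fresh interpreter
    p1 = Process(target=upload, args=("192.168.129.10", "admin", "aa.jar", "/root/aa.jar"))
    p1.start()
    p1.join()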
