
Set the chunk size to 2G, the value Baidu allows, and change the default temp directory #38

Closed
wants to merge 1 commit

Conversation

@1265578519 (Author)

Set the chunk size to 2G, the value Baidu allows, and change the default temp directory. This reduces the number of chunks for large files, supports single files up to 2TB, and moves the temp directory under the home directory, so that the / system partition running out of space doesn't cause a cascade of crashes.

@oott123 (Owner) commented Nov 13, 2017

Sakura, I already told you: /home isn't writable by default. Why would you make the change this way? 😭

oott123 closed this Nov 13, 2017
@1265578519 (Author)

..I submitted two back then; look at the one from three days ago. That's how I use it locally. Otherwise, uploading 10+ GB encrypted backup archives clogs the system disk completely.

@oott123 (Owner) commented Nov 14, 2017

The temp directory can already be specified on the command line! Besides, most users can't write to /home/tmp anyway; at the very least it should be $HOME/tmp.
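
For reference, a minimal sketch of resolving a per-user temp directory along those lines. `default_temp_dir()` is a hypothetical helper, not bpcs_uploader's actual code:

```php
<?php
// Sketch only: resolve a per-user temp directory instead of a
// hard-coded /home path.
function default_temp_dir()
{
    $home = getenv('HOME'); // set for normal shell sessions
    $dir = ($home !== false && $home !== '')
        ? $home . '/tmp'
        : sys_get_temp_dir(); // fallback, e.g. when run from cron
    if (!is_dir($dir) && !@mkdir($dir, 0700, true)) {
        $dir = sys_get_temp_dir(); // last resort: always available
    }
    return $dir;
}
```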

@1265578519 (Author)

..You mean most users don't have root privileges?

@1265578519 (Author)

7-Zip (a) [64] 16.02 : Copyright (c) 1999-2016 Igor Pavlov : 2016-05-21
p7zip Version 16.02 (locale=en_US.UTF-8,Utf16=on,HugeFiles=on,64 bits,64 CPUs x64)

Scanning the drive:
1 folder, 22 files, 14600623056 bytes (14 GiB)

Creating archive: /home/bpcsbackup/2017-11-14~00-05-02.7z

Items to compress: 23

Files read from disk: 22
Archive size: 14600624058 bytes (14 GiB)
Everything is Ok
===========================Baidu PCS Uploader===========================
Usage: ./bpcs_uploader.php init|quickinit|quota
Usage: ./bpcs_uploader.php upload|download path_local path_remote
Usage: ./bpcs_uploader.php delete path_remote
Usage: ./bpcs_uploader.php uploadbig path_local path_remote [slice_size(default:2097152000)] [temp_dir(def:/home/bpcs_uploader-master/tmp/)]
Usage: ./bpcs_uploader.php fetch path_remote path_to_fetch

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
196 395 131 395 0 195 1418 700 --:--:-- --:--:-- --:--:-- 5882
Uploading file in pieces, 1 out of 7 parts...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2000M 0 75 100 2000M 0 488k 1:09:54 1:09:54 --:--:-- 0
Uploading file in pieces, 2 out of 7 parts...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2000M 0 75 100 2000M 0 488k 1:09:54 1:09:54 --:--:-- 0
Uploading file in pieces, 3 out of 7 parts...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2000M 0 75 100 2000M 0 488k 1:09:55 1:09:55 --:--:-- 0
Uploading file in pieces, 4 out of 7 parts...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2000M 0 75 100 2000M 0 488k 1:09:55 1:09:55 --:--:-- 0
Uploading file in pieces, 5 out of 7 parts...
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 2000M 0 75 100 2000M 0 487k 1:09:57 1:09:57 --:--:-- 0
Uploading file in pieces, 6 out of 7 parts...

curl: (35) SSL connect error
API calling faild.

Even after switching to 2G chunks it still errors out sometimes. The odds are lower than before, but it can still fail.

@1265578519 (Author)

Is there a good solution for resuming interrupted uploads?

@1265578519 (Author)

Wait, you actually know me?!

@oott123 (Owner) commented Nov 15, 2017

The chunk size should be made smaller, not larger: the bigger a chunk is, the more likely its upload fails.
After each chunk finishes uploading, its md5 should be verified, and any chunk that doesn't match the local file needs to be re-uploaded.

These are all things I should have done earlier, but I haven't had the spare energy to revisit code from years ago; I wrote it really badly back then...
If you have free time, feel free to work on it~ I'll review it~
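
A minimal sketch of that per-chunk verification idea. `upload_chunk()` is a hypothetical stand-in for the real PCS superfile upload call, assumed here to return the md5 Baidu reports back; the project's actual API differs:

```php
<?php
// Sketch only: verify each uploaded chunk by md5, re-upload on mismatch.
function upload_chunk_verified($chunk_path, $max_tries = 3)
{
    $local_md5 = md5_file($chunk_path);
    for ($try = 1; $try <= $max_tries; $try++) {
        $remote_md5 = upload_chunk($chunk_path); // hypothetical call
        if ($remote_md5 === $local_md5) {
            return $remote_md5; // verified; keep for the final merge
        }
        fwrite(STDERR, "md5 mismatch (try $try), re-uploading...\n");
    }
    throw new Exception("chunk failed md5 check after $max_tries tries: $chunk_path");
}
```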

@oott123 (Owner) commented Nov 15, 2017

> ..You mean most users don't have root privileges?

Yes. When writing a script, you should assume that most users are not running it as root. That's also the best practice: grant as few privileges as possible.

@1265578519 (Author)

The errors only happen when a new chunk request starts; nothing ever goes wrong mid-connection. So my thinking was that bigger chunks mean fewer new chunk requests, and therefore fewer chances to fail.

@oott123 (Owner) commented Nov 15, 2017

Adding retry-on-failure would be the simpler and more thorough fix.
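
As a rough illustration (again with hypothetical names, not the project's actual API), retry-on-failure could wrap each chunk upload so a transient error like `curl: (35) SSL connect error` doesn't abort the whole transfer:

```php
<?php
// Sketch only: wrap a flaky operation (e.g. one chunk upload) in a
// retry loop with capped exponential backoff.
function with_retries(callable $op, $max_tries = 5)
{
    for ($try = 1; $try <= $max_tries; $try++) {
        try {
            return $op();
        } catch (Exception $e) {
            if ($try === $max_tries) {
                throw $e; // out of attempts, give up
            }
            fwrite(STDERR, "attempt $try failed: {$e->getMessage()}, retrying...\n");
            sleep(min(60, pow(2, $try))); // back off before the next try
        }
    }
}
```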
