How to upload and download large amounts of data programmatically
Hi,
I do research at a university and need to move large amounts of data (100-500 GB) between a cluster and a box.com account regularly. I need to be able to do this programmatically, but I can't make sense of the tutorials on the box.com website. I'm reasonably fluent in a few programming languages, but I'm having a hard time figuring out:
1. which type of authorization to use
2. how to generate the appropriate keys
3. which keys go into which arguments of a curl (or other) command
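For questions 2 and 3 above, here is a minimal sketch of where the pieces go, assuming you have created a Custom App in the Box Developer Console and generated a developer token there (developer tokens expire after about an hour, so they are fine for testing; an unattended job would use OAuth 2.0 or JWT server authentication instead). `YOUR_DEVELOPER_TOKEN`, the folder id, and the file names below are all placeholders:

```shell
# Placeholder: paste the developer token from the Box Developer Console.
TOKEN="YOUR_DEVELOPER_TOKEN"

# Upload a file into a folder ("0" is the root folder of your account).
# The token goes in the Authorization header; the target folder goes in
# the "attributes" form field; the payload goes in the "file" form field.
curl -s -X POST "https://upload.box.com/api/2.0/files/content" \
  -H "Authorization: Bearer $TOKEN" \
  -F attributes='{"name":"data.tar.gz","parent":{"id":"0"}}' \
  -F file=@data.tar.gz

# Download a file by its numeric id (-L follows Box's redirect to the
# actual download URL).
curl -s -L "https://api.box.com/2.0/files/FILE_ID/content" \
  -H "Authorization: Bearer $TOKEN" -o data.tar.gz
```

Note that simple multipart uploads like this are capped; for the multi-gigabyte files you describe, Box documents a separate chunked-upload-session API.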
To be clear, I'm not in any of the complicated scenarios given as examples on the developer tutorials. My situation is:
- I have a box.com account
- I need to be able to copy folders (100-500 GB) recursively, on a regular basis, between a Linux-based cluster and my box.com account
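The recursive-copy part of the situation above reduces to walking a Box folder tree and acting on each file. A sketch of that walk, assuming a bearer token as in the auth discussion (the `list_items` parameter is a hypothetical injection point I added so the recursion can be exercised without network access; the default pages through the real `GET /2.0/folders/{id}/items` endpoint):

```python
import json
import urllib.request

API = "https://api.box.com/2.0"

def walk_folder(folder_id, token, list_items=None):
    """Yield (relative_path, file_id) for every file under folder_id.

    list_items(folder_id) must yield item dicts with "type", "id", and
    "name" keys. By default it pages through Box's folder-items
    endpoint; pass a fake for offline testing.
    """
    if list_items is None:
        def list_items(fid):
            offset = 0
            while True:
                req = urllib.request.Request(
                    f"{API}/folders/{fid}/items?limit=1000&offset={offset}",
                    headers={"Authorization": f"Bearer {token}"})
                with urllib.request.urlopen(req) as resp:
                    page = json.load(resp)
                yield from page["entries"]
                offset += len(page["entries"])
                if offset >= page["total_count"]:
                    return

    def _walk(fid, prefix):
        for item in list_items(fid):
            if item["type"] == "folder":
                yield from _walk(item["id"], prefix + item["name"] + "/")
            elif item["type"] == "file":
                yield prefix + item["name"], item["id"]

    yield from _walk(folder_id, "")
```

Each yielded file id can then be fed to a per-file download (or the reverse walk over the local filesystem, to a per-file upload).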
I found many message-board conversations that link to the same tutorials that are giving me trouble. As a starting point, can anyone recommend which type of authorization I should be using, given my situation?
Thanks,
Steve
-
One afterthought: I've already tried using curl directly (not through the API) and gotten 551 errors for some files at random (i.e., I can't figure out why some files generate errors and others don't). So I thought I would try to get the API up and running to see if it works more consistently.
-
Hi, can you point me to any tutorial, or outline your steps?
I see no one replied to your post, but it sounds like you figured it out eventually. I need to do the same thing.
In the meantime, I'm downloading each file individually to my Mac and then transferring it to the university cluster using FileZilla - very tedious and time-consuming.