Using the AWS CLI to Copy Files to an S3 Bucket

Using the AWS CLI in Linux

There are three main ways to upload and download data in Amazon S3: the AWS Command Line Interface (CLI), the AWS SDKs, and the S3 REST API. In this article, we will explore the command line interface and the most common commands for managing an S3 bucket.

The maximum size of a file that you can upload through the Amazon S3 console is 160 GB, and the maximum size of a single object is 5 TB. With the low-level s3api commands, a single PUT is limited to 5 GB, so larger files must be sent as a multipart upload (the higher-level s3 commands handle this automatically). The command line tools can achieve upload speeds greater than 7 MB/s, and you can go even faster by enabling S3 Transfer Acceleration, though this is not always worthwhile because it incurs an additional cost.
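If you do decide to use Transfer Acceleration, it is enabled per bucket and then switched on for the CLI. A minimal sketch, assuming a bucket named my-bucket:

```shell
# Enable S3 Transfer Acceleration on the bucket (one-time setup;
# additional transfer costs apply once enabled).
aws s3api put-bucket-accelerate-configuration \
    --bucket my-bucket \
    --accelerate-configuration Status=Enabled

# Tell the CLI to use the accelerated endpoint for s3 commands.
aws configure set default.s3.use_accelerate_endpoint true
```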

Common Switches

  • --dryrun = display the operations that would be performed, without actually running the command.
  • --summarize = include a total object count and total size at the end of the output.
  • --human-readable = show file sizes in units such as KiB, MiB, and GiB instead of raw bytes.
  • --output text = return the output as tab-delimited text, one result per line.
  • --content-type text/plain = tell AWS the MIME type of the uploaded data (here plain text, rather than video or another type).
  • --recursive = perform the command on all files under the specified directory or prefix.
  • --exclude = leave out files that match the given pattern.
  • --include = re-include files that match the given pattern (applied after --exclude).
  • --delete = (with sync) remove files from the destination that no longer exist in the source.
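As an example of the first switch, --dryrun can be combined with sync to preview an upload before committing to it. A sketch, assuming a local directory ./data and a placeholder bucket name:

```shell
# Preview which files sync would upload, without transferring anything.
aws s3 sync ./data s3://my-bucket/data --dryrun
```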

List Contents of a Bucket
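A typical listing command, combining the switches above (my-bucket is a placeholder bucket name):

```shell
# List every object in the bucket, with sizes in human-readable units
# and a total count and size at the end.
aws s3 ls s3://my-bucket --recursive --human-readable --summarize
```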

Copy a Single File

If the file is large, the cp command will automatically handle a multi-part upload dynamically. If the full path is not present, it will create it automatically in the s3 bucket.
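A minimal sketch, assuming a local file report.txt and a placeholder bucket and prefix:

```shell
# Copy one local file into the bucket; the backups/ prefix is created
# automatically, and large files are split into a multipart upload.
aws s3 cp ./report.txt s3://my-bucket/backups/report.txt
```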

Copy Multiple Files in a Local Directory

There are two commands that can be used to copy multiple files: sync, or cp with the --recursive switch.
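Both approaches are sketched below, assuming a local directory ./data and a placeholder bucket name:

```shell
# Mirror the local directory to the bucket (only new or changed
# files are uploaded).
aws s3 sync ./data s3://my-bucket/data

# Or copy everything under the directory unconditionally.
aws s3 cp ./data s3://my-bucket/data --recursive
```

sync is usually preferable for repeated uploads of the same directory, since it skips files that already exist unchanged in the bucket.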

Copy only Files with .sum Extension
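This uses the --exclude and --include filters together; a sketch with the same placeholder paths:

```shell
# Exclude everything, then re-include only *.sum files.
# --include filters are applied after --exclude, so the order matters.
aws s3 cp ./data s3://my-bucket/data --recursive \
    --exclude "*" --include "*.sum"
```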

Copy Directory and Exclude Two Files
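A sketch using two hypothetical file names; each file to skip gets its own --exclude flag:

```shell
# Copy the whole directory but leave out two specific files.
aws s3 cp ./data s3://my-bucket/data --recursive \
    --exclude "notes.txt" --exclude "debug.log"
```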
