Upload artifacts to AWS S3

This document describes how to upload files (for example, backup archives) to AWS S3 using Ruby and the aws-sdk gem.

Step-by-step guide

Execute the following steps:

  1. Install Ruby with the following commands on the machine where the backup data is stored:
    gpg2 --keyserver hkp://keys.gnupg.net --recv-keys 409B6B1796C275462A1703113804BB82D39DC0E3
    sudo \curl -L https://get.rvm.io | bash -s stable --ruby
    source /home/compose/.rvm/scripts/rvm   # adjust the home directory to the user running RVM
    rvm list known   # lists the available Ruby versions
    Install the version of your choice with:
    rvm install 2.3.0   # where 2.3.0 is the Ruby version to be installed
    Or install the latest Ruby version with:
    rvm install ruby --latest
    Check the installed Ruby version with:
    ruby -v
  2. Check whether RubyGems is present on your machine: gem -v
  3. If it is not present, install it with: sudo yum install rubygems
  4. Then install the AWS SDK gem: gem install aws-sdk   (a quick load check is shown after this guide)
  5. Add the code below to a file named upload-to-s3.rb (an alternative way of supplying the credentials is sketched after this guide):
    # Note: Please replace the values below with your production settings
    # 1. access_key_id
    # 2. secret_access_key
    # 3. region
    # 4. bucket name (the actual S3 bucket name, used in nfsLoc in start below)
    require 'aws-sdk'

    # Archives a directory, uploads the archive to S3 and removes the local archive.
    def upload(file_name, destination, directory, bucket)
      destination_file_name = destination

      puts "Creating #{destination_file_name} file..."

      # Create a gzipped tar archive of the directory to be uploaded
      `tar -cvzf #{destination_file_name} #{directory}`

      puts "Created #{destination_file_name} file..."

      puts "uploading #{destination} file to aws..."
      ENV['AWS_ACCESS_KEY_ID']     = 'Your key here'
      ENV['AWS_SECRET_ACCESS_KEY'] = 'Your secret here'
      ENV['AWS_REGION']            = 'Your region here'

      # The client picks up credentials and region from the environment variables above
      s3 = Aws::S3::Client.new

      # 'bucket' has the form '<bucket name>/<prefix>'; split it so the archive
      # is stored under that prefix inside the bucket
      bucket_name, prefix = bucket.split('/', 2)

      File.open(destination_file_name, 'rb') do |file|
        s3.put_object(bucket: bucket_name, key: "#{prefix}/#{file_name}", body: file)
      end
      # Legacy commented-out code from an earlier SDK version:
      # @s3 = Aws::S3::Client.new(aws_credentials)
      # @s3_bucket = @s3.buckets[bucket]
      # @s3_bucket.objects[file_name].write(file: destination_file_name)

      puts "uploaded #{destination} file to aws..."

      puts "deleting #{destination} file..."
      `rm -f #{destination}`
      puts "deleted #{destination} file..."
    end

    # Removes all existing .tar.gz archives from the backup folders
    # (not called by start; available for manual clean-up)
    def clear(nfsLoc)
      nfsLoc.each_pair do |key, value|
        puts "deleting #{key} archives..."
        Dir["#{key}/*.tar.gz"].each do |path|
          puts path
          `rm -f #{path}`
        end
        puts "deleted #{key} archives..."
      end
    end

    def start
      # Map of local backup directory => 'bucket name/prefix' in S3
      nfsLoc = { '/backup_dir' => 'bucket_name/data' }

      nfsLoc.each_pair do |key, value|
        puts "#{key} #{value}"

        Dir.glob("#{key}/*") do |dname|
          filename = '%s.%s' % [dname, 'tar.gz']

          file = File.basename(filename)
          folderName = File.basename(dname)
          bucket = '%s/%s' % [value, folderName]

          puts "..... Uploading started for %s file to AWS S3 ....." % [file]
          t = '%s/' % dname
          upload(file, filename, t, bucket)
          puts "..... Uploading finished for %s file to AWS S3 ....." % [file]
        end
      end
    end

    start

  6. Then execute the script (a sketch for verifying the upload follows this guide):
    ruby upload-to-s3.rb
  7. If adding this to a Jenkins job, add the following line to the pre-build script (a combined build step is shown after this guide):
    source ~/.rvm/scripts/rvm
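
As referenced in step 4, a quick way to confirm that the aws-sdk gem installed correctly is to load it from the command line; if the require succeeds the message is printed, otherwise Ruby raises a LoadError:

    ruby -e "require 'aws-sdk'; puts 'aws-sdk loaded'"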
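
The script in step 5 supplies the credentials through environment variables. As a minimal alternative sketch (assuming aws-sdk v2 or later; the key, secret and region strings are placeholders exactly as in the script), the same values can be passed directly to the client constructor:

    require 'aws-sdk'

    # Placeholder values; replace with your production settings, as in step 5
    s3 = Aws::S3::Client.new(
      access_key_id: 'Your key here',
      secret_access_key: 'Your secret here',
      region: 'Your region here'
    )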
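
To check the result of step 6, the uploaded archives can be listed back from S3. This is a sketch under the same assumptions as the script: the placeholder bucket name 'bucket_name', the 'data' prefix from nfsLoc, credentials exported as environment variables, and a recent aws-sdk version that provides list_objects_v2:

    require 'aws-sdk'

    # Assumes AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_REGION are set,
    # as in upload-to-s3.rb
    s3 = Aws::S3::Client.new
    resp = s3.list_objects_v2(bucket: 'bucket_name', prefix: 'data/')  # placeholder bucket and prefix
    resp.contents.each do |obj|
      puts "#{obj.key}  #{obj.size} bytes"
    end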
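
For step 7, the Jenkins pre-build (or "Execute shell") step simply combines the RVM line from step 7 with the run command from step 6:

    source ~/.rvm/scripts/rvm
    ruby upload-to-s3.rb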