
Simple way to Backup Files from an Ubuntu Server to Amazon S3
[Tips, Tricks and Tutorials] 17 MAY 2016

You can never have too many backups, and this is a simple way of backing up files from an Ubuntu server to the Amazon S3 cloud storage system.


For this, you will obviously need an Ubuntu server, an Internet connection, and of course, an Amazon AWS account.

First things first, you’ll need to generate Amazon AWS access keys, which you do from the AWS Security Credentials page (Access Keys section) in the AWS console.

Write both the Access Key ID and the Secret Access Key down somewhere safe, because you definitely don’t want to lose them. (Maybe a private Google Doc might be a good idea here?)

Now head over to the S3 Management page in the AWS console, where you will need to create the new bucket (or folder in an existing bucket) in which you want to store your backed-up files.

With your bucket created and your access details at hand, head into your Ubuntu server and install the super useful, Amazon S3-targeted s3cmd package:

sudo apt-get install s3cmd

Next, configure it by entering the requested information (your Access Key details will be needed here). Note that you have the option to encrypt the files in transit, and if you choose to do so, it is probably worth your while to jot down the password in that previously mentioned Google Doc of yours!

s3cmd --configure
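The configure step writes its answers to ~/.s3cfg. As a rough sketch (field names from a typical s3cmd install; the key values here are obviously placeholders), the relevant part of that file looks something like this:

```ini
[default]
access_key = YOUR_ACCESS_KEY_ID
secret_key = YOUR_SECRET_ACCESS_KEY
use_https = True
# Only relevant if you chose GPG encryption during --configure:
gpg_passphrase = your-encryption-password
```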

Run the connection test, and if everything passes, you should be good to go. You can check your current buckets by doing a directory listing with s3cmd:

s3cmd ls

You are pretty much there now. To do the file backup, we’ll use s3cmd’s built-in sync command. To push files to Amazon S3, we declare the parameters in the order of local files first, then target bucket. So for example, if we have an S3 bucket called server-backup and want to back up our user account’s home directory to S3, the sync call would look like this:

s3cmd sync ~/* s3://server-backup

You can of course get all clever and target specific folders, or exclude and include files and folders using wildcard patterns, etc. (See the documentation for more.) For example, here I exclude files matching the *.svn pattern using:

s3cmd sync --exclude '*.svn' ~/* s3://server-backup
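If you want to preview what a sync would do before pushing anything, s3cmd has a --dry-run flag, and --exclude can be repeated for multiple patterns. The sketch below stubs out s3cmd with a shell function that just echoes its arguments, so you can see the command shape without touching AWS; remove the stub to run it for real (the server-backup bucket name is the one assumed above):

```shell
# Stub so this sketch runs without AWS credentials - delete for real use.
s3cmd() { echo "would run: s3cmd $*"; }

# Preview the sync first; --exclude takes a glob pattern and can be repeated.
s3cmd sync --dry-run --exclude '*.svn' --exclude '*.log' ~/* s3://server-backup
```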

If you are happy with the sync result, then all that is left is to throw the command into a short bash script, give it execute rights and add it to the cron scheduled tasks system. So for example, create the file cron_s3_backup.sh:

nano /home/craiglotter/cron_s3_backup.sh

Add this text:

#!/bin/bash
s3cmd sync /home/craiglotter/* s3://server-backup/craiglotter/

Save, and make the file executable:

chmod +x ~/cron_s3_backup.sh
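If you want something slightly more robust than the one-liner, here is a sketch with timestamped logging (the paths and bucket name are the ones assumed above; adjust for your setup). It is written out via a heredoc so the sketch itself runs anywhere:

```shell
# A slightly more robust cron_s3_backup.sh, written out via a heredoc.
# Adjust the target path, log path and bucket for your own setup.
cat > /tmp/cron_s3_backup.sh <<'EOF'
#!/bin/bash
# Sync the home directory to S3 and log each run with a timestamp.
LOG=/home/craiglotter/s3_backup.log
echo "backup started: $(date)" >> "$LOG"
s3cmd sync --exclude '*.svn' /home/craiglotter/* s3://server-backup/craiglotter/ >> "$LOG" 2>&1
echo "backup finished: $(date)" >> "$LOG"
EOF
chmod +x /tmp/cron_s3_backup.sh
```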

Finally, add it to the cron in the usual manner. Open the crontab for editing:

crontab -e

Add the following line for a daily backup at 07:00 in the morning.

0 7 * * * bash /home/craiglotter/cron_s3_backup.sh >/dev/null 2>&1
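For reference, the five leading fields of that crontab line are minute, hour, day-of-month, month and day-of-week, so variants are easy to derive:

```
# m  h  dom mon dow  command
  0  7  *   *   *    bash /home/craiglotter/cron_s3_backup.sh >/dev/null 2>&1   # daily at 07:00
  0  7  *   *   1    bash /home/craiglotter/cron_s3_backup.sh >/dev/null 2>&1   # Mondays only
```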


Understanding Your Data Storage Options
[Partner Content] 01 SEP 2015

Acquiring extra data storage capacity or more computing power has generally meant buying more hardware. As drives filled up, further hardware investment was the only real solution. Nowadays, however, there is a less expensive and less disruptive option – virtualization.

What is virtualization?

Virtualization can cover the main computing functions – networking, computing power and storage – and data may still be stored locally, just as in a hardware-based arrangement.

Virtual storage

More and more IT departments are embracing virtual storage as an ideal way of adding storage facilities without the need to invest in JBOD or other external arrays.

The key benefits are cost and flexibility: you avoid the cost of buying new hardware, of employing more staff to look after it, of the extra power and cooling, and possibly of the extra space needed to accommodate it.

It’s a way of making more use of existing hardware resources, by means of either the cloud or software-defined storage.

The Cloud

Data is stored on servers operated by a third party – a cloud provider. Your own hardware resources aren’t being taken up and your data can be accessed at any time by others in your organization potentially from anywhere, and backups can be made to the cloud if desired.

Software-defined storage (SDS)

Here, existing hardware resources are effectively extended by software that manages and pools them. Your data is still held locally – increased capacity is simply achieved without the need for more hardware.

The Cloud: What is Required?

An account with a cloud services provider – you work with a third party to store your data on their servers.

This can be a potential drawback for some: the idea of sensitive data being stored on third-party servers may be a concern, and cloud computing can be slower than a local storage setup. Another issue is that if the provider’s servers fail, access to applications is lost, which leads to downtime, lost productivity and thus lost income.

Some elect to use the cloud for backup only, which means local storage requirements haven’t necessarily been reduced.

Software-defined storage: What is Required?

Two software systems to control your existing hardware – a hypervisor and a virtual SAN.

Two of the industry-leading hypervisors are vSphere from VMware and Hyper-V from Microsoft. They work with x86-type servers, so there is unlikely to be a need to invest in new hardware. Virtual SAN systems include SvSAN from StorMagic – a SAN that integrates with the two hypervisors mentioned above.

Indeed, the above SAN is operational with just two servers – and this includes situations where the IT facility has to serve multiple remote locations. These can be controlled from a single point – a very valuable benefit, as extra data demands often stem from new branches and the like opening. Previously this would have necessitated investment in local IT infrastructure.

How easy is it to transition to a virtual SAN?

There will be a learning curve, but with the right virtual SAN vendor support, the transition will be much quicker and easier than installing new hardware – not to mention avoiding the downtime that would likely involve.
