Linux-Client Backup to Remote Host & Cloud

Adam Russak
|
Jun 26, 2018
alt="blogs"
alt="blogs"
Events
title="Google"
alt="blogs"
Event

I was recently tasked with backing up critical data on a Linux virtual machine (Ubuntu 14 server). The scenario: a Linux VM that needs to be backed up daily, with a full backup once a week and incremental backups (only the data that changed since the previous run) on the other days. All backups must be kept both on-prem and off-site.

Along the way I discovered that there are many ways to perform backups, but I did not find one that fully met my needs. After a short bit of research, and some trial and error, this is the solution I arrived at.

Contents

Introduction
The Scenario
  The backup I needed to make was
  The Limitations are
My Solution
  Flow Chart
  1) Backup-Ninja
  2) rdiff-backup
  3) rclone
Pre-Requisite
  “Backup-Host” Installation
  “Target-VM” Installation
Setting up the Backup Plan
Creating an SA JSON File in GCP
Setting up rclone
  Pre-requisite
  Configurations
Building the Backup Script
Dry-Testing the Backup-Task
  Testing Backup-ninja task
  Testing the rclone Configuration
  Testing the Script
Placing the Task in Crontab
Resources

Introduction

Recently I was tasked with backing up critical data on a Linux VM (Ubuntu 14 server). As I went at it, I found out there are many ways to back up, but I didn't find "The One" that would give me the solution I needed. After some research and trial and error, this is the solution I ended up using; I hope it helps somebody.

The Scenario

The backup I needed to make was:

A Linux VM on-prem that needed to be backed up daily: one full backup every week, and incremental backups the rest of the time. All the backups need to be kept both on-prem and off-site.

The Limitations are:

1. As little data transfer as possible

2. As little downtime as possible

3. Fast and reliable recovery

My Solution

Flow Chart

So what did I do? Good question! I used three different tools to build one good solution (for me!).

1) Backup-Ninja

A backup manager with many great qualities; what I needed from it was a simple setup, a clear outcome, and a good understanding of the backup procedure. It is a "manager" because it doesn't actually back anything up: the program passes the "how", "when", and "what" to another program (rdiff-backup in my case).

2) rdiff-backup

A backup tool that can do many nice things, such as incremental backups to a remote host. A great feature I loved in this program is that a single command can restore any backup point you have saved (e.g., the state from six days ago instead of the latest one); see the sketch below.
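To make the point-in-time restore concrete, here is a hedged sketch using this article's paths (run from the Target-VM; 192.168.47.133 is the Backup-Host used later in the backup script, and the local destination folder is hypothetical):

# restore the backup as it looked 6 days ago (the "6D" time spec) into a new local folder
rdiff-backup -r 6D adam@192.168.47.133::/home/adam/Backup/ubu14-t /home/adam/restore-6d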

3) rclone

rclone is an open-source tool that lets you connect and transfer data from your host to a bucket in the cloud; in my case it lets the remote host sync to GCP. I have configured it to sync the backup location to a GCP bucket I created (I won't address that part here).
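rclone's core operation here is a one-way sync from a local directory to a remote bucket path; with the remote name and bucket set up later in this article, it looks like this:

rclone sync /home/adam/Backup/ubu14-t host2gcp:adam-backup/Linux-dest-Bucket/ -v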

Pre-Requisite

“Backup-Host” Installation:

1. Create a VM/host to serve as the "Backup-Host" (we created an Ubuntu 16 machine).

2. Install OpenSSH on the "Backup-Host":

sudo apt install openssh-server -y

3. After the installation we can start working with an SSH client (I use PuTTY).

4. Install rclone on the "Backup-Host":

curl https://rclone.org/install.sh | sudo bash

5. Create the backup destination directories on the "Backup-Host":

mkdir Backup && mkdir Backup/ubu14-t

6. Install rdiff-backup:

sudo apt-get update && sudo apt-get install librsync-dev rdiff-backup -y
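Before moving on, a quick sanity check that everything landed; these are just each tool's standard version flags plus a look at the directory we created:

ssh -V                  # OpenSSH version
rclone version          # confirms the rclone install script succeeded
rdiff-backup --version  # rdiff-backup is available
ls -ld Backup/ubu14-t   # the backup destination directory exists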

“Target-VM” Installation:

1. Install backupninja on the "Target-VM":

sudo apt-get update && sudo apt-get install backupninja duplicity rdiff-backup python-paramiko python-gobject-2 -y

2. When the mail option prompt appears, choose local-only.

3. Install OpenSSH on the "Target-VM".

Setting up the Backup Plan:

1. Create SSH keys for automatic connection from the "Target-VM" to the "Backup-Host":

1.1. On the "Target-VM", create the SSH keys:

ssh-keygen

1.2. After creation, copy the public key to the Backup-Host:

ssh-copy-id adam@<backup-server-ip/hostname>

1.3. When prompted, answer yes.

1.4. Enter the password when asked.

2. Check the connection:

ssh adam@<backup-server-ip/hostname>

2.1. If it worked, no password will be requested.

3. Create a backup task in the ninjahelper GUI:

3.1. All backupninja tasks should be run under sudo:

sudo ninjahelper

3.2. Select new.

3.3. Select the appropriate method of backup/copy (we will use rdiff for incremental backups).

3.4. Select src to start configuring your backup plan.

3.5. In src, set the path to the folders we want to back up (we will back up the FilesToBackup folder in /home/adam/).

3.6. Press OK.

3.7. In the next window, configure which files or folders to ignore (you can add and remove as needed).

3.8. Press OK when done.

3.9. The menu appears again (this time src is marked done); choose dest.

4. In the dest window, set all the remote-host settings:

4.1. keep: how far back incremental backups are retained (we will set it to 30d).

4.2. dest_directory: the location where your backup will be saved (we will set it to the directory we created earlier, /home/adam/Backup/ubu14-t/).

4.3. dest_host: the Backup-Host's IP or FQDN.

4.4. dest_user: the user we will use to connect to the Backup-Host (we will use adam, since we created the SSH key for that user earlier).

4.5. dest_type: we will leave it as remote.

4.6. Press OK.

5. Press conn to set up the SSH connection to the remote host (we already did, so this will just check that everything is good to go).

6. In the next window, press OK to allow backupninja to check the connection we configured and register a new SSH key for it.

7. We received an error stating that we need permissions on the dest directory.

8. On the Backup-Host, run:

sudo chown -R adam:root Backup/

(This changes the folder owner to adam; the -R flag applies the change recursively inside the folder.)

9. After the folder owner changes, try to connect again (if done right, the connection check will pass).

10. Press Finish.

11. The task is now listed; if we want to make changes, we can select that task.

12. To add flags to the task, select our task from the menu.

13. In this menu, choose 'xedit' to edit our backup configuration task.

14. An external editor will open with the task configuration.

15. Add the flag in the options section and remove the # (we add the --preserve-numerical-ids flag to the backupninja task); a sketch of the resulting task file follows below.

• The available flags differ per backup type you select (we used rdiff-backup).
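For orientation, a minimal sketch of what the rdiff task file (/etc/backup.d/90.rdiff, the file our backup script calls later) can end up looking like; the layout follows backupninja's rdiff handler, and the values are simply this article's examples:

# extra rdiff-backup flags, uncommented per the step above
options = --preserve-numerical-ids

[source]
type = local
# how long to retain increments (plain numbers are read as days)
keep = 30
include = /home/adam/FilesToBackup

[dest]
type = remote
directory = /home/adam/Backup/ubu14-t
host = 192.168.47.133
user = adam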

Creating an SA JSON File in GCP

1. Connect to your GCP account (we will do it via the web console, but it is possible via the SDK; see the sketch after this list).

2. Go to IAM & admin > Service accounts.

3. Press 'CREATE SERVICE ACCOUNT'.

4. Fill in all the details:

4.1. Service account name: the name of your SA.

4.2. Role: the permissions this SA will have (we will give it the Storage Object Admin role).

4.3. Choose to download the JSON file (the private key of the SA).

5. If all was done correctly you will receive the JSON file and a confirmation message.

6. Go to Storage: Menu > Storage.

7. Create a new bucket.

8. Give your bucket a name and choose its storage class and location (we will use Nearline and a US bucket).

9. After we created the new bucket, we need to create our destination directory inside it (we called it Linux-dest-Bucket).
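For the SDK route mentioned in step 1, a hedged equivalent for the bucket part (the bucket name is taken from the rclone sync target used later in this article; gsutil must already be authenticated):

gsutil mb -c nearline -l us gs://adam-backup/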

Setting up rclone

Pre-requisite

To create a GCP remote in rclone you will need to collect the following details:

1. Project ID: testing-adam

2. Project Number: 14699330635992

3. Transfer the JSON file to the Backup-Host:

3.1. Open WinSCP and log in to the Backup-Host.

3.2. Transfer the JSON file to a predetermined location (we will place it at /home/adam/Backup).

Configurations:

1. Start the rclone configuration on the Backup-Host:

rclone config

2. We will create a new remote and give it a name (we will name it host2gcp).

3. Next we will choose the storage type that fits our backup plan (we will use a GCP bucket, i.e. Google Cloud Storage).

4. In the next step press Enter to leave the client_id empty.

5. We will leave the client_secret empty as well.

6. When prompted for the project number, enter it (14699330635992).

7. When asked for the JSON file path, give the path where we saved the JSON file (/home/adam/Backup/Testing-Adam-080a0a91846d.json).

8. When asked for object_acl, leave it empty (it will use 4 as the default).

9. When asked for bucket_acl, leave it empty (it will use 2 as the default).

10. When asked for the storage location, press 4 (4 = US; use the location of your bucket).

11. For the storage class we will use 4 (NEARLINE).

12. The final step is to review the configuration; check that all is well and, if it is, press y. (A sketch of the resulting config entry follows below.)
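The wizard stores these answers in rclone's config file. As a rough sketch, assuming the choices above (key names follow rclone's "google cloud storage" backend; values are this article's, and prompts left empty simply won't appear as lines):

[host2gcp]
type = google cloud storage
project_number = 14699330635992
service_account_file = /home/adam/Backup/Testing-Adam-080a0a91846d.json
location = us
storage_class = NEARLINE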

Building the Backup Script

• Our script will have only two lines, but every system and/or service needs its own set of commands (for example, certain apps need their services stopped for a consistent backup, so we would stop them before the backup and start them again afterwards).

1. Open a new text file with a text editor on the Target-VM:

nano backup-script.sh

2. Enter our script commands:

#!/bin/bash
sudo backupninja -d --run /etc/backup.d/90.rdiff
ssh adam@192.168.47.133 "rclone sync /home/adam/Backup/ubu14-t host2gcp:adam-backup/Linux-dest-Bucket/ -v"

3. Save the script and exit.

4. To finish the script, change its permissions from the default 664 to 760:

sudo chmod 760 backup-script.sh
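The script above is intentionally minimal. If you want a run to stop at the first failure, so a failed backupninja pass never triggers a cloud sync of a bad backup, here is a hedged variant with the same commands:

#!/bin/bash
# abort on any error, unset variable, or failure inside a pipeline
set -euo pipefail
sudo backupninja -d --run /etc/backup.d/90.rdiff
ssh adam@192.168.47.133 "rclone sync /home/adam/Backup/ubu14-t host2gcp:adam-backup/Linux-dest-Bucket/ -v"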

Dry-Testing the Backup-Task

We will test each stage to see that it works properly, and after testing each one we will run the full task to see that everything works together.

Testing Backup-ninja task

1. To test backupninja, run 'do a test run' from the ninjahelper GUI.

2. If it runs successfully you will receive a bottom line reporting one action run and no errors.

Testing the rclone Configuration

1. To test rclone, run the sync command with the --dry-run flag, as shown below.

2. If the task succeeds you will get a summary of the files that would have been uploaded, and no errors.
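Concretely, on the Backup-Host this is our sync command from the script with the flag appended:

rclone sync /home/adam/Backup/ubu14-t host2gcp:adam-backup/Linux-dest-Bucket/ -v --dry-run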

Testing the Script

1. Testing the script is the first "live" run we make, as it executes the script with no dry-run flags.

2. On our Target-VM, go to the folder where the script is located:

cd /home/adam/FilesToBackup/

3. Execute the script:

./backup-script.sh

4. If successful, you will see the same outputs as the previous tests, run one after the other.

Placing the Task in Crontab

After we have checked our plan, we will place it in crontab to run automatically as needed (we will make it a daily task).

1. On the Target-VM, open the root crontab:

sudo crontab -e

2. If it's your first time, you will need to choose an editor (we go with nano).

3. In the crontab, go to the bottom and add the new task (we will make it run every morning at 5 AM):

0 5 * * * /home/adam/FilesToBackup/backup-script.sh

4. Save and exit the editor.
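Optionally, if you also want a record of each nightly run, a common tweak (the log path is just an example) is to redirect the script's output in the crontab entry:

0 5 * * * /home/adam/FilesToBackup/backup-script.sh >> /home/adam/backup-script.log 2>&1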

Resources

http://bit.ly/2I4nXSn

https://rclone.org/install/

http://bit.ly/2KlqM2o

