
Script to backup Oracle database on tape and ftp server

I had a production Oracle database server that was configured to back up a full export of the database to a tape drive connected to the server. The vendor had set it up, and it worked for some time. As usually happens, the server kept running fine and we "thought" the backup was working fine too. I recently checked the backup as part of a disaster recovery implementation and found it had not been working for several months. Hence I wanted a simple script that writes the backup to two different locations, so that I am safe if any disaster happens. I Googled and found a lot of scripts from several people, especially Cyberciti.
I thank you all for publishing the knowledge you have acquired to help others, so I feel I should also give back to the world what I learned from you.
Below is the script I put together from several websites. It is self-explanatory, but please do tweak it to your needs. If you run into any issues, please get in touch with me so that I can help you.
I assume you have basic knowledge of Linux, shell scripting, Oracle, etc. I deliberately do not compress the backup: compression complicates things and could cause decompression issues during a future restore. Moreover, my database was more than 7.5 GB.

================================Script=====================================
#!/bin/bash
######
######Script for full export backup of DB to Tape and FTP
######Created by Anil Kumar
######

########Variable declaration#######
USERNAME="ftpusername"
PASSWORD="ftppassword"
SERVER="ip.add.re.ss"
FOLDER="/path/to/tempworkingfolder"
export ORACLE_SID=oraclesid
export ORACLE_HOME=/path/to/oracle/home
export CURR_DATE=`/bin/date +%d%m%y_%H%M%S`
PATH=$PATH:$ORACLE_HOME/bin;export PATH

########Oracle Export#######
exp dbauser/dbauserpass file=/path/to/tempworkingfolder/filename_${CURR_DATE}_department.dmp log=/path/to/tempworkingfolder/filename_${CURR_DATE}_department.log compress=y statistics=none grants=y full=y
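######## (Suggested addition, not in the original script) Stop here if  ########
######## the export itself failed, so a partial or empty dump is never  ########
######## taped and uploaded. exp returns non-zero on error.             ########
RC=$?
if [ $RC -ne 0 ]; then
  echo "Oracle export failed with status $RC, aborting backup" >&2
  exit 1
fi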

########Status of the tape########
mt -f /dev/st0 status > /path/to/tempworkingfolder/tapestatus_${CURR_DATE}_department.log

########Rewinding the tape########
mt -f /dev/st0 rewind
cd $FOLDER

########Copying files to the tape########
tar -cpf /dev/st0 *.*

########List of files on tape to make sure that the backup works fine########
tar -tvf /dev/st0 > /path/to/tempworkingfolder/backup_${CURR_DATE}_department.log

########Eject the tape; this indirectly confirms the script ran to the end and the backup likely worked########
mt -f /dev/st0 eject

#########FTP Upload#########
cd $FOLDER
ftp -n -i $SERVER <<EOF
user $USERNAME $PASSWORD
mput *
quit
EOF
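######## (Suggested addition, not in the original script) ftp exits 0   ########
######## even when uploads fail, so scan the session transcript for     ########
######## errors. This only works if the ftp block above is changed to   ########
######## log its output, e.g.:                                          ########
########   ftp -n -i $SERVER > $FOLDER/ftp_${CURR_DATE}.log 2>&1        ########
FTPLOG=$FOLDER/ftp_${CURR_DATE}.log
if [ -f "$FTPLOG" ] && grep -qi "refused\|failed\|not connected" "$FTPLOG"; then
  echo "FTP upload reported errors - check $FTPLOG" >&2
fi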

#########Deleting files in temp working folder other than the script#########
rm -rf filename* *.log

================================Script=====================================
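To run the backup unattended, the script can be scheduled with cron. A minimal sketch, assuming the script is saved as /path/to/backup.sh and made executable; the 2 a.m. schedule and log path are placeholders, so adjust them to your environment:

```shell
# crontab entry: minute hour day-of-month month day-of-week command
# Run the backup every night at 02:00 and append output to a log.
0 2 * * * /path/to/backup.sh >> /path/to/tempworkingfolder/cron_backup.log 2>&1
```

Remember to add this with `crontab -e` as a user that has permission to run exp and access the tape device.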
