I would like to schedule PostgreSQL database backups. To configure that I should use pgAgent and pgAdmin3, I think, but I'm struggling with installing pgAgent on the servers (Mac OS X 10.6.x / Debian 5).
In the documentation of pgAgent/pgAdmin I found:
"Open pgAdmin > connect to the postgres database > open the SQL tool > 'Open' option from the file menu > find the 'pgagent.sql' > click 'Run' once the script is loaded"
The pgagent.sql file is not installed by the EnterpriseDB PostgreSQL 9.0.1 .dmg/.tar.gz. I couldn't find anything about pgAgent in the installation directory.
I have downloaded pgAgent-3.0.0-Darwin.tar.gz from the pgAdmin website for the Mac and unzipped the file.
Now I repeated the steps above, loaded the pgagent.sql script into the SQL tool and executed it with Run.
pgAgent should now be installed as a schema in the postgres database.
But I cannot find pgAgent there, and when I try a reinstall I get the message that pgagent is already installed!?
–
Ahh…, now I have a 'Jobs' node in the tree view! A right-click on it opens the pgAgent Job dialog. Cool, but how can I configure a backup routine and set the path to the backup directory? In the Job dialog I can set "Routine Maintenance", "Data Import", "Data Export", "Data Summarisation" and "Miscellaneous".
–
Ok, I googled and found that I need a shell script for the backup routine, stored in some directory and executed by the pgAgent job. What I found is a script like this:
#!/bin/bash
# backup directory can be a file server share that the PgAgent daemon account has access to
BACKUPDIR="/path/to/backup"
PGHOST="localhost"
PGUSER="postgres"
PGBIN="/usr/bin"
thedate=`date --date="today" +%Y%m%d%H`
themonth=`date --date="today" +%Y%m`
# create a full backup of the server databases
$PGBIN/pg_dumpall -h $PGHOST -U $PGUSER | gzip > $BACKUPDIR/fullbackup-$themonth.sql.gz
# put the names of the databases you want to create an individual backup of below
dbs=(db1 db2 db3)
# iterate thru dbs in dbs array and backup each one
for db in ${dbs[@]}
do
    $PGBIN/pg_dump -i -h $PGHOST -U $PGUSER -F c -b -v -f $BACKUPDIR/$db-$thedate.compressed $db
done
# this section deletes the previous month's same-day backup, except for the full server backup
rm -f $BACKUPDIR/`date --date="last month" +%Y%m%d`.compressed
But this does not run, because the date format apparently needs to be changed. How do I get this to work?
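As a side note on the date format: `--date` is GNU date syntax (available on Debian), while the BSD date shipped with Mac OS X 10.6 has no `--date` switch and uses `-v` offsets instead. A minimal sketch of both forms, reusing the variable names from the script above:

```shell
# GNU date (Debian): relative dates via --date.
# Note the command substitution -- without $(...) or backticks the
# assignments just store the literal text "date --date=...", not a date.
thedate=$(date --date="today" +%Y%m%d%H)      # format YYYYMMDDHH
lastmonth=$(date --date="last month" +%Y%m)   # format YYYYMM
# BSD date (Mac OS X 10.6): relative dates via -v adjustments, e.g.:
#   thedate=$(date +%Y%m%d%H)
#   lastmonth=$(date -v-1m +%Y%m)
echo "$thedate $lastmonth"
```

So a script using `--date` will fail on the Mac even though it runs fine on Debian.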
I don’t use PgAgent to schedule my backups. I simply use launchd for this (on Mac) or cron on Linux.
Launchd then triggers the following shell script, which backs up each database in the cluster (excl. the default postgres db) and the global settings (roles, groups, etc.) into separate backup files. Having them separate makes it easier to do a restore per database.
Adjust the used parameters accordingly:
#!/bin/bash
#
# This script will dump all databases of the PostgreSQL cluster (excl. template1, template0 and postgres)
# into a directory.
# After all databases are dumped then it will dump the global data like users/roles/etc.
#
# Backup times are logged in a log file in the same backup directory.
#
# NOTE and WARNING: Before dumping the databases it will clear ANY files from the backup directory.
#
# --------------------------------------------------------------------------------------------------------
#
# For more info on the switches for pg_dump and pg_dumpall use the --help switch with these 2 commands
#
# ===
# Parameters you can change
#
pg_tools="/Library/PostgreSQL/9.0/bin" # path to the pgsql command-line tools, adjust accordingly
pg_host="-h localhost" # leave empty if you want to use the local UNIX socket
# default in this script is "-h localhost" and will use TCP/IP
use_gzip=false # gzip the dump files
backup_dir="/pgsql-backup"
logfile="$backup_dir/postgresql_backup.log"
# ===
databases=`$pg_tools/psql $pg_host -U postgres -d postgres -l | sed -n 4,/\eof/p | grep -v rows\) | awk {'print$1'} | grep -v \| | grep -v template | grep -v postgres`
# clear backup folder for new data dump
rm $backup_dir/*
#rm $logfile # only needed when you want the logfile to be in another location other than the backup directory.
touch $logfile
# start logging
timeinfo=`date '+%T %x'`
echo "Starting Backup at $timeinfo" >> $logfile
for i in $databases
do
    if $use_gzip; then
        # plain SQL dump, zipped. Requires a query editor to restore (like psql)
        $pg_tools/pg_dump $i $pg_host -U postgres | gzip > "$backup_dir/$i.sql.gz"
    else
        # plain SQL dump, non-zipped. Requires a query editor to restore (like psql)
        # $pg_tools/pg_dump $i $pg_host -U postgres > "$backup_dir/$i.sql"
        # database dump using the custom format. Requires pg_restore (or PgAdmin's restore function) to restore
        $pg_tools/pg_dump $i $pg_host -U postgres -F c -f "$backup_dir/$i.dump"
    fi
    # do some logging
    timeinfo=`date '+%T %x'`
    echo "Backup complete at $timeinfo for database: $i" >> $logfile
done
# get users, roles and other global data
if $use_gzip; then
    # zipped
    $pg_tools/pg_dumpall -g $pg_host -U postgres | gzip > "$backup_dir/pg_global_data.sql.gz"
else
    # not zipped
    $pg_tools/pg_dumpall -g $pg_host -U postgres > "$backup_dir/pg_global_data.sql"
fi
# do some logging
timeinfo=`date '+%T %x'`
echo "Backup complete at $timeinfo for global data" >> $logfile
Thank you very much for your help and the script!
My situation is that I have to administer several Servoy app servers, currently on Mac OS X and Debian 5. Furthermore, I have migrated the customers' SQL databases to PostgreSQL 9.0.1 because I would like to administer them centrally, and as easily as possible, from only one computer (my MacBook Pro) via pgAdmin. I thought I would save time by managing everything about the databases, incl. the backup jobs, in pgAdmin (like in Sybase Central). But the script file I need (found on the Internet and modified) is not working correctly. I always get errors like "permission denied" or "format not correct" while testing the script by executing it from the Terminal.
However, I will keep trying to get this to work.
I have tried your shell script for backing up my PostgreSQL databases on a Mac server by executing it in the Terminal. The script works so far, but I have to enter the password for the user postgres.
How can I modify the script so that the password is supplied automatically?
Also, I only get the pg_global_data.sql file, but no .sql files for the individual databases in my backup directory!?
I have modified your shell script and got it to work, except for this issue:
pg_dump: [archiver (db)] connection to database "|" failed: FATAL: database "|" does not exist
This is the script:
#!/bin/bash
#
#This script will dump all databases of the PostgreSQL cluster (excl. template1, template0 and postgres)
#into a directory.
#After all databases are dumped then it will dump the global data like users/roles/etc.
#
#Backup times are logged in a log file in the same backup directory.
#
#NOTE and WARNING: Before dumping the databases it will clear ANY files from the backup directory.
#
# --------------------------------------------------------------------------------------------------------
#
#For more info on the switches for pg_dump and pg_dumpall use the --help switch with these 2 commands
#
# ===
# Parameters you can change
#
pg_tools="/Library/PostgreSQL/9.0/bin" # path to the pgsql command-line tools, adjust accordingly
pg_host="-h localhost -p 5432" # leave empty if you want to use the local UNIX socket
# default in this script is "-h localhost" and will use TCP/IP
use_gzip=false # gzip the dump files
export PGPASSWORD=xxxx
backup_dir="/Users/admin/Backup-PSQL"
timeinfo1=`date '+%y%m%d'`
logfile1="$backup_dir/pg_backup_start-$timeinfo1.log"
logfile2="$backup_dir/pg_db_backup_end-$timeinfo1.log"
logfile3="$backup_dir/pg_glo_backup_end-$timeinfo1.log"
# ===
# databases=`$pg_tools/psql $pg_host -U postgres -d postgres -l | sed -n 4,/\eof/p | grep -v rows\) | grep -v template | grep -v postgres | awk {'print$1'}`
databases=`$pg_tools/psql $pg_host -U postgres -d postgres -l | sed -n 4,/\eof/p | grep -v rows\) | grep -v template |awk {'print$1'}`
# clear backup folder for new data dump
#rm $backup_dir/*
#rm $logfile # only needed when you want the logfile to be in another location other than the backup directory.
touch $logfile1
# start logging
timeinfo=`date '+%T %x'`
echo "Starting Backup at $timeinfo" >> $logfile1
for db in $databases
do
    if $use_gzip; then
        # plain SQL dump, zipped. Requires a query editor to restore (like psql)
        $pg_tools/pg_dump $db $pg_host -U postgres | gzip > "$backup_dir/$db.sql.gz"
    else
        # plain SQL dump, non-zipped. Requires a query editor to restore (like psql)
        # $pg_tools/pg_dump $db $pg_host -U postgres > "$backup_dir/$db.sql"
        # database dump using the custom format. Requires pg_restore (or PgAdmin's restore function) to restore
        $pg_tools/pg_dump $db $pg_host -U postgres -F c -f "$backup_dir/$db.dump"
    fi
    # do some logging
    timeinfo=`date '+%T %x'`
    echo "Backup complete at $timeinfo for database: $db" >> $logfile2
done
export PGPASSWORD=
# get users, roles and other global data
if $use_gzip; then
    # zipped
    $pg_tools/pg_dumpall -g $pg_host -U postgres | gzip > "$backup_dir/pg_glo_data_$timeinfo1.sql.gz"
else
    # not zipped
    $pg_tools/pg_dumpall -g $pg_host -U postgres > "$backup_dir/pg_glo_data_$timeinfo1.sql"
fi
# do some logging
timeinfo=`date '+%T %x'`
echo "Backup complete at $timeinfo for global data" >> $logfile3
Do you, or does anybody, have an idea how I can solve the problem?
You should use a password file.
This way you don’t have to put the password in your backup script (which is very insecure).
Just create a .pgpass file (on Windows this is pgpass.conf) in the home directory of the postgres system user (for you that is /Library/PostgreSQL/9.0/) and put in the following information:
hostname:port:database:username:password
So that might be something like this:
localhost:5432:*:postgres:xxxxx
Please read the linked page about setting up the correct unix permissions on this file.
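The Unix permission requirement amounts to this: the file must be readable and writable only by its owner (mode 0600), otherwise libpq ignores it with a warning. A minimal sketch, using a temp path as a stand-in for the real ~/.pgpass and a placeholder password:

```shell
# Create a .pgpass-style file and restrict it to owner read/write (0600).
# The entry uses a wildcard for the database field, as in the thread;
# the path and password here are demo placeholders only.
pgpass="${TMPDIR:-/tmp}/pgpass-demo"
cat > "$pgpass" <<'EOF'
localhost:5432:*:postgres:xxxxx
EOF
chmod 600 "$pgpass"   # looser modes make libpq skip the file with a warning
ls -l "$pgpass"
```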
As for your database "|" error: I tested your execution string and it all gives the correct result. I can't see any other errors in your script either. I suggest you put some echoes in the loop and see what data you get back.
I also see you got rid of the exclusion rule for the 'postgres' database. Why do you want to back up the default 'postgres' database? Or did you use 'postgres' in your own database names as well?
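As an aside on that "|" error: it comes from scraping the aligned output of `psql -l`, whose access-privilege continuation lines begin with a pipe character. A sketch of an alternative that avoids the sed/grep/awk parsing entirely, assuming the same localhost/postgres connection used in this thread (wrapped in a hypothetical helper function so nothing runs until a server is actually available):

```shell
# Hypothetical helper: ask the server for the database list instead of
# scraping `psql -l`. The -A (unaligned) and -t (tuples only) switches
# print one bare database name per line, so no text cleanup is needed.
list_databases() {
    psql -h localhost -p 5432 -U postgres -d postgres -At \
        -c "SELECT datname FROM pg_database WHERE NOT datistemplate AND datname <> 'postgres'"
}
# usage: for db in $(list_databases); do echo "would dump: $db"; done
```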
It's curious, but if I include the postgres db in the backup, all my databases get backed up (including the non-existent "|" db). If I exclude the postgres db (grep -v postgres), only the global data gets backed up!? That's the reason why.
Now I have created a .pgpass file, but how do I have to change the script to integrate it?
export PGPASSWORD=~/.pgpass ??
BTW, I have changed the logging to use only one file. For me it's better to have all logs in one file…
tgs:
It's curious, but if I include the postgres db in the backup, all my databases get backed up (including the non-existent "|" db). If I exclude the postgres db (grep -v postgres), only the global data gets backed up!? That's the reason why.
You mean you don't have any databases in there other than the templates and the default postgres database?
If you do have databases in there, how are they named?
tgs:
Now I have created a .pgpass file, but how do I have to change the script to integrate it?
export PGPASSWORD=~/.pgpass ??
You can leave that part out of the script. pg_dump will look for the file itself. The script I posted in this thread works as-is with it.
What user do you run this script as?
tgs:
BTW, I have changed the logging to use only one file. For me it's better to have all logs in one file…
As you can see in the aforementioned script I also log to one file only.
ROCLASI:
You mean you don't have any databases in there other than the templates and the default postgres database?
If you do have databases in there, how are they named?
Of course I do have other databases, like core_data, i18n_data, servoy_repository and so on. Backing them up works if I include the postgres db in the backup, but not if I exclude it!?? However, it's not a real problem for me to have a backup of the postgres db as well.
ROCLASI:
You can leave that part out of the script. pg_dump will look for the file itself. The script I posted in this thread works as-is with it.
What user do you run this script as?
You are right, the script works without asking for the password… cool… I have stored the .pgpass file in the user directory (~/.pgpass) and the script runs as the admin user, so it works.
Next I will create a new crontab entry to execute the backup script twice a day automatically. But first I have to learn how to edit and save with vi. Normally I prefer to use the nano editor.
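For what it's worth, the crontab step doesn't require learning vi: `crontab -e` honors the EDITOR environment variable, so nano works too. A sketch of a twice-daily entry (the script path is an assumption, not from this thread):

```shell
# A cron entry that would run the backup at 06:00 and 18:00 every day.
# Adjust the path to wherever the backup script actually lives.
cron_line='0 6,18 * * * /Users/admin/Backup-PSQL/pg_backup.sh'
echo "$cron_line"
# To edit the crontab with nano instead of vi:
#   EDITOR=nano crontab -e
# then paste the line above and save.
```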
Ah, I found the issue.
Since I don't use the postgres user as the owner of my own databases I don't have this issue, but you apparently do use the postgres user.
The following code should work for you:
databases=`$pg_tools/psql $pg_host -U postgres -d postgres -l | sed -n 4,/\eof/p | grep -v rows\) | awk {'print$1'} | grep -v template | grep -v postgres`
Your new code is working! Now all my databases get backed up, without the template and postgres dbs. But the issue is still there:
pg_dump: [archiver (db)] connection to database "|" failed: FATAL: database "|" does not exist
This is my script now:
#!/bin/bash
#
#This script will dump all databases of the PostgreSQL cluster (excl. template1, template0 and postgres)
#into a directory.
#After all databases are dumped then it will dump the global data like users/roles/etc.
#
#Backup times are logged in a log file in the same backup directory.
#
#NOTE and WARNING: Before dumping the databases it will clear ANY files from the backup directory.
#
# --------------------------------------------------------------------------------------------------------
#
#For more info on the switches for pg_dump and pg_dumpall use the --help switch with these 2 commands
#
# ===
# Parameters you can change
#
pg_tools="/Library/PostgreSQL/9.0/bin" # path to the pgsql command-line tools, adjust accordingly
pg_host="-h localhost -p 5432" # leave empty if you want to use the local UNIX socket
# default in this script is "-h localhost" and will use TCP/IP
use_gzip=false # gzip the dump files
#export PGPASSWORD=xxxxxx # only needed if there is no auth. file eg: ~/.pgpass
backup_dir="/Users/admin/Backup-PSQL"
timeinfo1=`date '+%y%m%d'`
logfile="$backup_dir/pg_backup_$timeinfo1.log"
# ===
databases=`$pg_tools/psql $pg_host -U postgres -d postgres -l | sed -n 4,/\eof/p | grep -v rows\) | awk {'print$1'} | grep -v template | grep -v postgres`
# clear backup folder for new data dump
#rm $backup_dir/*
#rm $logfile # only needed when you want the logfile to be in another location other than the backup directory.
touch $logfile
# start logging
timeinfo=`date '+%T %x'`
echo "Starting Backup at $timeinfo" >> $logfile
for db in $databases
do
    if $use_gzip; then
        # plain SQL dump, zipped. Requires a query editor to restore (like psql)
        $pg_tools/pg_dump $db $pg_host -U postgres | gzip > "$backup_dir/$db.sql.gz"
    else
        # plain SQL dump, non-zipped. Requires a query editor to restore (like psql)
        # $pg_tools/pg_dump $db $pg_host -U postgres > "$backup_dir/$db.sql"
        # database dump using the custom format. Requires pg_restore (or PgAdmin's restore function) to restore
        $pg_tools/pg_dump $db $pg_host -U postgres -F c -f "$backup_dir/$db.dump"
    fi
    # do some logging
    timeinfo=`date '+%T %x'`
    echo "Backup complete at $timeinfo for database: $db" >> $logfile
done
#export PGPASSWORD= # only needed if no auth. file!
# get users, roles and other global data
if $use_gzip; then
    # zipped
    $pg_tools/pg_dumpall -g $pg_host -U postgres | gzip > "$backup_dir/pg_glo_data_$timeinfo1.sql.gz"
else
    # not zipped
    $pg_tools/pg_dumpall -g $pg_host -U postgres > "$backup_dir/pg_glo_data_$timeinfo1.sql"
fi
# do some logging
timeinfo=`date '+%T %x'`
echo "Backup complete at $timeinfo for global data" >> $logfile