Monday, November 14, 2011

Start and access an Amazon EC2 micro Ubuntu instance.

Log on / sign in at: http://aws.amazon.com/ec2/

Start an Ubuntu micro image, e.g.:
ebs/ubuntu-images-milestone/ubuntu-oneiric-alpha2-amd64-server-2011*** (ami-00b14b69)

Create a keypair: something.pem

Change the permissions of something.pem to 400:
chmod 400 something.pem

Create a security group:
- open inbound SSH (TCP port 22) in the security group, by editing/adding rules.

Access the virtual machine over SSH, using the public DNS of the instance.
For Ubuntu machines, the default user is ubuntu.

So, access it like:
ssh -i something.pem ubuntu@ec2-67-202-**-***.compute-1.amazonaws.com

To install gcc/Open MPI:

(Optional) apt-cache search can confirm a package exists; it ANDs multiple search terms, so search one at a time (no sudo needed), e.g.:
apt-cache search openmpi

sudo apt-get -y update
sudo apt-get -y install make libopenmpi-dev openmpi-bin openmpi-doc build-essential gcc-multilib libstdc++6
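
To check that the toolchain installed cleanly, a quick sanity check (the exact version strings will vary):

mpicc --version
mpirun --version

mpicc is the compiler wrapper Open MPI provides around gcc; both commands should print version information.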

Sunday, March 13, 2011

iWeb parsing error - fix

If you get a "Parse error: syntax error, unexpected T_STRING" (or similar) when publishing a website developed using iWeb to a web hosting server, try the following. It fixed the problem for me.

  • Create a file named ".htaccess" in the directory where you put "index.html" (or in the folder called "public", some say - I haven't tried this).
  • Add either of the following lines to that file:
    • php_flag short_open_tag off
      -- OR -- 
    • php_value short_open_tag 0
That should fix the problem. (The error happens because iWeb pages begin with an "<?xml ... ?>" declaration, which PHP misreads as PHP code when short_open_tag is enabled; the lines above turn that option off.) I got the tip from the following link, and it worked for me.

http://answers.yahoo.com/question/index?qid=20081022040450AAY1T8u
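
For reference, a minimal sketch of the resulting .htaccess (only one of the two directives is needed; whether php_flag or php_value works depends on how your host runs PHP):

php_flag short_open_tag off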

Thursday, February 10, 2011

BACKUP USING RSYNC & CRON

I wanted to backup my home directory using rsync to a separate drive, and to make it happen automatically.
I spent ages doing research into the various commands, and found everything I needed to know, but not all in the same place. While it was fun for me, others may just want to know how to do it immediately.  So here goes!
The rsync command
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.gvfs" /home /media/HomeBackup
The -av bit: 'a' means archive, or copy everything recursively, preserving things like permissions, ownership and time stamps. The 'v' is verbose, so it tells you what it's doing, either in the terminal or, in this case, in the log file. --progress gives you more specific info about progress.
--delete checks for changes between source and destination, and deletes any files at the destination that you've deleted at the source. --log-file saves a copy of the rsync result to a date-stamped file on my desktop.
--exclude leaves out any files or directories you don't want copied. In my case, the .gvfs directory in Hardy Heron was a pain: even with sudo it errored and wouldn't copy properly, so I excluded it (it's not necessary to copy it anyway). If you don't use Hardy yet, or any distro using the latest Gnome, skip this option, or upgrade!
/home is the directory I want copied. /home copies the directory and its contents; /home/ would just copy the contents.
/media/HomeBackup is the separate drive.  Change this to whatever your backup location is. You can actually have this drive off-site and use ssh, but that will be a tutorial for another day!
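As a teaser, the off-site version looks roughly like this (the hostname and remote path are hypothetical, and it assumes you have SSH access to the remote machine):

rsync -av --delete -e ssh /home your-username@backup-host:/backups/HomeBackup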
The bash script
I was just pasting this command into Terminal each day, but wanted something automatic, so step one was a bash script.
Very easy: just open a new document in your favourite text editor, and type #!/bin/bash followed by the command itself on a new line. So:
#!/bin/bash
sudo rsync -av --progress --delete --log-file=/home/your-username/Desktop/$(date +%Y%m%d)_rsync.log --exclude "/home/your-username/.gvfs" /home /media/HomeBackup
Save that as rsync-shell.sh on your Desktop and make it executable by typing
sudo chmod +x /home/your-username/Desktop/rsync-shell.sh   
or by right-clicking the file, select Properties, Permissions and then checking the Execute box
You can now double-click that .sh file, choose 'Run in Terminal', and it will ask you for your password, run, then leave a log file on your desktop.
Or, you can make a cron job to do it for you!
The cron job
My biggest obstacle with this was the sudo bit. rsync won't be able to backup all files, or delete any, without root privileges. I didn't want to have to be there when it runs to type in my password, but after a bit of searching I found out how to make a root cron job.
Copy your .sh file to /root by typing
sudo cp /home/your-username/Desktop/rsync-shell.sh /root
Then type
sudo crontab -e
You'll see a line which reads:   # m h  dom mon dow   command
Under that type
0 22 * * * /root/rsync-shell.sh
What this all means is:
1. The number of minutes after the hour (0 to 59)
2. The hour in military time (24 hour) format (0 to 23)
3. The day of the month (1 to 31)
4. The month (1 to 12)
5. The day of the week (0 or 7 is Sun, or use a name)
6. The command to run
So at 22:00 (10pm) every day, root will run the shell script without prompting you for a sudo password (because it's running as root already).
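(As another illustration, 30 2 * * 0 /root/rsync-shell.sh would run the script at 2:30am every Sunday.)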
Now press Control-X, then type Y, then press enter.
You'll see   crontab: installing new crontab
And you're done!
 
"I hope this helps!" <--- I borrowed this line from Verbal, but I am sure he's made it Creative Commons!


The original quoted link:
http://www.linuxbasement.com/content/backups-using-rsync-bash-cron

Follow this link to set up password-less SSH login:
http://blogs.sun.com/jkini/entry/how_to_scp_scp_and
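
The short version, roughly (remote-host is a hypothetical name):

ssh-keygen -t rsa
ssh-copy-id your-username@remote-host

The first command generates a key pair (accept the defaults and leave the passphrase empty for login-less use); the second appends your public key to ~/.ssh/authorized_keys on the remote machine.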

Friday, January 28, 2011

apt-get - how to update, upgrade

apt-get will not always fetch the latest versions you need. You have to update the list of places from which new versions can be fetched.

It works like this:

apt-get looks at:

/etc/apt/sources.list

You have to update that file with the proper repository links:

This post explains it well: https://help.ubuntu.com/community/Repositories/CommandLine
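
For illustration, a typical sources.list entry looks like this (the "hardy" release name is just an example; use your own release, per the note below):

deb http://archive.ubuntu.com/ubuntu hardy main restricted universe multiverse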

Remember, Ubuntu 8.04 is called Hardy; other versions have different code names. Find out yours and use it accordingly.
Use "cat /etc/issue" or "lsb_release -a" to find out the version of Linux you are running.

Once the list is updated, run "sudo apt-get update" to refresh the package index (and "sudo apt-get upgrade" to upgrade everything); see man apt-get for the details.

Then use apt-get install .... to update your libraries to the new versions.



Installing different versions of Open MPI - NFS cluster

It can be a bit tricky.

Basically, we can have different versions of Open MPI running on the same machine, because they can work independently if we install them in different folders.

Choose an NFS folder which is common to all machines, e.g.: /opt
Create a folder called /opt/mpi.
Create a folder for the version you are planning to install, e.g.: /opt/mpi/ompi-1.4.3
(you need root privileges)
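
For example (assuming /opt is the shared NFS folder):

sudo mkdir -p /opt/mpi/ompi-1.4.3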

Now download the tar (.gz) file of the version you are planning to install from the Open MPI site.

http://www.open-mpi.org/software

(Better not to build from the SVN checkout version; get the released one.)

Now make sure that your system has got the correct versions of autoconf, automake, libtool, gcc, gfortran, g++, etc.
(this link will help you to know the requirements: http://www.open-mpi.org/svn/building.php)

If apt-get does not get you the latest or the right version, install them manually,
e.g. from the GNU site: ftp://ftp.gnu.org/gnu/

(Manual installation tips:
./configure
make all install

You may have to set the correct symbolic links using the "ln -s" command in /usr/bin/ as well.

Check versions using the --version flag, e.g.: gcc --version
)
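
For example, a sketch of manually installing autoconf from the GNU site (the version number is just an illustration, and the install step may need sudo):

wget ftp://ftp.gnu.org/gnu/autoconf/autoconf-2.68.tar.gz
tar -xzf autoconf-2.68.tar.gz
cd autoconf-2.68
./configure
make all install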

Once your environment is ready, start installing Open MPI.

Unzip the tar file
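
For example (assuming the 1.4.3 tarball):

tar -xzf openmpi-1.4.3.tar.gz
cd openmpi-1.4.3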

(In the step below, adding the prefix is important; otherwise the installation will conflict with previous or future version installations.)

./configure --prefix=/opt/mpi/ompi-1.4.3

(a lot of output)

make all install

(a lot of output)

If you are lucky, your new Open MPI will be installed in the /opt/mpi/ompi-xxx folder without any fuss. Otherwise, address the build errors :( and install again.

Now, set the PATH and LD_LIBRARY_PATH in your .bashrc file to /opt/mpi/ompi-xxx/bin and /opt/mpi/ompi-xxx/lib as follows:

PATH=/opt/mpi/ompi-1.4.3/bin:$PATH
export PATH

LD_LIBRARY_PATH=/opt/mpi/ompi-1.4.3/lib:$LD_LIBRARY_PATH
export LD_LIBRARY_PATH

Log out and log in again....
Now this Open MPI version should be accessible to all nodes in the NFS cluster. Happy Open MPI-ing..
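
A quick way to verify the setup (the node names are hypothetical; all nodes must mount the same /opt over NFS):

which mpirun
mpirun --version
mpirun -np 2 -host node1,node2 hostname

The first command should point into /opt/mpi/ompi-1.4.3/bin; the last runs the hostname command on the listed nodes.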

About Me

Jaison Paul Mulerikkal is the Vice Principal of Rajagiri School of Engineering and Technology (RSET), Kochi. He was the Principal of Jyothi Engineering College (JEC), Cheruthuruthy, Trissur, India. He is a member of CMI Sacred Heart Province, Kochi. He is a civil engineer by profession, but did his Master's in Information Systems at RMIT University, Melbourne, Australia and subsequently received his PhD in High Performance Scientific Computing from the Australian National University. He worked as a computational scientist at the University of Auckland, New Zealand.