Cracking the Juniper Network Connect problem on Linux 64 bit

One of the most frustrating problems I have ever worked on was getting this darn VPN client to work. I used 32-bit Ubuntu for over a year and a half on a 64-bit machine only because I could not get the VPN connection working.

Finally, with _due_ help from a colleague, Ebben, I have a way to get things going on 64-bit Mint 12 (I moved from Ubuntu recently as I hate Unity…).

The most useful resources in this regard are the Mad Scientist page and the Ubuntu forum thread discussing this issue. Please refer to those pages for the 32-bit setup. This post is devoted only to 64-bit, and I assume most things are in place or that you know how to set them up on a 32-bit Linux.

Hopefully the following steps are more than sufficient to get things going (at least on Mint, which comes with Sun Java in place, but I have put the steps down nevertheless) –

1. Install Opera – I chose Opera as it works well for me and I could not get the 32-bit plugin to work with the other 64-bit browsers. If it works for you, kindly let me know.

You can install the package using the command below –

$ sudo apt-get install opera      # assuming the repository is in place if not refer this.

2. Install Sun Java – Excerpt from the Mad Scientist page

On 32-bit Ubuntu I use:

sudo aptitude install sun-java6-plugin sun-java6-jdk sun-java6-jre

On 64-bit Ubuntu I use the above plus this:

sudo aptitude install ia32-sun-java6-bin

3. Get the 32-bit plugin in place

Ensure that the 32-bit Java plugin is pointed to from the browser of your choice. In Opera –

Check the page opera:about for the plugins path and

$ cd /usr/lib/opera/plugins
$ sudo ln -s /usr/lib/jvm/ia32-java-6-sun-

You can test the plugin using the URL – about:plugins

For opera it translated to opera:plugins

On refreshing the plugins I saw the Java-related plugins in place –

Java(TM) Plug-in 1.6.0_26

Java(TM) Plug-in 1.6.0_26

Java(TM) Plug-in 1.6.0_26

NOTE: A restart of the browser might be necessary if you already had it open – though it worked for me without a restart!🙂

4. You are good to go

Just key in the URL that connects you, and this works like a charm, as it would on a 32-bit Linux.

Hope this post helps you save _some_ time.

NOTE: I see that on 12.04 Sun Java can’t be installed from any repository due to broken dependencies. I tried installing Oracle Java 7 as suggested here –

I followed these steps –

$ wget 
$ chmod +x

$ sudo ./ -7
$ sudo apt-get install oracle-java7-jre oracle-java7-jdk oracle-java7-plugin oracle-java7-fonts

Then I added the 32-bit plugin –

$ sudo apt-get install ia32-oracle-java7-bin
$ cd /usr/lib/opera/plugins
$ sudo ln -s /usr/lib/jvm/ia32-java-7-oracle/jre/lib/i386/

OK – for those for whom none of the above works, there is some relief –

Ebben has also written a wonderful script, jvpn, which works like a charm for me now –

If the browser workaround is failing or you want a better mechanism, you can try Ebben’s script instead. These are the steps to get jvpn working:


1. Install the following –

$ sudo apt-get install ia32-libs gcc-multilib

2. Log into the Juniper Network Connect via browser to install the ~/.juniper_networks folder.

3. Remove the ncui binary –

$ rm .juniper_networks/network_connect/ncui

4. Run the jvpn script –

$ jvpn --site=XXX --curses
ncui binary does not exist, would you like to compile now? (Y/N): Y
>> jvpn    : INFO     ncui binary does not exist, searching for shared object for compilation
>> jvpn    : INFO     ncui compiled successfully
>> jvpn    : INFO     Initiating VPN to:
>> jvpn    : INFO     Authenticated
>> jvpn    : INFO     DSID found - caching to: /home/mynk/.jvpn/cache/
>> jvpn    : INFO     Certificate file not found [/home/mynk/.jvpn/certs/]
>> jvpn    : INFO     Retreiving server certificate from:
>> jvpn    : INFO     All values retreived - Connecting to VPN gateway
>> jvpn    : INFO     ncui started with pid: 8490
>> jvpn    : INFO     Connected to VPN successfully

Hope this works for you!🙂

How to convert a DVD into an AVI on Linux using devede and mencoder?

I don’t know if there is an easier solution (please share if you know one) –

# Recent Edit – Try this tool before proceeding –

To convert the DVD into a compact format you can use devede – # The link shows how to convert AVI to DVD.

To do vice versa – DVD to AVI –

1. Install devede –

$ sudo apt-get install devede         # works on Ubuntu

2. Run devede –

$ devede &

3. Select the DivX / MPEG-4 option

4. Add the DVD videos of interest

Click on Add below

Choose the .VOB files of interest (I think it allows only one at a time)

When done with the addition –

5. Choose the appropriate media size & click on the ‘Adjust disc usage’ button

6. Click Forward and choose the destination folder

7. Quit when done

8. You will get a .avi file for each .VOB file. You can join them into one in the destination folder using –

$ mencoder -oac copy -ovc copy movie_01.avi movie_02.avi movie_03.avi -o WholeMovie.avi


There should be an easier way to do this!

How to resize photographs on Linux – The convert utility?

I was struggling to resize photographs when I found the convert utility –

You can also check –

I got this simple loop in place –

alias cvt='for img in $images ; do convert -sample 25%x25% $img $(basename $img) ; done'

The images variable can be initialized as desired –

images=$(ls *.jpg)  # This will resize the pics in place so you better have a backup

I still need to find a command to convert raw files to JPEG. Later…

Training the Teachers Program

I had this wonderful opportunity to join some FLOSS enthusiasts in their effort to train teachers from various Government schools across Bangalore at the Sarva Shiksha Abhiyaan. The idea was to equip the teachers with knowledge to pass on to the many students they work with. Here is a report I wrote about it, which shall hopefully be used once we have completed the next phase of the training.

“First they ignore you, then they ridicule you, then they fight you, then you win.”
— Mahatma Gandhi

With this message a few geeks, people from academia, students, etc. from various groups are advocating FLOSS in government schools and offices. “Karnataka Government has decided to use free software for 8th 9th and 10th standards as a part of their ICT@Schools project(Information and Communication Technology). As of now It is vendor controlled project (Educom, EVeron, etc are implementing it) under BOOT (Build Operate Own & Transfer) scheme. For making it more accessible to teachers & students(IT enabled learning), we are initiating this training as a pilot project in bangalore. This is organised by DSERT and a Consortium of Free Software Organisations in Bangalore (FSUG Bangalore, Sampada, FSMK, ITfC, DeepRoot Linux & Moving Republic are the organisations associated.)”, explained Mr. Anivar Aravind, from Moving Republic.


Mr. Vikram Vincent from FSMK said, “The plan is to reach out to all schools in Karnatka. The training shall be conducted in 2 phases, 3 days of training each, covering 17 schools. The teachers from these schools as well as the faculty from the vendor shall be trained as a part of this program.”

A training was organized for teachers and faculty from the vendors at the Sarva Shiksha Abhiyan Building, Nrupatunga Road (opposite UVCE College), Bangalore. They were trained in the basic usage of DebianEdu (an educational version of the popular Linux distribution, Debian). The training, spread across three days from 16th July through 18th July and later from 23rd July through 31st July, covered:

1. A session on FLOSS (Free/Libre/Open Source Software) philosophy.
2. Basic usage of the desktop explaining the various hardware components.
3. Use of Open Office.
4. Educational games on the distro.
5. Use of Internet.

The hall was packed, with almost every system occupied by about 25 teachers. The teachers were pretty enthusiastic about the training and sat patiently through the long sessions, absorbing as much as they could. One teacher in particular didn’t mind a delayed lunch as they were in the middle of something.

The first day was mostly basic stuff covering the different components like the mouse (why is it so called?), the CPU, etc. Many teachers were curious about typing and the use of Kannada. Ktouch, gtypist, gcompris, etc. were suggested to them. In the afternoon session Naveen introduced the teachers to the concept of FLOSS. Naveen used the analogy of a bike. When someone buys a bike, (s)he expects to use it the way (s)he wishes. If a neighbor is in need and comes to ask for the bike for some work, (s)he is free to lend it, which may not be the case with a product from Microsoft or Apple or such. If the bike has some problem, one would like to have a look at it oneself first and only then go to a shop. Even then, the person is free to go to a local shop and doesn’t have to go to the same vendor who sold the bike. This ‘freedom’ is not there in closed source software. There was a mention of how Einstein was able to come up with concepts like the theory of relativity because Newton – who said, “If I have seen further it is only by standing on the shoulders of giants.” – didn’t keep his knowledge to himself. The way to go ahead in technology is to share and exchange knowledge. It reminds one of the words of George Bernard Shaw –

“If you have an apple and I have an apple and we exchange these apples then you and I will still each have one apple. But if you have an idea and I have an idea and we exchange these ideas, then each of us will have two ideas.”

The second day was entirely spent on Open Office, covering the equivalents of Microsoft Word, Excel and Powerpoint. People who were familiar with the office suite asked how they could do certain things they did in the MS counterparts. The sessions were exhaustive and covered most of the things the teachers would need. The day ended with a presentation on one of the schools, made by a teacher from that school – it looked pretty well done. The teachers were briefly introduced to the use of Kannada – the idea was to encourage them to use English. Detailed coverage of the same is planned in the subsequent sessions.

The third day was devoted to the Internet. The teachers were also requested to work in groups to infuse in them the idea of sharing and helping each other, which is the basis for most development in FLOSS. The idea behind the sessions was to equip the teachers to help themselves by looking for help on the net. They were introduced to checking email, Google, Wikipedia, YouTube, etc. An email account was created for each teacher, along with a mailing list containing all of them for future queries and discussions. The teachers expressed their heartfelt gratitude.

Mr. Guru Murthy from ITfC (IT for Change) said, “It is important that teachers get hands-on. In Kerala there were 56 computers in one particular school. Not a single school where a teacher said – I don’t know computers. Teachers need to be experts and be able to answer questions. There is a plan to have a projector for each school and use educational software to keep the children engaged.” He asked the teachers, “How many of you have a TV at home?” Almost all the participants raised their hands. He continued, “How many of you have a computer?” Fewer hands went up. “How many have a cable connection at home?” Again most hands went up. “How many of you have an internet connection?” Only a few hands went up this time. He said, “It is a must that all of you invest in computers as it is good for your students and your own children. You should be able to help others out – be a Computer Resource Person. You have a choice in 2 years’ time to be either computer literate or illiterate. This program completely depends on your interest. Also, get an internet connection and please note that there is a discount for Govt employees.”

“God created life, doctors help preserve it, engineers make it comfortable but teachers make it worth living.” These teachers have been given a chance to prove this adage.

What did the teachers have to say about the program?

Mr. P Manjuroopa, from Government Composite High School, Kakolu, said, “When I got the memo of training, I was upset because I am teaching maths to my students in school & I have not completed my portion of june and was busy. But after reaching here & attending frist day training I was totally impressed by this progremme. After attening this training I got the confidence on my self that I too can mail & chat. I loved the concept of free software fundation(FSMK). I felt happy learning GNU/Linux found it really wonderful!”

Ms Tejaswini K, computer faculty at Govt. PU College for Boys, said that she was extremely grateful for the session as she had learnt a good deal and would be able to help the students much better now.

“We have attended computer training held at SSA building for three days from July 16th tthrough 18th. Here some experts are worked as guide. They gave us lot of information about linux and free software. Before that information we didn’t know anything. First I had curiosity about training but last day I’m happy to have attended the training. Vinay, Naveen, Anupama, Raghavendra, Shiv prasad and other experts impressed us well their knowledge and attitude. I promise to use this knowledge effectively in our school” said Mr Ajjaiah G, AM Govt high school, Kakolu.

The second batch was organized much better with the learnings from the first batch. The sessions were more interactive, and the exercises at the end of the sessions kept the teachers engaged. Less theory and more work on the computers made it all the more interesting. Faculty and students from RV College, led by Mr. Reunkaprasad B, took great interest in imparting knowledge of FLOSS.

The follow-up sessions shall be organized next month to review the progress and cover the advanced topics.

If this initiative interests you, please get in touch with –

Naveen: / 998640 3928
Vikram: / 94488 10822

Software _must_ be accessible to people who could use it to bring about a positive change in their lives. Programs such as this are making that a reality by reaching the needier sections.


Making a Makefile

Without a makefile, make can still build using its built-in rules. Say the directory has just foo.c:
$ ls
foo.c

$ make foo
cc foo.c -o foo

Useful Make options: (For details: man make or pinfo make)

-B, --always-make           Unconditionally make all targets.

-C DIRECTORY, --directory=DIRECTORY
                            Change to DIRECTORY before doing anything.

-d                          Print lots of debugging information.

-i, --ignore-errors         Ignore errors from commands.

-j [N], --jobs[=N]          Allow N jobs at once; infinite jobs with no arg.

-k, --keep-going            Keep going when some targets can’t be made.

-n, --just-print, --dry-run, --recon
                            Don’t actually run any commands; just print them.

-s, --silent, --quiet       Don’t echo commands.

A Make rule is composed of:

target: prerequisites
	commands        (each command line must begin with a TAB)

A target is considered “up to date” if it exists and is newer than its prerequisites. A rule tells make two things: when the targets are out of date, and how to update them when necessary.
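The timestamp logic is easy to see with a throwaway example (the /tmp path and file names here are just for illustration; I set GNU make’s .RECIPEPREFIX to ‘>’ so the recipe marker survives copy-paste – normally a recipe line begins with a TAB):

```shell
mkdir -p /tmp/ruledemo
echo hi > /tmp/ruledemo/source.txt
cat > /tmp/ruledemo/Makefile <<'EOF'
.RECIPEPREFIX = >
hello.txt: source.txt
>cp source.txt hello.txt
EOF
# first run: hello.txt is missing, so the recipe runs
make --no-print-directory -C /tmp/ruledemo
# second run: hello.txt is newer than source.txt, so make reports it is up to date
make --no-print-directory -C /tmp/ruledemo
```

Touch source.txt and run make again, and the recipe fires once more.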

Use .DEFAULT_GOAL to override the default goal. MAKECMDGOALS gives the list of targets specified with make (e.g. for make clean, $(MAKECMDGOALS) = clean).

Within a command script (if the line begins with a TAB character) the entire line is passed to the shell, just as with any other line that begins with a TAB. The shell decides how to interpret the text. Prefixing a command with ‘@’ suppresses echoing (printing of the command during make), while prefixing with ‘-’ ignores errors when executing the command.
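A quick sketch of both prefixes (hypothetical /tmp path; .RECIPEPREFIX is used here instead of a literal TAB so the snippet pastes cleanly):

```shell
cat > /tmp/prefix.mk <<'EOF'
.RECIPEPREFIX = >
all:
>@echo this command is not echoed before running
>-exit 1
>@echo still running after the ignored failure
EOF
# The failing 'exit 1' is reported as "(ignored)" on stderr, and make exits 0.
make -f /tmp/prefix.mk
```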

By default, when make looks for the makefile, it tries the following names, in order:
‘GNUmakefile’, ‘makefile’ and ‘Makefile’.

Makefile is the recommended name, as it shows up prominently alongside README and the like.

Makefile Variables

As a project gets larger, more files are usually added. If you repeat a list of files, you can accidentally leave files out of the list. It’s simpler to make use of a variable that expands into the list.

The syntax for declaring and setting a makefile variable is varname = variable contents. To call the variable, use $(varname).

OBJECTS := $(wildcard *.o)               # Also an example of wildcard
OBJECTS := $(patsubst %.c,%.o,$(wildcard *.c))  # using functions

Variable Assignment

Assignment Types:
=  for delayed assignment – recursively expanded variables.
:= assigns immediately – simply expanded variables. Recommended for faster Makefiles.
?= if you want to conditionally assign a variable – assigned only if not defined already.
+= is used for appending.

Variable definitions are parsed as follows:
immediate   = deferred
immediate   ?= deferred
immediate   := immediate
immediate   += deferred or immediate
define immediate                # same as immediate = deferred
For the append operator, ‘+=’, the right-hand side is considered immediate if the variable was previously set as a simple variable (‘:=’), and deferred otherwise.
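A small demo of the difference between the assignment types, assuming GNU make (the file name is arbitrary, and .RECIPEPREFIX stands in for the usual TAB):

```shell
cat > /tmp/assign.mk <<'EOF'
.RECIPEPREFIX = >
X = before
# deferred: A expands $(X) only when A itself is used
A = $(X)
# immediate: B captures the value of X right now
B := $(X)
# conditional: takes effect only because C is not defined yet
C ?= fallback
X = after
all:
>@echo A=$(A) B=$(B) C=$(C)
EOF
make -f /tmp/assign.mk     # prints: A=after B=before C=fallback
```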

If you want to export specific variables to a sub-make, use the export directive, like this:
export variable …
export variable = value
export variable := value
If you want to prevent a variable from being exported, use the unexport directive, like this:
unexport variable …

Some pre-defined variables:

Variable    Description
$@    This will always expand to the current target.
$<    The name of the first prerequisite. This is the first item listed after the colon.
$?    The names of all the prerequisites that are newer than the target.
$^    The names of all the prerequisites, with spaces between them; duplicates removed.
$+    Like $^, but with duplicates. Items are listed in the order they were specified in the rule.
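These can all be seen in one dummy rule (illustrative /tmp paths; the prerequisite list deliberately repeats a name to show $^ versus $+):

```shell
mkdir -p /tmp/autovars
touch /tmp/autovars/one /tmp/autovars/two
cat > /tmp/autovars/Makefile <<'EOF'
.RECIPEPREFIX = >
out: one two one
>@echo target=$@ first=$< all=$^ plus=$+
EOF
make --no-print-directory -C /tmp/autovars
# prints: target=out first=one all=one two plus=one two one
```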

Environment Variables
Variables can also be exported as environment variables that can be checked in the Makefile, e.g. LDFLAGS could be exported as an environment variable with the desired values.

You can also check if an environment variable has been set and initialise it to something if it has not, i.e.

DESTDIR ?= /usr/local

will set DESTDIR to /usr/local if it is not already defined

To append to the environment variables use the += operator:

CFLAGS += -g -Wall
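A tiny sketch of ?= interacting with the environment (hypothetical file and paths):

```shell
cat > /tmp/dest.mk <<'EOF'
.RECIPEPREFIX = >
DESTDIR ?= /usr/local
all:
>@echo DESTDIR=$(DESTDIR)
EOF
make -f /tmp/dest.mk                       # prints DESTDIR=/usr/local
DESTDIR=/opt/stage make -f /tmp/dest.mk    # env wins: prints DESTDIR=/opt/stage
```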


It is possible to call some predefined functions in makefiles. A full list of them can be found in the manual, of course.

Perhaps you want to find all the .c files in directory for later use:

SOURCES := $(wildcard *.c)

Given these, maybe you want to know the names of their corresponding .o files:

OBJS := $(patsubst %.c, %.o, $(SOURCES))

You can do things like adding prefixes and suffixes, which comes in handy quite often. For example, you could have at the top of the makefile a variable where you set the libraries to be included:

LIBS := GL SDL stlport

And then use

$(addprefix -l,$(LIBS))

in a later rule to add a -l prefix for every library mentioned in LIBS above.

Finding files in multiple directories is a good example of the usage of foreach

DIRS := src obj headers
FILES := $(foreach dir, $(DIRS), $(wildcard $(dir)/*))

Substitution References
foo := a.o b.o c.o
bar := $(foo:.o=.c)       # or equivalently
bar := $(foo:%.o=%.c)     # same as $(patsubst %.o,%.c,$(foo))
Gives bar = a.c b.c c.c
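Both forms can be checked with a scratch makefile (again with .RECIPEPREFIX in place of a TAB):

```shell
cat > /tmp/subst.mk <<'EOF'
.RECIPEPREFIX = >
foo := a.o b.o c.o
bar := $(foo:.o=.c)
baz := $(patsubst %.o,%.c,$(foo))
all:
>@echo bar=$(bar)
>@echo baz=$(baz)
EOF
make -f /tmp/subst.mk     # both lines print: a.c b.c c.c
```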

List of Functions:
$(subst from,to,text)            – Replace from with to in text
$(patsubst pattern,replacement,text)    – Replace matched part with replacement in text
$(var:suffix=replacement)        – Replace suffix with replacement in var
$(strip string)             – Remove white spaces
$(findstring find,in)            – Search find in in
$(filter pattern…,text)        – Returns matched patterns in text
$(filter-out pattern…,text)        – Returns all that don’t match
$(sort list)                – Sort the elements in list
$(word n,text)                – Returns nth word of text
$(words text)                – Returns words in text
$(firstword names…)            – Returns first name in the series of names
$(lastword names…)            – Returns the last name in the series of names
$(foreach var,list,text)        – Loop over list assigning var to each name and then expand text
$(call variable,param,param,…)    – call variable with arguments stored in $1, $2, etc.
File related:
$(dir names…)                – Extracts directory part of the path in names
$(notdir names…)            – Extracts filename from the path in names
$(suffix names…)            – Extracts suffixes from names (following ‘.’)
$(basename names…)            – Extracts all but suffix (preceding ‘.’)
$(addsuffix suffix,names…)        – Appends suffix to the names
$(addprefix prefix,names …)        – Prepends prefix to the names
$(join list1,list2 )            – Concatenates each word of list1 with corresponding word of list2
$(wildcard pattern )            – Returns space separated list of matches
$(realpath names …)            – Absolute path resolving symbolic links
$(abspath names …)            – Absolute path
Non reference:
$(value variable)            – The value of variable without expanding it
$(eval text)                – Evaluates text as makefile syntax – creates new constructs
$(origin variable)            – Origin of variable – default, environment, command line, etc
$(flavor variable)            – Type of variable – undefined, simple or recursive.
$(shell command)            – Space separated output of command on shell
$(error text …)            – Generates an error displaying text
$(warning text …)            – Same as error except processing continues
$(info text …)                – Prints text; like warning but without the file/line prefix
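A few of these in action (scratch file, arbitrary names):

```shell
cat > /tmp/funcs.mk <<'EOF'
.RECIPEPREFIX = >
names := src/main.c src/util.c headers/util.h
all:
>@echo dirs=$(sort $(dir $(names)))
>@echo files=$(notdir $(names))
>@echo onlyc=$(filter %.c,$(names))
>@echo count=$(words $(names))
EOF
make -f /tmp/funcs.mk
```

Note that sort also removes duplicates, so the two src/ entries collapse into one.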

Phony targets

A phony target is one that is not really the name of a file. It is just a name for some commands to be executed when you make an explicit request. There are two reasons to use a phony target: to avoid a conflict with a file of the same name, and to improve performance.

You can have “phony” targets — targets which don’t actually create a file, but do something. These are created like normal targets: for instance, to add a “all” target to our makefile we’d add (probably at the top, so it becomes the default target):

all: foo

This rule won’t run if there exists a file called “all” in the directory (if someone was careless enough to create one somehow). So we tell make(1) that this is a phony target that should always be rebuilt, by listing it under the special target .PHONY. So, we can add to our Makefile:

.PHONY: all

To add a clean target is fairly simple too, add:

.PHONY : clean
clean :
-rm edit $(objects)

An install target follows the same pattern, and all the phony targets can be declared together:

# Installing the final product
install:
	cp sample /usr/local
	echo install: make complete

.PHONY: all clean install

Another interesting example for .PHONY is subdirectories:

SUBDIRS = foo bar baz
subdirs:
	for dir in $(SUBDIRS); do \
	  $(MAKE) -C $$dir; \
	done

There are issues with the above method: an error from a make inside the loop is lost; if we add a check for it, the ability to work with -k is lost; and targets can’t be built in parallel, since there is only one rule.

By declaring the subdirectories as phony targets (you must do this, as the subdirectory obviously always exists; otherwise the target won’t be built) you can remove these problems:

SUBDIRS = foo bar baz
.PHONY: subdirs $(SUBDIRS)
subdirs: $(SUBDIRS)
$(SUBDIRS):
	$(MAKE) -C $@
foo: baz

Here we have also declared that the ‘foo’ subdirectory cannot be built until after the ‘baz’ subdirectory is complete; this kind of relationship declaration is particularly important when attempting parallel builds.
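Here is a runnable sketch of that pattern, using a throwaway tree under /tmp (the directory names are just the ones from the example above):

```shell
mkdir -p /tmp/tree/foo /tmp/tree/bar /tmp/tree/baz
# give each subdirectory a trivial Makefile that announces itself
for d in foo bar baz; do
  printf 'all:\n\t@echo built %s\n' "$d" > /tmp/tree/$d/Makefile
done
cat > /tmp/tree/Makefile <<'EOF'
.RECIPEPREFIX = >
SUBDIRS = foo bar baz
.PHONY: subdirs $(SUBDIRS)
subdirs: $(SUBDIRS)
$(SUBDIRS):
>$(MAKE) --no-print-directory -C $@
foo: baz
EOF
# serial build order honours foo's dependency on baz: baz, foo, bar
make -s --no-print-directory -C /tmp/tree subdirs
```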

A phony target should not be a prerequisite of a real target file; if it is, its commands are run every time make goes to update that file. As long as a phony target is never a prerequisite of a real target, the phony target commands will be executed only when the phony target is a specified goal.

Target-specific Variable Values

target … : variable-assignment

The variable-assignment can be any valid form of assignment; recursive (‘=’), static (‘:=’), appending (‘+=’), or conditional (‘?=’). All variables that appear within the variable-assignment are evaluated within the context of the target: thus, any previously-defined target-specific variable values will be in effect.

There is one more special feature of target-specific variables: when you define a target-specific variable that variable value is also in effect for all prerequisites of this target, and
all their prerequisites, etc. (unless those prerequisites override that variable with their own target-specific variable value). So, for example,

Say we want to write a recursive makefile that enters sub-dirs. Instead of duplicating rules as below:

SUBDIRS = src test

SUBDIRS_DEBUG = $(addsuffix .debug, $(SUBDIRS))
SUBDIRS_CLEAN = $(addsuffix .clean, $(SUBDIRS))


all:	$(SUBDIRS)
$(SUBDIRS):
	@echo make in $@…
	$(MAKE) -C $@

debug:	$(SUBDIRS_DEBUG)
$(SUBDIRS_DEBUG):
	@echo make debug in $@…
	$(MAKE) -C $(basename $@) debug

clean:	$(SUBDIRS_CLEAN)
$(SUBDIRS_CLEAN):
	@echo make clean in $@…
	$(MAKE) -C $(basename $@) clean

We could instead use a target-specific variable, $(target), as shown below:

#Entering recursively into subdirectories and executing different make commands without duplication
target :=

subdirs := src test

.PHONY: $(subdirs) all clean

#For all target is NULL
all :    $(subdirs)

clean :    target := clean
clean :    $(subdirs)

debug : target := debug
debug : $(subdirs)

#Recursively run make through the subdirs with the same target
$(subdirs):
	$(MAKE) -C $@ $(target)
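A stripped-down demo of the target-specific variable itself (scratch file; .RECIPEPREFIX in place of a TAB):

```shell
cat > /tmp/tsv.mk <<'EOF'
.RECIPEPREFIX = >
target :=
.PHONY: all clean
all:
>@echo all runs with target=[$(target)]
clean: target := clean
clean:
>@echo clean runs with target=[$(target)]
EOF
make -f /tmp/tsv.mk all clean
# prints:
#   all runs with target=[]
#   clean runs with target=[clean]
```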

Pattern-specific Variable Values

pattern … : variable-assignment

As with target-specific variable values, multiple pattern values create a pattern-specific variable value for each pattern individually. The variable-assignment can be any valid form of assignment. Any command-line variable setting will take precedence, unless override is specified.

For example:
%.o : CFLAGS = -O
will assign CFLAGS the value of ‘-O’ for all targets matching the pattern %.o.
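For instance (scratch makefile; the target never creates a real file, it just prints the value it sees):

```shell
cat > /tmp/psv.mk <<'EOF'
.RECIPEPREFIX = >
%.o : CFLAGS = -O
foo.o:
>@echo compiling $@ with CFLAGS=$(CFLAGS)
EOF
make -f /tmp/psv.mk foo.o    # prints: compiling foo.o with CFLAGS=-O
```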

VPATH: Search Path for All Prerequisites
The value of the make variable VPATH specifies a list of directories that make should search. Most often, the directories are expected to contain prerequisite files that are not in the current directory; however, make uses VPATH as a search list for both prerequisites and targets of rules.

For example,
VPATH = src:../headers
specifies a path containing two directories, ‘src’ and ‘../headers’, which make searches in
that order.

The vpath Directive
Similar to the VPATH variable, but more selective, is the vpath directive (note lower case), which allows you to specify a search path for a particular class of file names: those that match a particular pattern.

For example,
vpath %.h ../headers
vpath %.c foo

So .h files will be searched in ../headers & .c in foo.
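A quick sketch with a throwaway /tmp tree, showing the resolved paths in $^:

```shell
mkdir -p /tmp/vp/src /tmp/vp/headers
echo '/* header */' > /tmp/vp/headers/foo.h
echo '/* source */' > /tmp/vp/src/foo.c
cat > /tmp/vp/Makefile <<'EOF'
.RECIPEPREFIX = >
vpath %.h headers
vpath %.c src
show: foo.c foo.h
>@echo resolved: $^
EOF
make --no-print-directory -C /tmp/vp show
# prints: resolved: src/foo.c headers/foo.h
```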

Automatic dependency calculation

If you are creating a Makefile for C/C++, gcc can calculate dependency information for you. The quickest way to get this going is to add the -MD flag to your CFLAGS. You will then need to know the names of the .d files in your makefile. I do something like this:

DEPS := $(patsubst %.o,%.d,$(OBJS))

Then near the end of the makefile, add an

-include $(DEPS)

It might also help to make a ‘deps’ target:

deps: $(SOURCES)
$(CC) -MD -E $(SOURCES) > /dev/null

‘-E’ tells gcc to stop after preprocessing. When using -E, the preprocessed C file is sent to STDOUT; to avoid the mess on the screen, send it to /dev/null instead. Using this command, all of the *.d files will be made.

The configure script and the Makefile rules for building and installation should not use any utilities directly except these:

cat cmp cp echo egrep expr grep ln mkdir mv pwd rm rmdir sed test touch

Stick to the generally supported options for these programs. For example, don’t use ‘mkdir -p’, convenient as it may be, because most systems don’t support it.

The Makefile rules for building and installation can also use compilers and related programs, but should do so via make variables so that the user can substitute alternatives. Here are some of the programs we mean:

ar bison cc flex install ld lex make makeinfo ranlib texi2dvi yacc


Linux Kernel Makefile:
This is one of the best references, as it contains an example for almost every feature of make!🙂

GNU Make Manual:


Wannabe a kernel hacker???

This post is for an absolute beginner…🙂

The Linux kernel can be very intimidating, and I am still struggling to catch up with the code. I was fascinated by list.h, which gives an implementation of a generic linked list merely using macros!🙂 I was helped through process management by a colleague – a good place to start. I loved the bit on scheduling and the neat tricks used there. A lot of people would advise you to join the lkml mailing list, but it would be good to know something first so that you can understand the exchanges there. kernelnewbies is a better place to be.

I can recommend a lot of stuff but will do it in an order that will help. For those who love crash courses here is a nice one – you can download the PDF and try the module quickly. The 2nd example requires hardware; if you can get that, it would help greatly in understanding.

You should get yourself a copy of Linux Kernel Development by Robert Love. It is a very good place to start; it briefly discusses code snippets to get you going and clears up the concepts very well.

You can start working on Linux Device Drivers (LDD) by Alessandro Rubini (a handy set of chapters in PDF is available online, which helps since you can use it while coding) soon after, or in parallel. The coding examples in this book are a must!!! I cannot recommend this book enough!!!🙂 I got a hard copy despite the free soft copy being available. I lost it and I am getting another copy. Trust me, it is worth it – don’t take a printout – get the book!!!

Recently I tried Understanding the Linux Kernel by Bovet – the latest edition – and it is pretty detailed if you are interested in going deeper.

The most important link I found was – (Best link to get started off! A good site for other stuff as well)

This single presentation has very good pointers and is pretty up to date unlike other links.

A good link to begin from:

Mind the resource section of any IBM link – they are pretty neat!!!🙂

Also recently I bumped into the following link:

This has the boot process discussed very well.

Also, a friend who has written a 32-bit OS had posted the following:

This is dated, but he gives tips on how to get started! I would have advised going through the following link and starting from main.c (it briefly mentions the first few APIs invoked)

but then it is very difficult. As Karthick mentions, it is good to start with isolated bits like IPC and then move to process management, etc.

I have my bookmarks on Linux Kernel Internals available at –

Finally, I would like to share the most interesting link on this subject – not useful technically, but just to know that a doctor turned into a hacker merely by looking at kernel code (he did not know C!!!) and even delivered patches!!!

I have found it hard to remember the changes between 2.4 and 2.6, so I recommend you stick to 2.6 as far as possible. The following link is good on the changes between the two:

I shall try and keep updating this post. If you do have some good pointers as well please share in the comments. Spammers please excuse!🙂

Finally, delving into the sources:

The source is the best place to dig into, and my initial struggle was to get ctags and cscope to work right. With the many architectures supported, it has always been difficult to locate the right tag. A friend suggested

make ctags && make cscope

in the kernel source directory. To build the tags for a specific architecture, you need to specify it. For example,

make ARCH=arm ctags && make ARCH=arm cscope

You may want to add the --extra=+f option against the ctags command in the Makefile. It additionally lets you tag into file names.

With this in place you could use your favorite editor (if not emacs or vim – MUST reconsider! :)).


PS: This blog was started on Blogger – but since I am a _staunch_ supporter of open source, I thought it would be best to move to an open source tool for a blog on technology!🙂 Will migrate the rest of my posts in some time…🙂