Tuesday, November 17, 2009

one .profile to rule them all

UPDATED POST/VERSION HERE (11/2012)

================ORIGINAL POST BELOW================

I got sick of having to copy and heavily modify my shell configuration for each environment I found myself in. As a consultant I sometimes have to deal with ridiculous policy-driven constraints on what software I can or cannot use. Sometimes it's simple ignorance (like when the CTO of a company said that I could only script in a "...standard shell like BASH..." and nothing obscure like KSH), but whatever the case, it's a pain when you've grown accustomed to something as personal as your shell settings. Now I have a solution.

The shell is often taken for granted by many a unix user. Some (dare I say most) don't even think about the shell they're using until they try a feature that they've seen somewhere and realize that it doesn't work. Others have an extreme bias toward one family of shells: C-derived shells (Csh, TCsh, Zsh??, etc.) vs. Bourne-like shells (Ksh, BAsh, Zsh??, Ash, DAsh, etc.). Still others want a specific version of a particular shell. Because so much functionality has cross-pollinated across the shell landscape, it can be rather vexing to discover that the one feature you've come to expect from your shell isn't available in the shell you're currently stuck using, or has a different syntax, or is otherwise mysteriously incongruent with your habits.

Well, I can't solve all of your problems with shells. What I can do, however, is make shell initialization easier (for myself and hopefully for you as well) with a new, hefty .profile/.shrc file. Each shell has its own rules about how it uses profile and rc files. In the past there were things that you'd absolutely want to have only in one and not the other. These days things are different, and in many cases it's okay to have the same file meet both needs. That is, unless your shell double-invokes your initialization script(s); then you're just wasting resources.

My solution came about because I wanted uniform functionality across the various platforms I use (Darwin/OS X, FreeBSD, OpenBSD, Solaris, HP-UX, and various flavors of Linux) as well as whatever shells I may be forced to use. I prefer ZSH and/or KSH93, but am often stuck with BASH, DASH, ASH, or PDKSH. Rarely am I lucky enough to have MKSH. So I created the following:
#Title:     N/A
#Author:    G. Clifford Williams
#Purpose:   used as a .shrc or .profile this script initializes interactive
#           shell sessions with the specified configuration determined by
#           Operating System and/or shell implementation. In particular this
#           script is for shells that implement the POSIX standard.

#-----------------------------------------------------------------------------#
#------------------------------------INIT-------------------------------------#
#-----------------------------------------------------------------------------#
    #This section checks to see whether the file has been read already. If so
    #it exits (using the return command) with a nice message. It works by
    #setting a variable named for the pid of the current shell ($$). If the
    #variable is not empty then the file has been processed before.
    #This makes one file suitable for both .profile and .shrc use.
[ "$(eval "echo \${my_${$}}")" = "processed" ] && \
        { echo "already read for my_${$}"; return 1;}

eval "my_${$}=processed"

#-----------------------------------------------------------------------------#
#-------------------------------RUN MODE CHECK--------------------------------#
#-----------------------------------------------------------------------------#
case "$-" in
    #This section checks for the 'interactive mode' flag in the '$-' variable
    #By checking the my_INTERACTIVE variable you can set chunks of code to
    #be conditional on the manner in which the shell was invoked

    *i* )
            my_INTERACTIVE="yes"
            ;;
    * )
            my_INTERACTIVE="no"
            ;;
esac
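As a sketch of how that flag can be used further down in the file (the flag derivation is repeated here so the snippet stands alone, and the alias is purely illustrative):

```shell
# Re-derive the flag the same way the case statement above does,
# then guard interactive-only niceties behind it.
case "$-" in
    *i* ) my_INTERACTIVE="yes" ;;
    *   ) my_INTERACTIVE="no"  ;;
esac

if [ "${my_INTERACTIVE}" = "yes" ]; then
    # Only define conveniences when a human is at the keyboard.
    alias ll='ls -l'
fi
```

When the file is sourced by a non-interactive shell (no `i` in `$-`), the alias block is skipped entirely.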

#-----------------------------------------------------------------------------#
#------------------------------------FUNCTIONS--------------------------------#
#-----------------------------------------------------------------------------#
my_lecho(){
    [ -n "$my_SILENT" ] || echo "$(date +%H:%M:%S)|$*"
}

my_pathadd(){
    my_tmp1=$1
    shift
    case $my_tmp1 in
        LUA_PATH)
            my_OFS=";"
            ;;
        *)
            my_OFS=":"
            ;;
    esac

    for PATH_add in "$@"; do
        if eval "[ -n \"\${$my_tmp1}\" ]" ; then
            if [ -d "${PATH_add}" ] ; then
                eval "$my_tmp1=\"\${$my_tmp1}${my_OFS}${PATH_add}\""
            else
                echo "path ($PATH_add) not valid"
            fi
        else
            eval "$my_tmp1=$PATH_add"
        fi
    done
}
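A quick demonstration of my_pathadd in action (the function is repeated here so the snippet stands alone; DEMO_PATH and the directory names are illustrative):

```shell
# my_pathadd as defined above, copied for a self-contained demo.
my_pathadd(){
    my_tmp1=$1
    shift
    case $my_tmp1 in
        LUA_PATH) my_OFS=";" ;;
        *)        my_OFS=":" ;;
    esac
    for PATH_add in "$@"; do
        if eval "[ -n \"\${$my_tmp1}\" ]" ; then
            if [ -d "${PATH_add}" ] ; then
                eval "$my_tmp1=\"\${$my_tmp1}${my_OFS}${PATH_add}\""
            else
                echo "path ($PATH_add) not valid"
            fi
        else
            eval "$my_tmp1=$PATH_add"
        fi
    done
}

# Hypothetical demo variable; /tmp exists, /no/such/dir does not,
# so only /tmp is appended and the bogus path is reported.
DEMO_PATH=/bin
my_pathadd DEMO_PATH /tmp /no/such/dir
echo "$DEMO_PATH"    # → /bin:/tmp
```

Note that the delimiter is chosen per variable name, which is why LUA_PATH entries (discussed in a later post) get ';' instead of ':'.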

my_cleanpath(){
    #function to set a very basic PATH
    PATH=/bin:/usr/bin:/sbin:/usr/sbin
}

my_getshell(){
    ps -p $$ |\
        awk '/sh/ {
            for (i=1;i<=NF;i++){
                if (match($i,/[[:alnum:]]*sh/)){
                    sub(/\)/,"",$i);
                    print substr($i,RSTART);
                    exit;
                }
            }
        }'
}

#-----------------------------------------------------------------------------#
#-------------------Universal/Generic Settings--------------------------------#
#-----------------------------------------------------------------------------#

#------Get our SHELL-----#
[ -n "$my_SHELL" ] || my_SHELL=$(my_getshell)

#-----Get our OS-----#
my_OS=$(uname)

#----Get our username----#
#my_usrname=$(whoami)
my_USERNAME=$(id -un)

#------Set EDITOR(s)-----#
if { which vim 2> /dev/null  1> /dev/null ;}; then
    EDITOR=vim
else
    EDITOR=vi
fi
FCEDIT=$EDITOR
HISTEDIT=$EDITOR
export EDITOR
export FCEDIT
export HISTEDIT
#------Set PAGER-----#
if { which less 2> /dev/null 1> /dev/null;}; then
    PAGER=less
else
    PAGER=more
fi
export PAGER

my_FULLHOSTNAME=$(hostname)
my_HOST=${my_FULLHOSTNAME%%.*}
my_DOMAIN=${my_FULLHOSTNAME#*.}
#HISTFILE=${HOME}/.sh_history
my_NEWLINE="
"
#SSH AGENT STUFF
#sagent.sh -T && . ~/.sagent_info || { sagent.sh -s && . ~/.sagent_info ; }

case ${my_OS:-unset} in
    Darwin )
        #-------OS X Specifics-------#
        my_pathadd PATH /usr/pkg/bin
        # MacPorts Installer addition on 2009-01-19_at_01:36:04: adding an appropriate PATH variable for use with MacPorts.
        export PATH=/opt/local/bin:/opt/local/sbin:$PATH
        # Finished adapting your PATH environment variable for use with MacPorts.
        # MacPorts Installer addition on 2009-01-19_at_01:36:04: adding an appropriate MANPATH variable for use with MacPorts.
        export MANPATH=/opt/local/share/man:$MANPATH:/usr/pkg/man
        # Finished adapting your MANPATH environment variable for use with MacPorts.
        export LC_CTYPE=en_US.UTF-8
        ;;
    FreeBSD )
        #-------FreeBSD Specifics-------#
        BLOCKSIZE=K
        export BLOCKSIZE
        my_cleanpath
        my_pathadd PATH /usr/local/bin /usr/local/sbin
        ;;
    Linux )
        #-------Linux Specifics-------#
        my_cleanpath
        my_pathadd PATH /usr/local/bin /usr/local/sbin
        ;;
    CYGWIN_NT-5.1 )
        #-------CygWin Specifics-------#
        CYGWIN=tty ; export CYGWIN
        ;;
esac

case ${my_SHELL:-unset} in
    zsh )
        #--------Z SHELL--------#
        my_lecho "initializing ZSH"
        PATH=$PATH:${HOME}/scripts
        # Lines configured by zsh-newuser-install
        HISTFILE=~/.zsh_history
        HISTSIZE=2500
        SAVEHIST=1000000
        setopt appendhistory notify
        bindkey -v
        # End of lines configured by zsh-newuser-install
        # The following lines were added by compinstall
        zstyle :compinstall filename '/cygdrive/c/.zshrc'
        autoload -Uz compinit
        compinit
        # End of lines added by compinstall
        PS1="[%n@%m:%/>${my_NEWLINE}%# "
        ENV=${HOME}/.zshrc
        export ENV
        ;;
    bash )
        #--------Bourne Again SHELL--------#
        my_lecho "initializing BASH"
        set -o vi #vi mode editing
        set -b #immediate background job reporting
        set -B #brace expansion
        BASH_ENV=${HOME}/.bashrc
        export BASH_ENV
        #source the BASH_ENV if it's readable
        [ -r ${BASH_ENV} ] && . ${BASH_ENV}
        HISTFILE=${HOME}/.bash_history
        HISTSIZE=2500
        HISTFILESIZE=100000
        PS1="[\u@\h:\w>\n\$ "
        ;;
    *ksh )
        #--------Korn SHELL--------#
        my_lecho "initializing KSH (or something pretending to be it)"
        set -o vi #vi mode
        set -o bgnice #nice background processes
        set -b #immediate background job reporting
        ENV=${HOME}/.kshrc
        export ENV
        HISTFILE=${HOME}/.ksh_history
        HISTSIZE=2500
        HISTFILESIZE=100000
        PS1='$(whoami)@$(hostname -s):$(pwd)> '
        case $(id -u) in
            0 ) PS1="${PS1}${my_NEWLINE}# ";;
            * ) PS1="${PS1}${my_NEWLINE}$ ";;
        esac
        ;;
    * )
        #--------GENERIC SHELL--------#
        my_lecho "initializing unknown shell"
        set -o vi
        HISTFILE=${HOME}/.sh_history
        HISTSIZE=2500
        HISTFILESIZE=100000
        ENV=${HOME}/.shrc
        export ENV
        #PS1='$(whoami)@$(hostname -s):$(pwd)>'
        PS1="$my_USERNAME@$my_HOST> "
        ;;
esac

#-------After all is said and done-------#
my_pathadd PATH ~/bin ~/scripts ~/.bin

#-------Domain specific RC-------#
[ -r ${HOME}/.shrc_${my_DOMAIN} ]  && .  ${HOME}/.shrc_${my_DOMAIN}
#-------HOST specific RC-------#
[ -r ${HOME}/.shrc_${my_HOST} ]  && .  ${HOME}/.shrc_${my_HOST}

I tried to remove most of the customizations that are specific to what I do and might not be of use to other users.

Some of the cool things it does:

  • You might notice that there is built-in invocation checking to help prevent multiple sourcing of the file. In other words, if your shell automatically reads both .profile and .shrc when it starts, there is a facility in the above to make sure the file is read only once in cases where it's used as both your .shrc and your .profile.

  • There's a handy function to add paths to *PATH variables. It actually checks to see whether the given path(s) exist and will only add valid directories to the path variable given.

  • There's a function to find out what shell you're running ($SHELL won't always give you what you expect).

  • There's an interactivity check to determine whether the shell was invoked interactively or via a script. This is handy for many reasons.

  • There's no non-standard code here; I've used only POSIX facilities. The only non-shell piece of the whole thing is where I call awk to parse the output of ps -p $$ in my_getshell(). The use of AWK there is generic enough that it should work on all platforms.

I'll probably put together a todo list as I think of other features to add. I'm currently using this as my .profile/.bash_profile/.zsh_profile and .shrc/.kshrc/.bashrc/.zshrc on several machines.

UPDATE: I've had the above profile/shrc available online for a while now... you can track my changes to it and other "dot-files" at http://git.secution.com/cgit/dotfiles

Saturday, November 7, 2009

Shell tricks: eval helps make dynamic scripting rock.

If you write shell scripts of any significant size, eval is a very good command to understand (well). It can make some code simpler, render other code obsolete, and make possible yet other code that would not be otherwise.

What is eval?


According to the 2004 IEEE Std. 1003.1:

  • eval constructs commands by concatenating arguments.


The example given on the site is pretty simple but not very revealing.

What can I do with eval?


In practice eval lets you (better) write programs that write programs, or at least parts of programs. One common use of eval is to create variables with names derived from the execution environment at runtime. I won't go into all the reasons this might be beneficial, but some good ones are: tying the process to a user id to prevent file clobbering, process accounting based on parent process id, and avoiding arrays or other complex data structures (good if you want your scripts to be portable and conform to the POSIX standard).

Enough theory! How do I use eval?


Okay here's an example of how to use eval. Sometimes in a script you want to be able to name a variable dynamically. The following function does just that:
tackon(){
#simple function to tack arg2 ($2) on to the end of
#variable X (arg1)
error_msg="usage: tackon "
eval "${1:?${error_msg}}=\"\${$1} ${2:?${error_msg}}\""
}

This function is named tackon because it takes two arguments and appends arg2 to the end of (the value) of arg1. Here's an example of how it works:
gcw@gcwmbp:/Users/gcw>
$ students="Alex Mark Mike"
gcw@gcwmbp:/Users/gcw>
$ tackon students "Al Tim Steve"
gcw@gcwmbp:/Users/gcw>
$ echo $students
Alex Mark Mike Al Tim Steve

To understand what's going on you first have to know that arguments to the function are assigned to two variables named $1 and $2. When tackon() is called it takes whatever string is passed in as $1 and turns it into a variable. The diagnostics in the above version may be somewhat confusing so here's the function again stripped down to the bare essentials:
tackon(){
eval "${1}=\"\${$1} ${2}\""
}

Now we can more easily see how this works. Everywhere we have $1 or ${1}, it is evaluated to the first argument. In our example, where we called tackon with "students" as the first argument, the string becomes "students=\"${students} ${2}\"". The quotes and other special characters may need to be escaped (depending on what behavior you want).
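The same escaping pattern works for reading as well as writing. Here's a hypothetical companion function (not part of the original tackon example) that prints the value of a variable whose name is supplied at runtime:

```shell
# readvar: print the value of the variable whose NAME is passed as $1.
# After escaping, eval sees e.g.: echo "${students}"
readvar(){
    eval "echo \"\${$1}\""
}

students="Alex Mark Mike"
readvar students    # prints: Alex Mark Mike
```

This is exactly the trick the .profile above uses to test its per-PID "already sourced" variable.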

From this point on there's no magic. Eval executes the interpolated command and you benefit from the heavy lifting of a very nifty command. There are many other cool ways to use eval and maybe I'll go over some of them in the future but for now I'll leave you to fiddle around with this.

Saturday, October 24, 2009

why I don't write BASH scripts

I'm constantly hounded by GNU/Linux newbies (GNUbies) about many things related to scripting: why don't I use gawk for my awk scripts; why don't I use '[[' instead of '[' for my condition tests in shell scripts; why don't I use BASH for my shell scripts? Such questions rarely come from seasoned UNIX professionals who have worked on multiple flavors of Unix. Explaining the way things work in the real world requires that the audience have an attention span beyond that of the average I'm_pissed_at_society_and_microsoft_sucks-linux-neophyte.

Clarification:


I'm not picking on Linux here. While I'm a BSD guy (B+++(++++)), I greatly understand and respect the contributions made by the GNU/Linux community, from which we all benefit. There are many religious wars to be fought in computing and I'm not, here, interested in fueling any such conflicts. The bottom line is that due to its popularity, there are plenty of idiots using Linux who think they know everything. A good and smart Linux user is an asset to the unix community as a whole.

The Reasons


It's not installed on my system(s)


While I've been using Linux since ~1994, I don't consider myself a "Linux Guy". It's a matter of preference, not experience. I simply prefer to use other operating systems over most Linux distributions. As a result it's quite frequently the case that BASH is not installed by default. Such is the case for FreeBSD and OpenBSD.

The same could be said of any "enhanced" shell. KSH93 and ZSH are also not installed by default on those systems. All three are available on my Mac, by default.

It didn't do what I wanted it to do


I've been using the korn shell for well over 15 years. When I wanted features beyond the POSIX standard for shell scripting, KSH93 had them. For at least the past 10 years I've been able to do things that are just now available in BASH 4.0. It makes no sense to me to switch to a much less mature code base just to get the same features that are available in the software I've been using for 10+ years.

It still doesn't do what I want it to do


There was much hype surrounding the release of BASH 4.0. Were I a Linux(-only) user, I'd likely have been right there with the crowd, cheering the newest version of the shell that still doesn't do everything I can do with my other shells.

To be fair BASH 4.0 and ZSH do have one feature that KSH93 does not: the ability to generate sequences ({001..199}) with leading zeroes.
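A minimal illustration of that feature (invoked explicitly through bash here, since /bin/sh may not do brace expansion at all):

```shell
# Leading-zero sequence expansion, a BASH 4.0+/ZSH feature.
bash -c 'echo {001..005}'
```

On BASH 4.0 or later this prints `001 002 003 004 005`; BASH 3.x expands the sequence but drops the zero padding.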

Poorformance


Bash just doesn't stack up when it comes to performance. The performance of KSH93 is comparable to that of Perl and Python. The performance of BASH (and many other shells like ZSH, MKSH, OKSH, POSH, ASH etc..) is comparatively poor.

Here is a mandelbrot script that I found online somewhere :
inmandelbrot() {
    let "mag = $1 * $1 + $2 * $2"
    if [ $mag -gt "40000" ] || [ $5 -ge $6 ]; then
        echo $5
    else
        let "r = ($1 * $1)/100 - ($2 * $2)/100 + $3"
        let "i = ($1 * $2)/100 * 2 + $4"
        let "cnt = $5 + 1"
        inmandelbrot r i $3 $4 $cnt $6
    fi
}

for y in {-20..20}; do
    for x in {-20..20}; do
        let "rval = x * 10"
        let "ival = y * 10"
        val=$(inmandelbrot rval ival rval ival 1 10)
        if [ $val -eq 10 ]; then
            echo -n "."
        else
            echo -n $val
        fi
    done
    echo
done

I modified it to use the {x..y} sequence instead of calling seq(1), as seq is not readily available on non-GNU systems and skipping it improves execution time. The results were pretty stark: ksh93 ran the set in under a second while BASH took over seven seconds on my laptop.
[gcw@gcwmbp:~/prog/fractals>
$ time ksh ./mandelbrot.sh > /dev/null

real 0m0.782s
user 0m0.640s
sys 0m0.016s

$ time bash ./mandelbrot.sh > /dev/null

real 0m7.204s
user 0m1.791s
sys 0m2.207s

In (at least) this case KSH93 outperforms BASH by roughly 9 times. In the spirit of full disclosure I should note that MKSH and ZSH don't fare much better than BASH on this type of Mandelbrot set; I got times around 90 to 95 percent of those for BASH.

POSIX -> KSH93 -> ZSH


In most of my scripting I try to stick to the POSIX standard for features and tools. I try to avoid using GNU/Linux(-only) utilities, to make everything as portable as possible. When I do venture beyond the borders of POSIX-Land, I go to KSH93. I've been doing so for so long that it doesn't make sense for me to do anything else, unless... well, you see... there is ZSH. The things you can do with ZSH go pretty far beyond standard "shell scripting". While it doesn't have the performance of KSH93, it does much more than BASH and other "standard" shells and still manages to outperform them (in many cases).

ZSH does have its peculiarities. It's close to the other shells at the basic level and then diverges significantly from there.

In closing


Yes, BASH is a big and bloated piece of useful software; so too are KSH93 and ZSH. When I started scripting in Korn shell it was because that's what was available to me at the time. Fortunately, KSH93 has some big advantages over the competition when it comes to features and performance; ZSH has it mainly in features. Because I use both of these shells there's really no reason for me to bother with BASH. If you're a Linux-only type of person, you may be blissfully unaware that other shells exist and might not care at all that more power is out there just waiting to be wielded by the likes of you. That's fine. I'm not looking to discourage any would-be BASH-hackers. I'm just trying to get the wannabes off my back for not being one myself.

Monday, October 19, 2009

15" MacBook Pro battery is failing after just 9.5 months (damn you Apple)

I'm not an Apple loyalist. I don't think they make either the best or the most innovative products. I do like that they've managed to do certain things that other vendors haven't been apt or adept enough to do. Yes, Apple does incessantly boast about the most minute of accomplishments. This makes it difficult for the layperson to determine which of their marketing fluff points is actually worth all of the hype. Hype is what Apple wants you to be interested in. They don't even hide it (well).

Recently Apple's been hyping their laptops and their great battery life. Well, today, after 9.5 months, the battery in my (new aluminum uni-body) 15" MacBook Pro is crashing. By crashing I mean that it's only lasting for about 1/3 of the advertised 5-6 hours of life on a full charge.

It started last week when I noticed on the train (during my morning commute) that my battery meter was atypically low. When I got to the office I plugged in my laptop and left it charging for about 9 hours straight. At the end of the work day I left, hopped on the train, and noticed that the estimated life on the meter was 1 hour 47 minutes. I decided to deal with it for the rest of the week. On Friday I let the battery drain and did the calibration they recommend on the Apple support site. Saturday (after the 8-9 hour process) I got the same results as before: 1 hour 47 minutes of battery life. So I decided to try it again. No go, still only 1 hour 47 minutes. Now I can either try to get by online 1 hour 47 minutes at a time or fork over $130 for a new battery, hoping that it will live up to the hype.

Sunday, September 13, 2009

Snow Leopard is pretty "cool" but still doesn't have some things I'd like to see

I've been using Snow Leopard for over a week now. I'm quite pleased with the performance boost (it actually makes the first core2 based MacMinis usable) and don't really mind the absence of feature creep/bloat. It looks like this was a targeted effort and Apple seems to have hit the mark quite aptly.

Unfortunately there are some things that still aren't quite right for me with OS X that I'd hoped would have been addressed with the arrival of this frosted feline.

Mail.app:



  • Multiple identities - This is long overdue. Apple's back has welts from how often they pat themselves there for innovation and usability. Somehow they have overlooked the basic fact that many people use email in a basic many-to-one configuration. I have email from over 50 addresses pointing to one email account where everything is sorted and stored. Unfortunately I have no way of creating different sender profiles/identities. NOTE: I'm aware of the ability to use comma-separated values in the email address field for a selectable list of From: addresses, but that doesn't affect the signature, name, or other identity information.

  • Address Affinity - Apple's Mail.app allows you to specify multiple sender addresses by simply populating the Email address field with each address you want to use, separated by commas. Unfortunately, when replying to incoming messages there are no checks against said list of addresses. Thus if your Email address field contains "you@x.dom, you@xy.dom, you@xyz.dom" and you receive a message addressed to you@xyz.dom, hitting reply won't set the appropriate From: address. Instead it will just use the first entry in the list of comma-separated values.


Stacks:



  • Smart Folders - It would be great to be able to create a dock stack from smart folders.

  • Resize - Resizing grid view would be nice (I'm glad we can at least scroll now).

  • Custom Icon - Setting an icon/image for your stacks would be a really nice thing to be able to do.


App Switching:



  • Specific Windows - Using Command + Tab will let you switch from one app to the next. Once you're in the appropriate app you can choose a specific window with Command + backtick (`) or tilde (~). It would be great if we could just switch to a specific window with something like Option + Tab. I know there are add-ons that provide this functionality, but it's long overdue for inclusion as a core feature of the interface.

  • X11 Apps - Aqua treats all X apps as separate windows of the one X11 process. So if you're running xterm and xcalc at the same time Command + (`) will cycle through all windows for both applications indiscriminately. Apple should start treating X11 windows as first class citizens.


Command line:



  • Bash version - It would have been nice to see bash updated to version 4.x. I'm not a bash fan (I prefer ksh93 for scripting and zsh for interactive use because they are cleaner, lighter, faster, and more robust) but it just makes sense to at least have it included as an option. They could have installed it as /bin/bash4.

  • Vim version - still on 7.2.108; it could have been updated.

  • Terminal.app - It has some problems with color overlaying that should be addressed. Light black over black doesn't come out right (at least not on my systems, especially in mutt). It works fine with PuTTY, xterm, rxvt, and iTerm.


Overall I'm pleased with the updates in Snow Leopard. The boost in performance is quite welcome. Hopefully the next update has some more substance to it.

Friday, July 24, 2009

WebOS 1.1.0 released (oooh yay...sort of)

The good folks over at Palm have seen fit to bless the world with a significant update to their Web Operating System. There are plenty of goodies in this update for corporate users and, according to PreCentral.net, plenty of secret (??) goodies for everyone else as well. All of this is good news for the Palm devout. There are, however, a few things not in this much-lauded upgrade that have been overdue since 1.0.2.

  • generic copy and paste -- I should be able to select text from a text message, IM, e-mail message, or web page and paste it into any of the aforementioned without being limited to "editable text fields" of a web form

  • password retention/auto-fill -- I still need to re-enter my password for websites when the device is rebooted. This is exceedingly vexing with good secure passwords that are more than 18 characters long

  • Integration with other information services -- Facebook is nice and it was a smart move on the part of Palm to get that working well. If they were concerned with real synergy they would allow us to integrate information from OpenSocial, Naymz, LinkedIn, MySpace, and others.

  • IMAP 'idle' support for subfolders -- It's great to get IMAP "push" for email in my mail box. Now let me get it on the subfolders that I've marked as favorites. Waiting for the sync of 12 different mailboxen is not fun when you have to go through them one at a time and each receives several tens of messages an hour.

  • Resolution of the iTunes Sync Issue -- The new version of WebOS manages to re-enable iTunes sync compatibility for version 8.2.1. Great news for those who are inept enough to let iTunes manage all of their media exclusively. For the rest of us this is almost a non-issue. I say almost because the real issue is this sophomoric game they are playing with Apple. Okay, so the lead engineers of the first-generation iPhone are now working on the Pre: score 1, Palm. They launched a device that's not blessed by Apple and it syncs with iTunes: score 2, Palm. Apple insinuates that they may consider certain technologies in the Palm Pre to be infringing on their IP rights: score 1, Apple. Apple disables iTunes sync with the Pre: score 2, Apple. Palm re-enables iTunes sync with WebOS 1.1.0. It's starting to feel like a high school "who can be the biggest jack-ass" contest. You can only win by losing. Apple can just play defense and ride this out, while Palm will be left with egg on their face if they don't buck up and figure out a way to cut the crap.

Saturday, July 11, 2009

CAWKLib

CAWK (Cawk's Another Web Kit) is a web framework written in AWK that is soon to be under development. When we started working on it we discovered that we needed a better way to manage file inclusions than the '-f' command line option. We decided to write our own pre-processor and package it with the custom libraries that we were developing.  That project grew into two separate projects. The first being BangCAWK (so named for the 'shebang' line that it replaces), the pre-processor, the second being CAWKLib, the library of functions.

Writing code that others may use is vexing and trying. On the one hand you'd like for others to be able to use your code if it can help them out. On the other hand you've now got to make it better documented, more readable, and more robust so that people who decide to pick it up don't either shoot themselves in the foot or want to shoot you (in the head).

We're trying to balance the development of the features that we need with implementing things in a logical and consistent manner for people who may choose to reuse what we produce. I don't know how well we're doing that, but you can decide for yourself by checking it out @ http://git.secution.com/cgit/CAWKLib

Saturday, June 27, 2009

LUA_PATH on OSX

Earlier today I was playing around with Lua and luarocks on my laptop (which happens to be a MacBook Pro). I thought maybe I'd try re-writing telechat in Lua with the LuaSocket package. After I'd installed Lua 5.1.4 via MacPorts along with luarocks:
$ sudo port -v install lua luarocks

I installed LuaSocket.
$ sudo luarocks install luasocket

I was under the mistaken impression that when I tried to run some example code, the environment would be set up and functioning properly. Unfortunately this was not the case:
$ lua
Lua 5.1.4  Copyright (C) 1994-2008 Lua.org, PUC-Rio

> require "sockets"
/opt/local/share/lua/5.1/luarocks/require.lua:256: module 'sockets' not found:
        no field package.preload['sockets']
        no file './sockets.lua'
        no file '/opt/local/share/lua/5.1/sockets.lua'
        no file '/opt/local/share/lua/5.1/sockets/init.lua'
        no file '/opt/local/lib/lua/5.1/sockets.lua'
        no file '/opt/local/lib/lua/5.1/sockets/init.lua'
        no file './sockets.so'
        no file '/opt/local/lib/lua/5.1/sockets.so'
        no file '/opt/local/lib/lua/5.1/loadall.so'
stack traceback:
        [C]: in function 'plain_require'
        /opt/local/share/lua/5.1/luarocks/require.lua:256: in function 'require'
        stdin:1: in main chunk
        [C]: ?

After some investigation I realized that LUA_PATH wasn't set up by the packages that were installed. Since I'd installed the LuaSocket package as the administrator (with the sudo command), I needed to include the system-wide path to the rocks libraries. I did this for my user only by editing my .kshrc file. If I'd wanted to do the same thing in bash I could have used .bashrc. To do it for every user of the system I could have used /etc/profile and /etc/login. In any case, I put in this line:
export LUA_PATH="/opt/local/share/lua/5.1//?.lua;/opt/local/lib/luarocks/?.lua;;"

After that was included, everything ran fine (don't forget to reload your environment).

Adventures in Lua (I)

After studying and working with Lua for about a year now, I find it to be a very powerful and innovative language. While the syntax is not my favorite, overall it's fun to work with. Recently I started using the standalone lua interpreter for some general purpose scripting which has revealed to me a few key reasons that Lua (the language) or at least lua (the interpreter) won't be readily employed for system scripting on a large scale anytime soon.

  • GetOpts - Doesn't exist, which isn't really a big deal for me... sure, the (Korn) shell has it, but plenty of other scripting languages don't (like AWK).

  • DELIMITERS - in the LUA_PATH environment variable the delimiters are semicolons(;) as opposed to the colons(:) used in most posix apps.

  • RegEx - Lua has neither the POSIX regular expression engine nor GNU's or Perl's. There are some very innovative constructs for complex (and simple) pattern matching, and an equivalent to back references called yank, but it's going to take you some time to get used to if you're accustomed to regular Unix REs.

  • ARGV[] & ARGC - They don't exist... well, not as you might expect them to, anyway. There is an arg (or args) table created with the values passed to the script as command line arguments; however, I was told that args is deprecated or will be shortly.

  • COMMENTS - Okay, this one is really a minor issue. If you're used to SQL comments as opposed to those in the shell, AWK, Perl, bc, etc., then you're in good shape. In Lua, comments begin with a double hyphen or dash (--).
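To make the delimiter point above concrete: habits built around colon-separated variables like PATH don't transfer directly. A small shell illustration (paths borrowed from the LUA_PATH example in the earlier post):

```shell
# LUA_PATH holds ';'-separated search templates, not ':'-separated directories;
# the trailing ';;' tells Lua to append its default path
LUA_PATH='/opt/local/share/lua/5.1/?.lua;/opt/local/lib/luarocks/?.lua;;'
printf '%s\n' "$LUA_PATH" | tr ';' '\n'   # one search template per line
```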


As I said before, I really enjoy working with Lua. The systems engineer/hacker in me wants to find as many ways to utilize it as possible. (Yes, I'm crazy like that.) We'll see if I've reached any new conclusions after I've done more applications programming with Lua.

Saturday, June 20, 2009

WebOS 1.0.3 available for the Pre but no new Apps.

Earlier today (okay, technically it was yesterday) Palm made version 1.0.3 of WebOS available for Pre users. While I'm sure it has many goodies to increase performance or reliability, the one thing it most certainly doesn't have is a way to magically make Palm get off its ass, release their damn SDK, and get more apps in the App Catalog.

So far I've been relatively pleased with the Pre. There are some sticking points, which I presume will be addressed in fairly short order. The thing that's made me most want to chuck my little "river stone" into the river is the fact that there are only (as of this writing) 30 apps available for download from the App Store... I mean App Catalogue, err, uhh, Catalog.

What are they Thinking?


SmartPhones and mobile internet devices live and die by the applications that run on them. No one should understand this better than the people at Palm. They basically created the SmartPhone with the Handspring VisorPhone in 2000, and before that they helped create the mobile applications market with PalmOS running on the original Palm Pilot and beyond.

What made the Palm Pilot so popular wasn't the ease with which one could pick up Graffiti or the number of stylus taps it took to get to a contact or create an appointment. It was the ability to download new applications and write some of your own.

Seriously release it already


There will likely be a swell of development for the platform once the tools are made available. That is if they don't kill all of their momentum by making us wait too long. They've built a fairly nifty device with a great interface. It's something you want to keep in your hands and play with. The only problem is that there's nothing to keep you occupied while you're holding it.

Friday, June 19, 2009

An AWKing we will go

I've been meaning to set up awkscripting.com for a while. Well, it's finally up. There's not much there yet, but as I collect resources on AWK and continue to write scripts for which most people will call me crazy, it should hopefully grow.

Wednesday, June 17, 2009

Why I'm not doing Plone anymore (for now)...I think

I've been a fan of Plone for 7+ years. I've developed apps based on it and done much consulting related to it. I've contributed monetarily to the project and to various members of its community, and as of right now I have no desire to work with it.

While that could change, it's not likely to happen in the next 2 years or so.

The Good


Plone was/is(?) very innovative in the area of Content Management and Web Applications development. It could possibly be the most advanced Content Management System on the market today. The technologies made accessible through/with Plone are amazing and it helps you build powerful multifaceted sites and applications with relative ease.

The Bad


The stack is probably the biggest stumbling block for building applications in Plone. When I decided to build a site with Plone (back in 2002), I thought it would be pretty straightforward to get up and running. It was. I thought it would be relatively easy to maintain. It was. I thought upgrading would be a little tricky but easily palatable. It wasn't. Somewhere around Plone 2.1 they switched from using "Zope Native" content types to something called "Archetypes", which was supposed to be more flexible and make it easier to build your own custom content types, and therefore your own applications. I should have seen the writing on the wall at this point, but I was blinded by enthusiasm and ambition.

Plone is an application built on Zope CMF. Zope CMF is a Content Management Framework that runs on the Zope Application Server. The Zope Application Server is written in a combination of Python (for flexibility and rapid development) and C (for optimization and speed).

One of the first problems I'd come across was migrating all of my content from Zope "native" Content Types to Archetypes based Content Types. Then updating/porting my programs to do the same. This was an annoyance at the time but I figured it was well worth the headache as it was making me more familiar with the underlying technologies. My Python hacking was getting better, I'd picked up a little JavaScript, and I could sort of understand the transactional database used to power the applications.

As time went on I got more and more confident in my Zope/Plone hacking abilities because I'd been working on my sites and applications. When I say "working on", I really mean "working on". Methods and functions were constantly being deprecated, if not by the Plone Core Dev. Team then by CMF or by Zope or possibly rearranged by the new version of Python. Between the 2.1 and 2.5 versions of Plone, I'd done no fewer than 12 updates to one application just to address deprecations. Then there's the fact that over the course of 4 years we'd gone from Zope Packages to Zope Applications, to Plone Applications, to Python Eggs as mechanisms for getting 3rd party software installed on a Plone based site.

The Ugly


Many things are changing which I believe could be a great thing in the long run. In the meantime it requires you to be diligent on a number of fronts. Zope Corp. has released Zope3 which has technologies that are being back-ported to Zope 2 (the version that Plone is built on). Those technologies are making their way into Plone and eventually Plone will/should/may run on Zope3. Python 3 is out. Zope may or may not support this new version of Python, which is not backward compatible with previous versions. All of this spells a bunch of heartache in trying to maintain applications written for Plone.

Hope


The community has been great. The list of people to whom I owe a great debt of gratitude for help on things from the minute to the momentous includes:

  • Andy McKay

  • Mikko Ohtamaa

  • Alex Clark

  • Ross Patterson

  • Alexander Limi

  • Fabrizio Reale

  • Steve & Donna (at csquaredtech.com)

  • Lorenzo Musizza


That's just to name a few. There are really smart people with great ideas working on Plone. Development is moving at a very quick pace, and I think it will get to a point where the product is once again the right solution for many problems. I eagerly await that day. Until the underlying stack is more stable and there are fewer new things that break older ones... I'll be steering clear.

Monday, June 15, 2009

Bah hum bug!!!!

I've tried many other engines (SimplBlog, BBlog, S9y, Quills, Plone) and now here I am running on WordPress. I'm not a fan of WordPress, but it does have the advantage of a very active community and many plugins that provide functionality I would otherwise have to build myself. Hopefully, having gotten this site/system set up, I can focus on documentation and development as opposed to tweaking and break/fix issues... we'll see.

Tuesday, June 9, 2009

Pre mature infatuation


Okay, so I got my Pre and everyone has been asking me what I think about it. The short answer is "I like it." Those who know me would suspect that there's much more to it than just that. So here is the skinny...

Continue reading "Pre mature infatuation"

Thursday, May 14, 2009

FreeBSD + Apache + SSL + IfDefine SSL


On FreeBSD systems when you install Apache 2.x and the ssl module you'll see an entry in the httpd.conf that says:

LoadModule ssl_module libexec/apache2/mod_ssl.so

There are other lines as well, and you may be wondering, "How the hell do I define SSL?"
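The short version, as a hedged sketch (the continuation has the details): an IfDefine block only applies when httpd is started with the matching parameter, so the SSL bits are typically wrapped like this:

```
<IfDefine SSL>
LoadModule ssl_module libexec/apache2/mod_ssl.so
</IfDefine>
```

Starting Apache with `-DSSL` (for example via the flags the port's rc script passes to httpd) satisfies the test; start it without the define and the module is simply never loaded.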



Continue reading "FreeBSD + Apache + SSL + IfDefine SSL"

Saturday, May 2, 2009

Python, AWK, Lua Fractal...


I was hanging out in #lua on freenode and someone mentioned a performance comparison of several languages on some site using a fractal. Lua's numbers looked really good. Yes, Python kicks the crap out of Perl and Ruby. This sort of thing makes me evaluate the practicality of acquiring skills in a new language relative to those that I already use. Benchmarks like this don't really represent "real world" general use, but they are not entirely ineffectual when it comes to looking at the merits of various languages and implementations.




Java has more coding overhead than languages like Python, Perl, and Ruby, but it has performance characteristics that make it attractive, especially for long-running processes. Today there's a strong trend toward developing in languages that are dynamic and perform poorly (relatively speaking), with the expectation that hardware can always be added to solve problems in the future. In general I reject this approach to application development. I enjoy coding in Python but also like getting relatively good performance. For this reason Lua has caught my fancy. The coding overhead is relatively low and the performance is relatively high; it's got one of the best balances of the two for my money. AWK is nice too.



Continue reading "Python, AWK, Lua Fractal..."

Friday, May 1, 2009

TinyDNS (DJBDNS) + ssh for replication


This is a quick script I wrote to manage the replication of my NS records from the primary to the secondary server(s). You'll first need to set up the keys for it to work, and it runs as a cron task.
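The script itself is in the continuation, but the core of the idea is just a secure copy plus a sanity check. The sketch below simulates it locally (two temp directories stand in for /service/tinydns/root on each host; in real use the cp would be an scp over the ssh keys mentioned above, fired from cron):

```shell
# local stand-in for pushing a tinydns data file from primary to secondary
primary=$(mktemp -d); secondary=$(mktemp -d)
printf '%s\n' '+ns1.example.com:192.0.2.53' > "$primary/data"
cp "$primary/data" "$secondary/data"   # really: scp data dns2:/service/tinydns/root/
# verify the copy landed intact before (in real life) rebuilding data.cdb
cmp -s "$primary/data" "$secondary/data" && echo 'in sync'
```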



Continue reading "TinyDNS (DJBDNS) + ssh for replication"

Thursday, April 30, 2009

etc on git


Well, I'm finally getting around to putting my 30 or so servers on version control. It's a slight pain in the ass but well worth it. In the past I'd made use of rsnapshot, which I generally liked. This go-round, however, I'm using git.

Reasons:

  1. setup is easier



  2. repositories are more compact



  3. trivial restoration



  4. built in diff/stat tools



  5. ease of maintaining local and remote repositories 
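For anyone wanting to try the same thing, point 1 really is this small (a temp directory stands in for /etc here, and the inline identity flags just keep the commit from failing on a fresh box):

```shell
# initialize a repository in a config directory and snapshot its current state
d=$(mktemp -d)   # stand-in for /etc
cd "$d"
git init -q
echo 'hostname="example"' > rc.conf
git add -A
git -c user.name=root -c user.email=root@localhost commit -q -m 'initial config snapshot'
git log --oneline
```

From there `git diff` and `git status` cover point 4 for free, and adding a remote covers point 5.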

Sunday, April 19, 2009

Sputnik + Git


Sputnik is a CMS that behaves as a wiki (out of the box) and has a pretty cool back-end storage mechanism that supports databases, flat files, and git repositories. By using git I can work "offline" on a copy of the sites powered by Sputnik and just sync the changes back when I'm ready. Since I don't allow ssh access to the production servers running Sputnik, I've set up a scheme whereby I can sync changes from my laptop up to the VC server. The Sputnik server will pull any changes down automatically and push any changes up as well.




Here's the script that does it. 

I wrote this in a quick and dirty fashion and should really clean it up before letting it loose. Oh well. 
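Until the cleaned-up version is posted, the cycle it implements can be sketched like this (a local bare repository stands in for the VC server, and all of the names here are assumptions, not my actual setup):

```shell
# 'hub' plays the VC server; 'site' is the offline working copy on the laptop
hub=$(mktemp -d); work=$(mktemp -d)
git init -q --bare "$hub"
git clone -q "$hub" "$work/site" 2>/dev/null   # cloning an empty repo warns; harmless
cd "$work/site"
echo '= My Wiki Page' > index.md
git add -A
git -c user.name=me -c user.email=me@laptop commit -q -m 'offline edit'
git push -q origin HEAD   # sync back up; the sputnik server then pulls from the hub
```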

Continue reading "Sputnik + Git"

Saturday, April 18, 2009

sputnik + xavante startup with runit


This is the script I use to manage my instances of xavante, which power my sputnik instances. I should really clean it up, but I'm frying bigger fishermen at the moment.
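For context, a runit service boils down to a tiny executable `run` file that stays in the foreground. A hedged sketch of the shape (the paths and launcher command here are assumptions, not my actual script):

```
#!/bin/sh
# /var/service/sputnik/run : runit's runsv supervises whatever this execs
exec 2>&1
cd /srv/sputnik
exec ./start-xavante
```

With that in place, `sv up sputnik` and `sv down sputnik` start and stop the instance.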






Continue reading "sputnik + xavante startup with runit"

Thursday, March 12, 2009

Ohhhh THAT's why I'm fat....

Interesting site about ridiculous eating options. I'm intrigued by the Pizza Cone. Some things are just scary. Just a few:

  • Meat Cake

  • Deep-fried Grilled Cheese

  • 9-decker Filet-O-Fish

  • Blueberry Waffle Breakfast Sandwich

  • Twinkie Wiener Sandwich




Wow.. that's about all I can say...

Find your favorites here.

Sunday, March 8, 2009

BlogWare


It's taken me a while to settle on what software I should use to power this site. I'm more of a Form and Function type of guy. I like to have control over the way things are stored, and more options for content than "new post". It's taken me a while to get over the trade-off of functionality vs. complexity. I've settled on a solution that is very simple and elegant, at the cost of the functionality that I've grown accustomed to. In the end this means I'll spend less time maintaining the site and more time using it and getting other things done, hopefully.



Continue reading "BlogWare"

Saturday, March 7, 2009

Hello World


Okay, new site/blog. I'm really not into "blogging", but we'll give this a go.