JavaScript Compression — Tools and Process

This article reviews the JavaScript compression tools and processes that I have tested and gives the pros and cons of each.

Software                   Uncompressed   Compressed   Savings   Comment
Closure Compiler           39K            16K          59%       with ADVANCED_OPTIMIZATIONS
YUI Compressor             39K            22K          44%
perl-Javascript-Minifier   39K            25K          36%


Since the CPAN libraries JavaScript::Minifier and CSS::Minifier are readily available on Linux, they are a good starting point. JavaScript::Minifier is simple to use. Here is a script that you can try to see how it works:

 use JavaScript::Minifier qw(minify);
 my $iFile = $ARGV[0] or die "usage: $0 file.js\n";
 (my $oFile = $iFile) =~ s/[.]js$/.min.js/;
 open(INFILE, '<', $iFile) or die "can't open $iFile: $!";
 open(OUTFILE, '>', $oFile) or die "can't open $oFile: $!";
 minify(input => *INFILE, outfile => *OUTFILE);
 close(INFILE);
 close(OUTFILE);

In my tests, it didn’t break my code, but it did report errors caused by mistakes in my code. I used the JSLint plugin for Google Chrome to find the errors. JSLint only works on pure JavaScript, but strings are not parsed. Thus you can use PHP to initialize variables by putting the PHP code inside quotes, and still check the file with JSLint.
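
For example, a line like the following passes a lint check because the PHP tag is hidden inside a string literal ($count here is a hypothetical server-side variable; this is a sketch, not code from my project):

```javascript
// To a JavaScript lint tool, the PHP tag is just an ordinary string
// literal, so the file parses cleanly; on the live page, PHP replaces
// the string's contents with the real value before the browser sees it.
var count = parseInt("<?php echo $count; ?>", 10);

// In this un-rendered sketch the string is not numeric, so parseInt
// yields NaN; after PHP substitutes a number, count holds that number.
console.log(Number.isNaN(count)); // true
```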


YUI Compressor
The YUI Compressor is Yahoo’s library, and it compresses more effectively than perl-Javascript-Minifier. Here is an example command for using YUI Compressor:

java -jar yuicompressor-2.4.7.jar --type js -o filename_yui_min.js filename.js

A nice feature of YUI Compressor is that it can read JavaScript from standard input, which makes it simple to script. Its goal is to never break code, and in my tests this held true.


Closure Compiler
The Google Closure Compiler is the most advanced of the tools that I tested. It has a simple mode that doesn’t break code and an ADVANCED_OPTIMIZATIONS option that produces very compressed code. Here is an example command for using the Closure Compiler in simple mode:

java -jar compiler.jar --js filename.js --js_output_file filename_closure_min.js --externs externs.js

And similarly for advanced mode:

java -jar compiler.jar --compilation_level ADVANCED_OPTIMIZATIONS --js filename.js --js_output_file filename_closure_min.js --externs externs.js

Like perl-Javascript-Minifier, the Closure Compiler only works on pure JavaScript files. Because of the aggressiveness of its optimizations, it can break code, so to use it effectively you need to design your JavaScript with minification in mind. Typically you will use your JavaScript as a library (i.e. as handlers for events such as mouse clicks); to do this, you need to add a small amount of code that preserves the names of the functions that external scripts will call. Similarly, if you want to use external libraries in your library, you need to add extern declarations that preserve the external symbols. Fewer modifications are required for simple mode than for advanced mode. I wanted to use advanced mode on a script that contains jQuery calls (including jQuery Mobile), but wasn’t able to find a way to preserve the jQuery and $ symbols. I tried using --externs with the externs file available as an add-on from Google’s svn repository, but this didn’t solve the problem. Therefore I recommend using simple mode for files containing jQuery and advanced mode for files that do not.
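
To preserve a function name under ADVANCED_OPTIMIZATIONS, the usual pattern is to export it through a quoted property, which the compiler never renames. Here is a minimal sketch (handleClick is a hypothetical event handler, not code from this article):

```javascript
// Hypothetical library function that external scripts (for example an
// inline onclick attribute) must be able to call by its original name.
function handleClick(event) {
  return 'clicked';
}

// Quoted property names survive renaming, so this export is kept even
// under ADVANCED_OPTIMIZATIONS. In a browser the root object is window;
// globalThis lets the sketch also run outside a browser.
var exportRoot = (typeof window !== 'undefined') ? window : globalThis;
exportRoot['handleClick'] = handleClick;
```

Externs solve the reverse problem: a file passed with --externs declares third-party symbols that your code calls, telling the compiler not to rename those references.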


In summary, of the tools reviewed, the Google Closure Compiler is the most effective, perl-Javascript-Minifier is the least likely to break code, and YUI Compressor is a compromise between the two. Each of these tools can be run locally on your machine.

Process Automation: Customer Relationship Management Software – CRM

One of the modern tools of business automation is customer relationship management software commonly known as CRM software. CRM software provides process automation and a centralized data storage point for all things pertaining to suspects, prospects, and customers.

Services offered by the CRM help businesses organize, synchronize, and manage business processes with the goal of finding, attracting, and winning new clients, and then nurturing and retaining those relationships by providing a system that facilitates communication between the client and various departments such as sales, marketing, and support.  The software often serves as the digital hub of a company-wide marketing strategy that places a high value on customer relationships and ties financial data to every step involved in acquiring and servicing a customer.

At Catontech we’ve been working with the open-source solution Vtiger CRM, evaluating it for future use and for integration with our web presence packages so that leads generated by online marketing can be stored in the system for future campaigns.  You can view the video below to get an idea of how the system looks and feels.

What we like about Vtiger is that it is open source, runs on Linux, Apache, MySQL, and PHP, can be implemented at a very low cost with a low learning curve, and integrates nicely with WordPress.  Vtiger CRM is relatively easy to learn; you can get up to speed in about half an hour.

We have begun testing Vtiger Customer Relationship Management Software on three different projects, and we will report on the outcome of those ventures in the near future.

– Joel

Protect our Internet Freedom

Those of you who keep up with me know about my book-a-month program that I use to keep pace with business and technology trends.  Last month’s book was The Laws of Disruption: Harnessing the New Forces that Govern Life and Business in the Digital Age by Larry Downes.  In it, Larry discusses how technologies change society and how lawmakers often clash with technology in an effort to appear to be legislating issues identified by elements of society.  Most of the time lawmakers just throw a monkey wrench into markets created by innovation.

The following video portrays the latest government intrusion into a market built on technology that it does not understand.  Please watch this video and take action.  We must stand up and protect our internet freedom or we will lose it.  Increased governmental regulation of the internet could change the internet environment and make web applications as we know them a thing of the past.  I didn’t have any part in creating this video but I agree with it 100%.

PROTECT IP Act Breaks The Internet from Fight for the Future on Vimeo.

Tell Congress not to censor the internet NOW!

PROTECT-IP is a bill that has been introduced in the Senate and the House and is moving quickly through Congress. It gives the government and corporations the ability to censor the net, in the name of protecting “creativity”. The law would let the government or corporations censor entire sites– they just have to convince a judge that the site is “dedicated to copyright infringement.”

The government has already wrongly shut down sites without any recourse for the site owners. Under this bill, sharing a video with anything copyrighted in it, or doing what sites like YouTube and Twitter do, would be considered illegal behavior.

According to the Congressional Budget Office, this bill would cost us $47 million tax dollars a year — that’s for a fix that won’t work, disrupts the internet, stifles innovation, shuts out diverse voices, and censors the internet. This bill is bad for creativity and does not protect your rights.

Protect our internet freedom.  Express your concern here!!!

The basic weekly web awareness task list

Building web awareness of your business takes time and consistent effort.  Here is a list of tasks that I work on weekly to ensure that people can find your business online.  Used consistently, these methods will build awareness and improve your website’s search engine ranking.

  • Write a blog post about a topic relevant to your business.  I recommend using the WordPress blogging platform.
  • Use an RSS reader to read other bloggers’ articles that are relevant to your business and leave comments with a link to your site.  Try to leave a minimum of five comments per week.  I recommend using Google Reader to subscribe to other bloggers’ RSS feeds.
  • Use Twitter to comment about things you are doing that are relevant to your business.  I recommend using TweetDeck and Twaitter to organize and automate some of these tasks.
  • Post updates to Google+ and Facebook
  • Answer questions on Yahoo Answers or Linkedin Answers
Over time, performing the tasks in this basic to-do list will improve your website’s ranking in search engines and drive traffic to your site.  It takes about six months to see a return on this investment of time, but it is well worth the effort.
What do you do to promote web awareness of your business online?
– Joel


How to know when comments should be deleted

Tips for quickly determining if comments should be deleted.

Your blog or online publication is a serious investment and the centerpiece of a good inbound marketing program.  You work hard creating content, and the payoff comes when visitors leave their comments.  However, there is an insidious element at work on the internet.

Spammers are hard at work trying to trick search engines into thinking that their sites are linked from reputable sites such as yours.  You’ll receive bogus comments on your blog from people trying to create links to the sites they represent.  Serious spammers devise scripts that look for blogs to post generic garbage comments on, filled with links to advertising sites, and you can end up getting several of these a day.  This can either give you a false sense of optimism about your readership if you naively allow it to take place, or it can make your job as a content provider tedious as you weed the garbage comments out.

There are several good plugins for handling comments, such as Akismet, but they cost money when used on a business site.  If you’re on a budget and don’t want to waste time dealing with these types of comments, here are several quick indicators you can use to easily determine when comments should be deleted.

When comments should be deleted:

  1. The person leaving the comment didn’t leave their name.  Sometimes you’ll get legitimate commenters who don’t leave their names, but if they don’t, consider it a strike against them.
  2. The uri in their post (their website) links to a specific product page on a commercial site.
  3. The email address that they posted looks phony.  
  4. The comment isn’t coherent.  
  5. Their comment wasn’t about your article. 
  6. They posted multiple links to advertising sites.
Look for combinations of the things we’ve listed above when reading the comments you receive.  A comment with two or more issues such as the ones we’ve discussed here probably isn’t fit to appear on your site.  With a little practice you’ll become good at skimming comments for these issues and with very little time and effort you’ll know when comments should be deleted.
I’m sure as you continue working on your blog or online publication that you’ll notice more patterns that could be added to this list so feel free to leave comments here on the tips you find.
Legitimate comments are welcome here!!
Joel Caton
Meridian, MS

Catontech Web Presence Package

Introducing the Catontech Web Presence Package

Do you need a web presence for your organization?  Would you like to save money by doing it yourself but you don’t feel you have the time to study all the new technology and gain the required technical expertise?  There is a solution that fits your needs:  The Catontech Web Presence Package.  Let us provide you the tools to give your business an online presence in a cost effective way.  Here’s what the Catontech Web Presence Package offers:


  • Your own domain: .com, .net, .us, etc.
  • A cost effective upfront price that will pay for your service for a year!!
  • An easy to use content management system so you can update your site without the hassle of contracting programmers or developers.
  • Guides for do it yourself search engine submission at no cost to you.
  • Coaching on how to promote your site without spending money.
  • Access to free tools you can use to update and maintain your site.
  • Free design of your initial banner and icon.
  • Integration with social media sites such as Twitter and Facebook.
  • Catontech discount club membership.
  • Referral rewards.
– Joel
Meridian, MS

Guidelines for writing great articles

Part of my efforts with the Mississippi Magic Magazine involve setting up guidelines for writing great articles.  If you are wondering why a technology consultant is posting guidelines on writing great articles, let me explain.  We are in the business of creating success for the constituents we partner with through consulting and development.  That being said… Mississippi Magic Magazine provides an outlet of expression for people and businesses in Mississippi, but it does not employ writers to produce magazine content.  Therefore, all the writing in the Mississippi Magic Magazine is done by volunteer writers who have a vested interest in the community.  From business owners to pastors of churches, people active in community leadership provide all of the content found in the magazine.  To make this a profitable venture for each of them, we’ve found it necessary to publish guidelines for writing great articles.


  • People like to do business with those they are familiar with.  By connecting with the community through your articles, you’ll lower the barriers that people raise against traditional broadcast-style advertising.
  • Articles will be broadcast to the RSS subscription readership.
  • Articles will be indexed by search engines, so anyone looking for the information you provide will find it, and you, on the internet.
  • Articles will spread virally through social media sites such as Twitter, Facebook, and LinkedIn.
  • Unlike traditional media, articles written will be available online well into the indefinite future.
  • Articles will link your thoughts, expertise, and methods to your perceived value as a solutions provider and allow people to reach you.
  • Articles always make the front page when they are published.
Good practices:
  • Offering the information needed to make an informed decision.
  • Offering solutions by showing how to solve a problem.
  • Providing lists of methods or resources.
  • Expounding upon a subject.
  • Offering information about events by featuring information, pictures, and video covering people in the community.
  • Praising and pointing out the good about others.
  • Simplifying complex information, instructions, or events.
Practices to avoid:
  • Slandering others.
  • Defaming others.
  • Advertising your products or services.
  • Soliciting business.
Using these methods, community leaders can provide invaluable content that leads people in the community to them as providers of goods and services.  The relationship formed between reader and author through the magazine articles gives these leaders an inroad when the reader later seeks a provider for a need or want the author can fulfill.
The style of marketing defined in these guidelines for writing great articles is known today as inbound marketing.  By using these methods you’ll get found on the internet and connect with your customers.
– Joel

Promoting business with local search

Recently I’ve made a strong push at promoting business with local search.  I’ve found that you don’t need to spend a lot of money to promote a business online with local search, social media, and a blog platform.  Here’s a little story in the making about promoting business with local search.

An old friend started a cleaning business and approached me about how he could promote it.  He had been approached by a representative from an online business about a subscription that would “list his site at the top of search engine results”.  He wanted to know if what he had been told about the service was true.  As we talked, I took out my Android-powered smartphone and opened Google voice search.  I asked him what search terms he thought someone needing his services in the Meridian, MS area would use, and we searched for them.  As I expected, we found a few competitors listed from Google Places, but none with strong organic results for the terms we tried and none with a comprehensive web presence.  I explained to him that the only way anyone can guarantee placement in search engine results is a cost-per-click or cost-per-impression program through the search engine itself, and that organic links are more effective and can be built for free using an inbound marketing method.

After discussing the matter a little more he agreed to try it so we set up his domain, blog platform, and I’ve lined up the following sites for promoting business with local search to help him.  Listing his business on these sites should promote his business in local search in the coming months.

Let’s see what we can do with some elbow grease and an online advertising budget of zero dollars and zero cents.  The business name is Clean Machines, for those of you who are curious.
– Joel

Excel spreadsheets with PHP – a PHPExcel overview

Writing Excel spreadsheets with PHP using PHPExcel:  The pros and cons

PHPExcel is a PHP library for reading and writing Excel spreadsheets.  It is very well documented and has good community support.  When searching the web for an open-source solution for writing spreadsheets in PHP, I reviewed everything from PHPClasses to PEAR before finally coming across this package.  It is the most actively maintained project out there for working with Excel spreadsheets in PHP.  I didn’t want to use a project in the twilight of its development cycle, so I opted to try PHPExcel.  Here’s a brief overview of what I found:

Pros – The good points:

  • Very well documented.
  • Good examples
  • Support forum with active participation.
  • Supports many of the built-in Excel functions.
  • Easy to style
  • Supports modern Excel formats plus CSV, HTML, and PDF
  • Write spreadsheets
  • Read spreadsheets
Cons – what I couldn’t stand:
  • SLOW….  This library takes considerable time to produce a spreadsheet when memory conserving disk caching is used.  Consider running your process in the background.
  • Memory intensive.  PHPExcel offers disk caching to reduce its in-memory size, but it is fundamentally an in-memory spreadsheet and must be loaded completely into memory before being written to file.  This can considerably hinder performance: the spreadsheet and writer objects can take over 25 MB of RAM while the program is running, plus roughly an additional 1 KB of RAM per cell loaded in memory.  For an example of how this can impact your server or hosting service, consider a 50-column, 10,000-row spreadsheet, which would take about 525 MB of RAM to write to file.  While it is possible to set the script’s memory limit high enough to handle this, if several of these scripts try to run at the same time you’ll have a mess that could crash your server.
  • Write-to-file is batch processed at the end of the script and happens all at once.  Regardless of your disk caching settings, the entire spreadsheet is loaded into memory and then written to the file.  This creates a serious bottleneck for large files and will often crash your script.  The only exception is when PHPExcel writes a CSV file, where it allows a more granular approach to writing memory to file.
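
As a quick sanity check of the 525 MB figure above, here is the arithmetic (sketched in JavaScript for brevity; the roughly 1 KB per cell and 25 MB of fixed overhead are the informal figures from my own tests, not documented constants):

```javascript
// Rough PHPExcel memory estimate: a fixed object overhead plus a
// per-cell cost (using 1000 KB = 1 MB to keep the numbers round).
function estimateMemoryMB(rows, cols, kbPerCell, overheadMB) {
  return (rows * cols * kbPerCell) / 1000 + overheadMB;
}

// A 50-column, 10,000-row sheet at 1 KB per cell with 25 MB overhead:
console.log(estimateMemoryMB(10000, 50, 1, 25)); // 525
```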
Workarounds that will reduce memory usage and improve speed.
  • Set a cell’s value explicitly.  Setting the cell’s value and data type explicitly is much faster than having PHPExcel determine the data type itself.
  • Disable formula precalculation by setting it to false.
  • Use disk caching.
  • Don’t try to use it to create reports from large data sets.
In summary:  If you are reading and writing small spreadsheets with fewer than two hundred and fifty thousand fields or cells, this library will be a good fit for your project.  It’s great for summary reporting!!  However, if you’ve got extremely large record sets that you need to send to a spreadsheet, you’ll need a different approach.  I’ll be writing about how we did that in the next blog post.
Stay tuned and may the source be with you!!
– Joel

PHP and curl for remote site data extraction

Sometimes you’ll need to get data from another site on the internet.  In this article we’ll go over a quick and easy how-to for using PHP and cURL for remote site data extraction.

To get things started we’ll need to initialize a curl object and set some parameters so it can act as a web browser and log into the targeted remote site.

        // Initialize cURL
        $Curl_Obj = curl_init(); 

        // Enable Posting.
        curl_setopt($Curl_Obj, CURLOPT_POST, 1);

        // Enable Cookies
        curl_setopt ($Curl_Obj, CURLOPT_COOKIEJAR, 'cookie.txt'); 

        // Set the browser you will emulate
        $userAgent = 'Mozilla/5.0 (X11; Linux i686; rv:2.0.1) Gecko/20100101 Firefox/4.0.1';
        curl_setopt($Curl_Obj, CURLOPT_USERAGENT, $userAgent);

        // Don't include the header in the output.
        curl_setopt ($Curl_Obj, CURLOPT_HEADER, 0);

        // Allow referer field when following Location redirects.
        curl_setopt($Curl_Obj, CURLOPT_AUTOREFERER, TRUE);

        // Follow server redirects.
        curl_setopt($Curl_Obj, CURLOPT_FOLLOWLOCATION, 1);

        // Return output as string.
        curl_setopt ($Curl_Obj, CURLOPT_RETURNTRANSFER, 1);

You can find a complete list of PHP cURL options in the PHP manual’s curl_setopt documentation if you need more details.

Now that we’ve got our cURL object set up, let’s post some login credentials to a login script.  To do this we’ll need to get the input ids from the login form, supply values for them, and set up the post with the cURL object.  Then we’ll post the credentials to the script named in the login form’s action attribute.  Here’s how we do it:

       // Set up post fields from the login form.  The variables hold the
       // form's input ids and their values; remember to urlencode() the values.
       curl_setopt($Curl_Obj, CURLOPT_POSTFIELDS, "$UserFieldId=$UserName&$PasswordFieldId=$Password&$AnotherFieldId=$AnotherValue");

       // Set the url to which the data will be posted (the login form's
       // action url; left blank here as a placeholder).
       curl_setopt ($Curl_Obj, CURLOPT_URL, '');

       // Execute the post and get the output.
       $output = curl_exec ($Curl_Obj);

       // Empty the post fields so you don't re-post on the next request.
       curl_setopt($Curl_Obj, CURLOPT_POSTFIELDS, "");

At this point we’ve obtained the output of the login attempt, and we can parse it with PHP’s string manipulation functions to see whether it was successful.  For the sake of simplicity* I’ll leave that part to you.  Once you’ve confirmed that you’re logged in, you can proceed to navigate to the next URL using the steps below:

       // Set curl object url option.
       curl_setopt ($Curl_Obj, CURLOPT_URL, $Get_This_Url);
       // Execute query and obtain content.
       $output = curl_exec($Curl_Obj);

Finally, you’ve downloaded the content from a page that was not accessible prior to your virtual login.  You can parse it with PHP’s string functions or use XPath to read its content if you’d like.  Don’t forget to close the cURL object before you exit your script!!!

// Close curl object.
curl_close ($Curl_Obj);


Happy coding and please use responsibly.

– Joel


* Here’s a simple string extraction function you can use when using PHP and cURL for remote site data extraction.

function extract_snippet($html, $Start_Term, $End_Term, $PrintOutput = false)
{
        // Locate the start marker; strpos() returns false if it is absent.
        $start = strpos($html, $Start_Term);
        if ($start !== false) {
                $T_Sel = $start + strlen($Start_Term);
                $end = strpos($html, $End_Term, $T_Sel);
                if ($end === false)
                        return false;
                $snippet = substr($html, $T_Sel, $end - $T_Sel);
                if ($PrintOutput)
                        print "\n<br>$snippet</br>";
                return $snippet;
        }
        return false;
}