PHP Error Handling and Debugging – Part 1

This article describes the process of testing PHP code. The tips explained here can help decrease your code/test cycle time.

The first thing that you must know in order to plan your code/test process is the environment in which your code will run.

If you have full control of the system, less configuration is required. In this case you can rely on the default settings and simply need to know where the logs are kept. On a typical LAMP (Apache) system you can find the log files in /var/log/httpd. Check the documentation for the operating system that you use, as some operating systems use a different directory (e.g. one version of Ubuntu uses /var/log/apache2). By default, error messages from PHP will be written to this directory.

If you are developing on a server where you don't have access to the default logs, you can configure where your log messages are sent by putting a php.ini file containing the directive

error_log = path_to_log

in the root of the domain (this works on hosts that read per-directory php.ini files; on other setups the equivalent setting belongs in the server configuration).
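If you can't drop a php.ini file at all, the same setting can usually be changed at runtime from within the script itself. A minimal sketch (the log path here is only an example; pick one your web user can write to):

```php
<?php
// Redirect this script's error log to a writable file of your choosing.
ini_set('error_log', '/tmp/my_debug.log');  // example path, not a recommendation
ini_set('log_errors', '1');                 // make sure logging is enabled
trigger_error('logging test');              // entry should land in the file above
```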

With this information in mind, we can begin to find code errors.

There are two error types to look for:

  1. parse
  2. runtime

A parse error is the first thing to look for when testing new or modified code. This can be something like a missing semicolon or another syntax mistake. If a parse error occurs, it will be sent to PHP's error_log. A simple way to find this kind of error is to load the file directly in a browser (e.g. an AJAX script that would not normally run in a browser can be tested this way for parse errors). With a default PHP installation the parse error will be shown on the screen.

Most errors that are encountered are runtime errors. There are two kinds of runtime errors:

  1. exception
  2. functional

The first kind of runtime error happens when a statement or function call that is syntactically correct encounters an unexpected circumstance, such as an invalid parameter (e.g. fopen('file_that_doesnt_exist', 'r')). This kind of error can only be seen during an actual run of the code with realistic inputs. Opening the file in the browser directly usually will not find it, because the inputs will not be those that would typically be encountered. For example, opening an AJAX script that relies on the $_POST array for its input will typically not exercise many of the branches, because the $_POST variables are missing. To find this kind of error, run the script as it would typically be run and then check the error log.

A functional runtime error is when the code runs, doesn’t generate an error, but doesn’t produce the expected outputs. To find this error use one or more of the following techniques:

  • echo/printf
  • error_log
  • try/catch

The simplest way to find errors is by adding echo statements to the code. This method can be somewhat tedious and slower to use than others, but a few well placed echo statements that use print_r to show the value of key return data structures can sometimes quickly illuminate the source of the malfunctioning code. The problem with this method is that because it outputs directly to stdout (the web browser) it is only available if the script can be run in the web browser directly with typical inputs. Many times this is not possible (i.e. for AJAX or cron code).
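As a sketch, a couple of well-placed statements like these can narrow a problem down quickly (the data structure here is a made-up stand-in for a value returned by the code under test):

```php
<?php
// Example value standing in for a key return data structure.
$result = ['id' => 7, 'items' => ['a', 'b']];

echo "checkpoint after lookup:\n";
print_r($result);                         // dumps arrays/objects readably
echo 'item count: ' . count($result['items']) . "\n";
```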

A more general way of debugging is to use the error_log function instead of echo. With the error_log function you can direct the messages to a file of your choosing with

error_log($message,3,"path/filename")

or to the configured error_log mentioned earlier via

error_log($message)

A bonus when using the error_log() function with the configured log is that each entry is timestamped automatically. (Note that with the three-argument file form shown above, PHP writes the raw message to the file, so add your own timestamp if you need one.)
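Note that error_log() only accepts strings, so for arrays and objects it helps to combine it with print_r($value, true), which returns the dump instead of printing it. A small wrapper might look like this (the function name is mine, not part of PHP):

```php
<?php
// Log any value -- scalars as-is, arrays/objects via print_r -- with a timestamp.
function debug_log($value, $label = '')
{
    $text = is_scalar($value) ? (string)$value : print_r($value, true);
    error_log(date('Y-m-d H:i:s') . " $label: $text");
}

debug_log(['user' => 42, 'ok' => false], 'login state');
```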

If a runtime error is expected, a try/catch statement should be placed around the code to handle it in a way that doesn't cause the script to stop abruptly. This way the script will continue to run, the error will be logged, and you will know in which section of code it occurred. If a blocking error goes uncaught in an AJAX responder script, the calling application might receive a malformed response and fail with a parse error of its own. Keep in mind that a try/catch statement only helps when a blocking exception occurs; it will not help you debug functional runtime errors. The structure of this type of code testing is as follows:

try {
    // your new code
} catch (Exception $E) {
    error_log($E->getMessage());
}

In this article we have discussed simple code/test cycle techniques for PHP.  Tune in next time for part 2 where we will review using a debugger such as XDebug.

Process Automation: Customer Relationship Management Software – CRM

One of the modern tools of business automation is customer relationship management software commonly known as CRM software. CRM software provides process automation and a centralized data storage point for all things pertaining to suspects, prospects, and customers.

Services offered by the CRM help businesses to organize, synchronize, and manage business processes with the goal of finding, attracting, and winning new clients, and then nurturing and retaining those relationships by providing a system that facilitates communication between the client and various departments such as sales, marketing, and support.  The software often serves as the digital hub of a company-wide marketing strategy that places a high value on customer relationships and ties financial data to every step involved in acquiring and servicing a customer.

At Catontech we’ve been working with the open source solution Vtiger CRM, evaluating it for future use and for integration with web presence packages, so that leads generated by online marketing can be stored in the system for later marketing efforts.  You can view the video below to get an idea of how the system looks and feels.


What we like about Vtiger is that it is open source and runs on Linux, Apache, MySQL, and PHP. It can be implemented at a very low cost, has a low learning curve, and integrates nicely with WordPress.  Vtiger CRM is relatively easy to learn; you can get up to speed in about half an hour.

We have begun testing the use of Vtiger Customer Relationship Management Software on three different projects, and we will be reporting on the outcome of those ventures in the near future.

– Joel

PHP quick data export to spreadsheet

In our last post we talked about the limitations of creating large spreadsheets with the PHP library PHPExcel.  Today we’ll discuss a quick workaround that will let you create large, unformatted spreadsheets for quick and easy data export from your applications.  Here’s how we do a PHP quick data export to spreadsheet:

  • Properly escape your data.
  • Write data with fputcsv($filehandle, $yourdataarray, "\t");
  • Include table headings and other data.
  • Use an xls file extension.

Properly escape your data: You’ll be using a tab-delimited format for output, so you’ll need to replace any tabs in your data with an escaped tab or five spaces so that they don’t corrupt the output format.  For our purposes we’ll use five spaces in the following function.
function TSV_Escape($str)
{
    if (strlen($str) == 0)
    {
        return '';
    }
    // Tabs are the field delimiter, so replace them with five spaces.
    $str = str_replace("\t", '     ', $str);
    // Newlines would break the row structure, so encode them.
    $str = str_replace("\n", '%n', $str);
    // Double up quotes so quoted fields survive.
    return str_replace('"', '""', $str);
}

Write your data with fputcsv:  First you’ll need to open a file handle like this:

// Set file path and initialize handle.
$TSVFile = $_SERVER['DOCUMENT_ROOT'] . "/MyDataFile.xls";
$TSVFileHandle = fopen($TSVFile, 'w');
// Write your headers.. See next section :)
// Write your query data ($objdbconn is an open mysqli connection, $sql your query).
if ($objdbconn->real_query($sql))
{
    if ($result = $objdbconn->store_result())
    {
        echo date('H:i:s') . " Processing " . $result->num_rows . " records.<br>\n";
        while ($row = $result->fetch_assoc())
        {
            $writeArr = array();
            foreach($row as $value)
            {
                $writeArr[] = TSV_Escape($value);
            }
            fputcsv($TSVFileHandle, $writeArr, "\t");
            unset($writeArr);
        }
    }
}
// Close your file handle.
fclose($TSVFileHandle);

Write your headers and other data:  You may want to include some information about your spreadsheet and you’ll certainly want to include the column header row.  Here’s a brief example of how this could be done:

$headings = str_replace('`', '', "Heading One,`Database Field 1`,`Database Field 2`,`Database Field 3`");
$headArr = explode(',', $headings);

$RepDat[]="Created: " . date('H:i:s d-m-Y');
$RepDat[]="Created by: " . $UserName;
// Write the report metadata and column headings to the file.
fputcsv($TSVFileHandle,$RepDat,"\t");
fputcsv($TSVFileHandle,$headArr,"\t");

Use an xls file extension.  Sure, you could use a tsv file extension, but if you want Excel or Open Office to open the file by default with minimal headache, the xls extension will do the trick.  You’ll get a message when you open the report stating that the file is not in the same format as the extension, but you won’t have to worry about the tsv file extension being registered to the right application.

This method will kick out a tab-delimited spreadsheet in a matter of seconds and can safely handle large record sets.  We used it for a while as a temporary fix for reporting until we came up with a better method, and we still use it to create database load files when parsing complex legacy reports or backing up database records to file.

I hope this php quick data export method is helpful.

Joel

My findings on large complex sql statements with mysqli

My findings when developing using large complex sql statements with mysqli in php

The granular approach is more efficient

Sometimes it’s tempting to execute one large, complex SQL statement with mysqli to “do it all” in a single update, but I’ve found this approach to be inefficient in terms of speed and functionality.  My advice is to use small, precise SQL statements instead of large, cumbersome queries. The small statements execute quickly and will get the job done faster than the large complex query. For instance, in one case I had a complex, dynamically generated query with over 230 update statements that updated a single table and concatenated (CONCAT) text onto a text field.  This query typically hung and took so long to execute that it caused problems in my AJAX environment, locking tables and creating conflicts with other operations that were trying to read and write using update and select statements.  Once I broke it up into 230 small queries, the execution time went down from 30 seconds (and failing) to 3 to 5 seconds (and never failing).  In another case, when translating CSV files into a database, I had an issue with a multi query that hung on large files, but once I broke it down into running smaller, specific queries it was able to execute up to 10,000 queries in a few seconds (no, I’m not exaggerating).
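A sketch of the granular approach: one small statement, prepared once and executed many times, instead of a single giant multi-statement UPDATE. The credentials, table, and column names here are made up for illustration:

```php
<?php
// Connect (placeholder credentials).
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

// One small, precise statement, run once per row.
$stmt = $db->prepare(
    "UPDATE notes SET body = CONCAT(body, ?) WHERE id = ?"
);

$updates = [[' first addition', 1], [' second addition', 2]];
foreach ($updates as [$text, $id]) {
    $stmt->bind_param('si', $text, $id);
    $stmt->execute();   // quick execution, short-lived lock on each run
}
$stmt->close();
$db->close();
```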

In addition to these tidbits, here are some other findings… MySQL lets you set the priority level of queries, and by default write operations have a higher priority than selects, so in theory it isn’t possible to run a select on a table that is in the middle of an update transaction when autocommit is on (which it is by default). MySQL locks the table while performing the update, and the select has to wait until it is unlocked. This can become cumbersome and unwieldy with large, complex queries, and so far I haven’t had very much success with them.

When it comes to complex SQL statements with mysqli, achieving granularity through smaller, specific, interrelated SQL statements seems to produce far better results for speed and efficiency.

May the source be with you

– Joel

Missing documentation for using Flexigrid

Recently one of the projects I’ve been working on, involving database tables with hundreds of thousands of records, required a grid to display data.  After doing a little searching I found a jQuery project called Flexigrid.  In this article I’ll talk a little about my experiences with Flexigrid and some things I found that weren’t readily available on the internet.

What is Flexigrid and what does it do?

Flexigrid is a lightweight, highly customizable grid, with deliberately limited functionality, for displaying table data in a spreadsheet-styled table.  It offers dynamic AJAX loading of XML or JSON data from a database and displays records in sortable, searchable, and rearrangeable columns, organized into navigable pages containing as many records as your user wants to see on screen at one time.  It is remarkably easy to begin using because it ships with basic functionality examples in PHP, JavaScript, and SQL.  You can download it from http://flexigrid.info or http://code.google.com/p/flexigrid/ and view discussion on it here: http://groups.google.com/group/flexigrid/?pli=1

Issues with Flexigrid:

  • Flexigrid does not work with the latest version of jQuery; I had to use version 1.5.2 to get it to work. That’s not really a problem when you consider that jQuery UI works just fine with that version.
  • Very limited documentation.
  • No current editing or updating functionality.
  • No mouse over tool tip style information for context sensitive help messages.  Requires customization to add such functionality.
  • Renders differently in different browsers.

My biggest issue was determining how to dynamically load a url.  The available documentation did not show the correct method, which is to first update the url property of the object before calling the reload method, as shown here:

$("#flex1").flexOptions({'url': GetGridURL()});

$("#flex1").flexReload();

The flexReload() function accepts a url as an argument, but it does not use it, so the only way to point the grid at a different url is to call flexOptions first and update it there.

Also, I wanted to process the grid’s data and set the row text color accordingly.  I found some references in the discussion forum and was able to piece together this solution.  If you add an onSuccess : YourFunction parameter to the Flexigrid options, it will call YourFunction each time the grid is successfully loaded.  I passed this function to onSuccess :

function StyleRows() {
    $("tr").each(function() {
        // Read the first cell of the row to decide its color.
        var st = $(this).find("td").eq(0).text();
        if (st != "0" && st != "1") { $(this).css("color", "black"); }
        else if (st == "1")         { $(this).css("color", "green"); }
        else if (st == "0")         { $(this).css("color", "red"); }
    });
}

This information was not readily available.

I hope by posting these things here that I’ve saved you some time.

– Joel

SQL tables for website development of html forms

A programmer’s resource: SQL tables for website development of HTML forms using selects and multiselects.

The following SQL statements create tables that you can use to generate HTML selects and multiselects for web application menus.  I’ve posted some of the most common tables that I think are needed.  Let me know if there is one that I don’t have listed and I’ll add it.  You can copy these SQL statements to a file or execute them directly to create the tables.

States:

CREATE TABLE `States` (
`Abr` varchar(2) NOT NULL,
`Name` varchar(56) NOT NULL,
`Sort_Order` smallint(6) NOT NULL,
PRIMARY KEY  (`Abr`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `States` VALUES('AL', 'Alabama', 1);
INSERT INTO `States` VALUES('AK', 'Alaska', 2);
INSERT INTO `States` VALUES('AZ', 'Arizona', 3);
INSERT INTO `States` VALUES('AR', 'Arkansas', 4);
INSERT INTO `States` VALUES('CA', 'California', 5);
INSERT INTO `States` VALUES('CO', 'Colorado', 6);
INSERT INTO `States` VALUES('CT', 'Connecticut', 7);
INSERT INTO `States` VALUES('DE', 'Delaware', 8);
INSERT INTO `States` VALUES('DC', 'District Of Columbia', 9);
INSERT INTO `States` VALUES('FL', 'Florida', 10);
INSERT INTO `States` VALUES('GA', 'Georgia', 11);
INSERT INTO `States` VALUES('HI', 'Hawaii', 12);
INSERT INTO `States` VALUES('ID', 'Idaho', 13);
INSERT INTO `States` VALUES('IL', 'Illinois', 14);
INSERT INTO `States` VALUES('IN', 'Indiana', 15);
INSERT INTO `States` VALUES('IA', 'Iowa', 16);
INSERT INTO `States` VALUES('KS', 'Kansas', 17);
INSERT INTO `States` VALUES('KY', 'Kentucky', 18);
INSERT INTO `States` VALUES('LA', 'Louisiana', 19);
INSERT INTO `States` VALUES('ME', 'Maine', 20);
INSERT INTO `States` VALUES('MD', 'Maryland', 21);
INSERT INTO `States` VALUES('MA', 'Massachusetts', 22);
INSERT INTO `States` VALUES('MI', 'Michigan', 23);
INSERT INTO `States` VALUES('MN', 'Minnesota', 24);
INSERT INTO `States` VALUES('MS', 'Mississippi', 25);
INSERT INTO `States` VALUES('MO', 'Missouri', 26);
INSERT INTO `States` VALUES('MT', 'Montana', 27);
INSERT INTO `States` VALUES('NE', 'Nebraska', 28);
INSERT INTO `States` VALUES('NV', 'Nevada', 29);
INSERT INTO `States` VALUES('NH', 'New Hampshire', 30);
INSERT INTO `States` VALUES('NJ', 'New Jersey', 31);
INSERT INTO `States` VALUES('NM', 'New Mexico', 32);
INSERT INTO `States` VALUES('NY', 'New York', 33);
INSERT INTO `States` VALUES('NC', 'North Carolina', 34);
INSERT INTO `States` VALUES('ND', 'North Dakota', 35);
INSERT INTO `States` VALUES('OH', 'Ohio', 36);
INSERT INTO `States` VALUES('OK', 'Oklahoma', 37);
INSERT INTO `States` VALUES('OR', 'Oregon', 38);
INSERT INTO `States` VALUES('PA', 'Pennsylvania', 39);
INSERT INTO `States` VALUES('RI', 'Rhode Island', 40);
INSERT INTO `States` VALUES('SC', 'South Carolina', 41);
INSERT INTO `States` VALUES('SD', 'South Dakota', 42);
INSERT INTO `States` VALUES('TN', 'Tennessee', 43);
INSERT INTO `States` VALUES('TX', 'Texas', 44);
INSERT INTO `States` VALUES('UT', 'Utah', 45);
INSERT INTO `States` VALUES('VT', 'Vermont', 46);
INSERT INTO `States` VALUES('VA', 'Virginia', 47);
INSERT INTO `States` VALUES('WA', 'Washington', 48);
INSERT INTO `States` VALUES('WV', 'West Virginia', 49);
INSERT INTO `States` VALUES('WI', 'Wisconsin', 50);
INSERT INTO `States` VALUES('WY', 'Wyoming', 51);
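As an example of putting one of these tables to use, here is a sketch that renders the States table as an HTML select (the mysqli credentials are placeholders):

```php
<?php
// Connect (placeholder credentials) and build a <select> from the States table.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$options = '';
$result = $db->query("SELECT `Abr`, `Name` FROM `States` ORDER BY `Sort_Order`");
while ($row = $result->fetch_assoc()) {
    $abr  = htmlspecialchars($row['Abr']);
    $name = htmlspecialchars($row['Name']);
    $options .= "<option value=\"$abr\">$name</option>\n";
}

echo "<select name=\"state\">\n$options</select>\n";
```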


Age Range:

CREATE TABLE `AgeRange` (
`ID` tinyint(4) NOT NULL,
`Range` varchar(128) NOT NULL,
PRIMARY KEY  (`ID`),
UNIQUE KEY `Range` (`Range`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `AgeRange` VALUES(0, 'Under 18');
INSERT INTO `AgeRange` VALUES(1, '18 to 24');
INSERT INTO `AgeRange` VALUES(2, '25 to 29');
INSERT INTO `AgeRange` VALUES(3, '30 to 34');
INSERT INTO `AgeRange` VALUES(4, '35 to 39');
INSERT INTO `AgeRange` VALUES(5, '40 to 49');
INSERT INTO `AgeRange` VALUES(6, '50 to 59');
INSERT INTO `AgeRange` VALUES(7, '60 to 69');
INSERT INTO `AgeRange` VALUES(8, '70 to 80');
INSERT INTO `AgeRange` VALUES(9, 'Over 80');

Organizational Role

CREATE TABLE `Organizational_Role` (
`ID` tinyint(4) NOT NULL,
`Position` varchar(255) NOT NULL,
UNIQUE KEY `Position` (`Position`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `Organizational_Role` VALUES(0, 'Owner / Operator');
INSERT INTO `Organizational_Role` VALUES(1, 'General Management');
INSERT INTO `Organizational_Role` VALUES(2, 'Administrative Support');
INSERT INTO `Organizational_Role` VALUES(3, 'Marketing');
INSERT INTO `Organizational_Role` VALUES(4, 'Finance');
INSERT INTO `Organizational_Role` VALUES(5, 'Operations');
INSERT INTO `Organizational_Role` VALUES(6, 'Sales');
INSERT INTO `Organizational_Role` VALUES(7, 'IT');
INSERT INTO `Organizational_Role` VALUES(8, 'Legal / Regulator');
INSERT INTO `Organizational_Role` VALUES(9, 'Customer Service');
INSERT INTO `Organizational_Role` VALUES(10, 'Public relations / corporate communications');
INSERT INTO `Organizational_Role` VALUES(11, 'Accounting');
INSERT INTO `Organizational_Role` VALUES(12, 'Cross-functional role');

Employment Level

CREATE TABLE `EmploymentLevel` (
`ID` tinyint(4) NOT NULL,
`Level` varchar(128) NOT NULL,
PRIMARY KEY  (`ID`),
UNIQUE KEY `Level` (`Level`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `EmploymentLevel` VALUES(0, 'CEO, president, owner, senior partner, principal');
INSERT INTO `EmploymentLevel` VALUES(1, 'CXO other than CEO (CFO, CIO, etc.)');
INSERT INTO `EmploymentLevel` VALUES(2, 'Partner / senior executive');
INSERT INTO `EmploymentLevel` VALUES(3, 'Middle Manager');
INSERT INTO `EmploymentLevel` VALUES(4, 'Supervisor');
INSERT INTO `EmploymentLevel` VALUES(5, 'Professional (consultant, legal, accountant, programmer, etc.)');
INSERT INTO `EmploymentLevel` VALUES(6, 'Front-line employee');
INSERT INTO `EmploymentLevel` VALUES(7, 'Administrative staff');
INSERT INTO `EmploymentLevel` VALUES(8, 'Student');
INSERT INTO `EmploymentLevel` VALUES(9, 'Not employed');
INSERT INTO `EmploymentLevel` VALUES(10, 'Retired');

Number of Employees

CREATE TABLE `NumberOfEmployees` (
`ID` tinyint(4) NOT NULL,
`Range` varchar(96) NOT NULL,
PRIMARY KEY  (`ID`),
UNIQUE KEY `Range` (`Range`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `NumberOfEmployees` VALUES(0, '1');
INSERT INTO `NumberOfEmployees` VALUES(1, '2');
INSERT INTO `NumberOfEmployees` VALUES(2, '3');
INSERT INTO `NumberOfEmployees` VALUES(3, '4 to 9');
INSERT INTO `NumberOfEmployees` VALUES(4, '10 to 16');
INSERT INTO `NumberOfEmployees` VALUES(5, '17 to 29');
INSERT INTO `NumberOfEmployees` VALUES(6, '30 to 49');
INSERT INTO `NumberOfEmployees` VALUES(7, '50 to 99');
INSERT INTO `NumberOfEmployees` VALUES(8, '100 to 299');
INSERT INTO `NumberOfEmployees` VALUES(9, '300 to 599');
INSERT INTO `NumberOfEmployees` VALUES(10, '600 to 999');
INSERT INTO `NumberOfEmployees` VALUES(11, '1000 to 1499');
INSERT INTO `NumberOfEmployees` VALUES(12, '1500 to 1999');
INSERT INTO `NumberOfEmployees` VALUES(13, '2000 to 3000');
INSERT INTO `NumberOfEmployees` VALUES(14, 'more than 3000');

Revenue Volume

CREATE TABLE `RevenueVolume` (
`ID` tinyint(4) NOT NULL,
`Volume` varchar(96) NOT NULL,
PRIMARY KEY  (`ID`),
UNIQUE KEY `Volume` (`Volume`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `RevenueVolume` VALUES(0, 'None');
INSERT INTO `RevenueVolume` VALUES(1, 'Less than $100,000');
INSERT INTO `RevenueVolume` VALUES(2, 'Less than $200,000');
INSERT INTO `RevenueVolume` VALUES(3, 'Less than $300,000');
INSERT INTO `RevenueVolume` VALUES(4, 'Less than $400,000');
INSERT INTO `RevenueVolume` VALUES(5, 'Less than $500,000');
INSERT INTO `RevenueVolume` VALUES(6, 'Less than $750,000');
INSERT INTO `RevenueVolume` VALUES(7, 'Less than $1,000,000');
INSERT INTO `RevenueVolume` VALUES(8, 'Less than $2,000,000');
INSERT INTO `RevenueVolume` VALUES(9, 'Less than $5,000,000');
INSERT INTO `RevenueVolume` VALUES(10, 'Less than ten million');
INSERT INTO `RevenueVolume` VALUES(11, 'Less than 50 million');
INSERT INTO `RevenueVolume` VALUES(12, 'Less than 100 million');
INSERT INTO `RevenueVolume` VALUES(13, 'Less than 250 million');
INSERT INTO `RevenueVolume` VALUES(14, 'Less than 500 million');
INSERT INTO `RevenueVolume` VALUES(15, 'Less than 750 million');
INSERT INTO `RevenueVolume` VALUES(16, 'Less than a billion');
INSERT INTO `RevenueVolume` VALUES(17, 'More than a billion');

Time Frame

CREATE TABLE `TimeFrame` (
`ID` tinyint(4) NOT NULL,
`Time Frame` varchar(128) NOT NULL,
`Seconds` bigint(20) NOT NULL,
PRIMARY KEY  (`ID`)
) ENGINE=MyISAM DEFAULT CHARSET=utf8;

INSERT INTO `TimeFrame` VALUES(0, 'One Day', 86400);
INSERT INTO `TimeFrame` VALUES(1, 'Three Days', 259200);
INSERT INTO `TimeFrame` VALUES(2, 'Five Days', 432000);
INSERT INTO `TimeFrame` VALUES(3, 'One Week', 604800);
INSERT INTO `TimeFrame` VALUES(4, 'Ten Days', 864000);
INSERT INTO `TimeFrame` VALUES(5, 'Two Weeks', 1209600);
INSERT INTO `TimeFrame` VALUES(6, 'Fifteen Days', 1296000);
INSERT INTO `TimeFrame` VALUES(7, 'Eighteen Days', 1555200);
INSERT INTO `TimeFrame` VALUES(8, 'Three Weeks', 1814400);
INSERT INTO `TimeFrame` VALUES(9, 'Twenty Five Days', 2160000);
INSERT INTO `TimeFrame` VALUES(10, 'Four Weeks', 2419200);
INSERT INTO `TimeFrame` VALUES(11, 'Thirty Days', 2592000);
INSERT INTO `TimeFrame` VALUES(12, 'Five Weeks', 3024000);
INSERT INTO `TimeFrame` VALUES(13, 'Six Weeks', 3628800);
INSERT INTO `TimeFrame` VALUES(14, 'Forty Five Days', 3888000);
INSERT INTO `TimeFrame` VALUES(15, 'Seven Weeks', 4233600);
INSERT INTO `TimeFrame` VALUES(16, 'Eight Weeks', 4838400);
INSERT INTO `TimeFrame` VALUES(17, 'Sixty Days', 5184000);
INSERT INTO `TimeFrame` VALUES(18, 'Nine Weeks', 5443200);
INSERT INTO `TimeFrame` VALUES(19, 'Ten Weeks', 6048000);
INSERT INTO `TimeFrame` VALUES(20, 'Eleven Weeks', 6652800);
INSERT INTO `TimeFrame` VALUES(21, 'Twelve Weeks', 7257600);
INSERT INTO `TimeFrame` VALUES(22, 'Ninety Days', 7776000);
INSERT INTO `TimeFrame` VALUES(23, 'Sixteen Weeks', 9676800);
INSERT INTO `TimeFrame` VALUES(24, 'Twenty Weeks', 12096000);
INSERT INTO `TimeFrame` VALUES(25, 'Twenty Four Weeks', 14515200);
INSERT INTO `TimeFrame` VALUES(26, 'Twenty Eight Weeks', 16934400);
INSERT INTO `TimeFrame` VALUES(27, 'Thirty Two Weeks', 19353600);
INSERT INTO `TimeFrame` VALUES(28, 'Thirty Six Weeks', 21772800);
INSERT INTO `TimeFrame` VALUES(29, 'Forty Weeks', 24192000);
INSERT INTO `TimeFrame` VALUES(30, 'Forty Four Weeks', 26611200);
INSERT INTO `TimeFrame` VALUES(31, 'Forty Eight Weeks', 29030400);
INSERT INTO `TimeFrame` VALUES(32, 'Fifty Two Weeks', 31449600);
INSERT INTO `TimeFrame` VALUES(33, 'One Year', 31536000);
INSERT INTO `TimeFrame` VALUES(34, 'One Year, Three Months', 39420000);
INSERT INTO `TimeFrame` VALUES(35, 'One Year, Six Months', 47304000);
INSERT INTO `TimeFrame` VALUES(36, 'One Year, Nine Months', 55188000);
INSERT INTO `TimeFrame` VALUES(37, 'Two Years', 63072000);
INSERT INTO `TimeFrame` VALUES(38, 'More Than Two Years', 63072001);
INSERT INTO `TimeFrame` VALUES(39, 'Unknown', 0);
INSERT INTO `TimeFrame` VALUES(40, 'Not time based', 0);

I hope this saves you time.

– Joel