PHP Error Handling and Debugging – Part 1

This article describes the process of testing PHP code. The tips explained here can help decrease your code/test cycle time.

The first thing that you must know in order to plan your code/test process is the environment in which your code will run.

If you have full control of the system, less configuration is required. In this case you can rely on the default settings and simply need to know where the logs are kept by default. On a typical LAMP (Apache) system you can find the log files in /var/log/httpd. Check the documentation for the operating system that you use, as some operating systems use a different directory (e.g. some versions of Ubuntu use /var/log/apache2). By default, error messages from PHP will be written to this directory.

If you are developing on a server where you don’t have access to the default logs, you can configure where your log messages are sent by putting a php.ini file containing the directive

error_log = path_to_log

in the root of the domain.

With this information in mind, we can begin to find code errors.

There are two error types to look for:

  1. parse
  2. runtime

A parse error is the first thing to look for when testing new or modified code. This can be something like a missing semicolon or another grammatical mistake. If a parse error occurs, it will be sent to PHP’s error_log. A simple way to find this kind of error is to load the file directly in a browser (e.g. an AJAX script that would not normally run in a browser could be tested this way for parse errors). With a default PHP installation the parse error will be shown on the screen.

Most errors that are encountered are runtime errors. There are two kinds of runtime errors:

  1. exception
  2. functional

The first kind of runtime error happens when a statement or function call that is grammatically correct encounters an unexpected circumstance, such as an invalid parameter (e.g. fopen('file_that_doesnt_exist', 'r')). This kind of error can only be seen during an actual run of the code with valid inputs. Opening the file in the browser directly usually will not find it, as the inputs will not be those that would typically be encountered. For example, opening an AJAX script that relies on the $_POST array for its input will typically not run many of the branches because of the missing $_POST variables. To find this kind of error, run the script as it would typically be run and check the error log.

A functional runtime error is when the code runs, doesn’t generate an error, but doesn’t produce the expected outputs. To find this error use one or more of the following techniques:

  • echo/printf
  • error_log
  • try/catch

The simplest way to find errors is by adding echo statements to the code. This method can be somewhat tedious and slower to use than others, but a few well placed echo statements that use print_r to show the value of key return data structures can sometimes quickly illuminate the source of the malfunctioning code. The problem with this method is that because it outputs directly to stdout (the web browser) it is only available if the script can be run in the web browser directly with typical inputs. Many times this is not possible (i.e. for AJAX or cron code).
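As a quick sketch of this technique (the $result structure here is a made-up example), a print_r dump wrapped in pre tags keeps the output readable in a browser:

```php
// Hypothetical return structure from the code under test.
$result = array('status' => 'ok', 'rows' => array(1, 2, 3));

// Wrap the dump in <pre> tags so the array structure stays readable.
echo '<pre>';
print_r($result);
echo '</pre>';

// print_r() with true as the second argument returns the dump as a string
// instead of printing it, which pairs nicely with error_log() later on.
$dump = print_r($result, true);
```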

A more general way of debugging is to use the error_log function instead of echo. With the error_log function you can direct the messages to a file of your choosing (the path here is an example) with

error_log($message, 3, '/path/to/your.log');

or to the configured error_log mentioned earlier via

error_log($message);

A bonus when using the error_log() function is that you also get a timestamp for each error logged.

If a runtime error is expected, a try/catch statement should be placed to ignore it or otherwise handle it in a way that doesn’t cause the script to stop abruptly. This way the script will continue to run, the error will be logged, and you will know in which section of code the error occurred. If the blocking error had gone uncaught (in the case of AJAX responder script errors), the calling application might have received a malformed response (parse error). A try/catch statement is only helpful when a blocking exception occurs; it will not help to debug functional runtime errors. The structure of this type of code testing is as follows:

try {
    // your new code
} catch (Exception $e) {
    // Log the exception and let the script continue.
    error_log('Caught exception: ' . $e->getMessage());
}
In this article we have discussed simple code/test cycle techniques for PHP.  Tune in next time for part 2 where we will review using a debugger such as XDebug.

TCPDF PHP package for PDF writing

I recently had the opportunity to implement the TCPDF package for a midsized project. This article attempts to document my experiences with the API, its strengths, weaknesses, and ease of use.

The package is quite simple to implement at a high level, and following the included examples I was able to create a writer for my project in a matter of days. I appreciated the flexibility of being able to use HTML for layout. Also appreciated was the ability to override the TCPDF class to create custom headers and footers. I utilized this to place a reference to the company logo in the website’s image directory rather than in the TCPDF package’s image directory. I was also able to create a more detailed header layout than the default using this method. Once the PDF document is constructed, TCPDF provides some helpful output options, including sending the document directly to the browser. This is a nice option because it allows previewing in an iframe and doesn’t take up space on the server.

Initially I constructed a string containing inline style and the data in one large HTML table, and wrote the PDF document using one writeHTML call. An example of this follows:

$style = "<style type=\"text/css\">\n";
$style .= " table {\n";
$style .= "  color:red;\n";
$style .= " }\n";
$style .= " td {\n";
$style .= "  border:none;\n";
$style .= " }\n";
$style .= "</style>\n";
$table = "<table>\n";
$table .= " <tr>\n";
$table .= "  <td>example</td>\n";
$table .= " </tr>\n";
$table .= "</table>\n";
$tcpdfObj = new TCPDF('L', 'pt', 'letter', true, 'UTF-8', false);
$tcpdfObj->SetHeaderData("logo.png", 100, 'pdf title', 'header text');
$tcpdfObj->AddPage();
$tcpdfObj->writeHTML($style . $table, true, false, true, false, '');

This first implementation worked for a small test database, but failed when I tested it against larger ones, producing out-of-memory errors. Raising PHP’s memory_limit didn’t solve the problem. I was able to work around this by dividing the writeHTML call into several smaller calls, each with a copy of the inline style and an HTML table containing several rows of the original table, but this added to the running time. writeHTML seemed to work with about 2500 cells at a time. Having overcome the memory limitation, I found that the running time for large datasets was unacceptable: in the range of 10 minutes or more for a 50000-cell document. Fortunately TCPDF has faster Cell and MultiCell functions, although when using them layout becomes much more restrictive. Using these faster calls reduced the running time by 50%, but this was still too slow for my project.
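The chunked approach looked roughly like the following sketch (the chunk size, the $rows source, and the buildRowHtml helper are illustrative assumptions, not TCPDF API):

```php
// Write the table in chunks of rows, repeating the inline style each time,
// so no single writeHTML call has to lay out thousands of cells at once.
$chunkSize = 100; // rows per writeHTML call (tune toward ~2500 cells)
foreach (array_chunk($rows, $chunkSize) as $chunk) {
    $html = $style . "<table>\n";
    foreach ($chunk as $row) {
        $html .= buildRowHtml($row); // hypothetical helper returning "<tr>...</tr>"
    }
    $html .= "</table>\n";
    $tcpdfObj->writeHTML($html, true, false, true, false, '');
}
```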

To summarize: the TCPDF package works, offers some flexibility of layout and output, and is quickly implemented, but it doesn’t scale well.

PHP quick data export to spreadsheet

In our last post we talked about the limitations of creating large spreadsheets with PHP library PHPExcel.  Today we’ll discuss a quick workaround that will allow you to create large unformatted spreadsheets for quick and easy data export from your applications.  Here’s how we do a php quick data export to spreadsheet:

  • Properly escape your data.
  • Write data with fputcsv($filehandle, $yourdataarray, "\t");
  • Include table headings and other data.
  • Use an xls file extension.
Properly escape your data... You’ll be using a tab delimited format for output, so you’ll need to replace tabs with an escaped tab or five spaces so that your data doesn’t corrupt the output format. For our purposes we’ll use five spaces in the following function.
function TSV_Escape($str)
{
    if (strlen($str) == 0) {
        return '';
    }
    // Replace tabs with five spaces so they can't break the delimiter.
    $str = str_replace("\t", '     ', $str);
    // Double up embedded quotes so quoted fields stay intact.
    return str_replace('"', '""', $str);
}

Write your data with fputcsv: First you’ll need to open a file handle like this…

// Set file path and initialize handle.
$TSVFile = $_SERVER['DOCUMENT_ROOT'] . "/MyDataFile.xls";
$TSVFileHandle = fopen($TSVFile, 'w');

// Write your headers.. See next section :)

// Write query data to the file.
if ($objdbconn->real_query($sql)) {
    if ($result = $objdbconn->store_result()) {
        echo date('H:i:s') . " Processing " . $result->num_rows . " records.<br>\n";
        while ($row = $result->fetch_assoc()) {
            $writeArr = array();
            foreach ($row as $value) {
                $writeArr[] = TSV_Escape($value);
            }
            fputcsv($TSVFileHandle, $writeArr, "\t");
        }
    }
}

// Close your file handle.
fclose($TSVFileHandle);

Write your headers and other data:  You may want to include some information about your spreadsheet and you’ll certainly want to include the column header row.  Here’s a brief example of how this could be done:

$headings = str_replace('`', '', "Heading One,`Database Field 1`,`Database Field 2`,`Database Field 3`");
$headArr = explode(',', $headings);

$RepDat[] = "Created: " . date('H:i:s d-m-Y');
$RepDat[] = "Created by: " . $UserName;

// Write the report information and column headings to the file.
fputcsv($TSVFileHandle, $RepDat, "\t");
fputcsv($TSVFileHandle, $headArr, "\t");

Use an xls file extension.  Sure, you could use a tsv file extension, but if you want Excel or OpenOffice to open the file by default with minimal headache, the xls extension will do the trick.  You’ll get a message when you open the report stating that the file is not in the same format as the extension, but you won’t have to worry about whether the tsv file extension is registered to the right application.

This method will kick out a tab delimited spreadsheet in a matter of seconds and can safely handle large record sets.  We used it for a while as a temporary fix for reporting until we came up with a better method, and we still use it to create database load files when parsing complex legacy reports or backing up database records to file.

I hope this php quick data export method is helpful.


Excel spreadsheets with PHP – a PHPExcel overview

Writing Excel spreadsheets with PHP using PHPExcel:  The pros and cons

PHPExcel is a PHP library for reading and writing Excel spreadsheets.  It is very well documented and has good community support.  When searching the web for an open source solution for writing spreadsheets in PHP, I reviewed everything from PHPClasses to PEAR, and finally I came across this package.  It’s the most up-to-date looking project out there for Excel spreadsheets with PHP.  I didn’t want to use a project that was in the twilight of its development cycle, so I opted to try PHPExcel.  Here’s a brief overview of what I found:

Pros – The good points:

  • Very well documented.
  • Good examples
  • Support forum with active participation.
  • Supports many of the built-in Excel functions.
  • Easy to style
  • Supports modern Excel formats plus CSV, HTML, and PDF
  • Write spreadsheets
  • Read spreadsheets

Cons – what I couldn’t stand:

  • SLOW….  This library takes considerable time to produce a spreadsheet when memory-conserving disk caching is used.  Consider running your process in the background.
  • Memory intensive.  PHPExcel offers disk caching to reduce its in-memory size, but it is basically an in-memory spreadsheet and must load completely into memory before being written to file.  This can considerably hinder your performance: the spreadsheet object and writer object can take over 25 MB of RAM while the program is running, with an additional 1 KB of RAM per cell loaded in memory.  For an example of how this can impact your server or hosting service, consider the load of a 50-column, 10000-row spreadsheet, which would take 525 MB of RAM to load when writing to file.  While it’s possible to set the script’s memory limit high enough to handle this, you’ll find that if several of these scripts try to run at the same time you’ll have a mess that could crash your server.
  • Writing to file is batch processed at the end of the script and happens all at one time.  Regardless of your disk caching settings, the entire spreadsheet is loaded into memory and then written to the file.  This creates a serious bottleneck if you have a large file and will oftentimes crash your script.  The only exception is when PHPExcel writes a CSV file, in which case it allows a more granular approach to writing memory to file.

Workarounds that will reduce memory usage and improve speed:

  • Set a cells value explicitly.  Setting the cells value and data type explicitly is much faster than writing cell values for PHPExcel to determine the data type of.
  • Disable formula precalculation by setting it to false.
  • Use disk caching.
  • Don’t try to use it to create reports from large data sets.
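A sketch of those workarounds against the PHPExcel 1.x API (the file name and cell values are examples):

```php
// Enable disk caching before the workbook is created.
PHPExcel_Settings::setCacheStorageMethod(
    PHPExcel_CachedObjectStorageFactory::cache_to_discISAM);

$objPHPExcel = new PHPExcel();
$sheet = $objPHPExcel->getActiveSheet();

// Set the value and data type explicitly so PHPExcel skips type detection.
$sheet->setCellValueExplicit('A1', '12345', PHPExcel_Cell_DataType::TYPE_STRING);

// Disable formula precalculation on the writer before saving.
$objWriter = PHPExcel_IOFactory::createWriter($objPHPExcel, 'Excel2007');
$objWriter->setPreCalculateFormulas(false);
$objWriter->save('report.xlsx');
```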

In summary:  If you are trying to read and write small spreadsheets with fewer than 250,000 fields or cells, this library will be a good fit for your project.  It’s great for summary reporting!!  However, if you’ve got extremely large record sets that you need to send to a spreadsheet, you’ll need a different approach.  I’ll be writing about how we did that in the next blog post.

Stay tuned and may the source be with you!!
– Joel

My findings on large complex sql statements with mysqli

My findings from developing with large, complex SQL statements using mysqli in PHP

The granular approach is more efficient

Sometimes it’s tempting to execute one large, complex SQL statement with mysqli to “do it all” in a single update, but I’ve found this approach to be inefficient in terms of speed and functionality.  My advice is to use small, precise SQL statements instead of large, cumbersome queries. The small statements execute quickly and will get the job done faster than the large complex query. For instance, in one case I had a complex, dynamically generated query with over 230 update statements that updated a single table and concatenated (CONCAT) text to a text field.  This query typically hung and took so long to execute that it caused problems in my ajax environment, locking tables and creating conflicts with other operations that were trying to read and write using update and select statements.  Once I broke it up into 230 small queries, the execution time went down from 30 seconds (and failing) to 3 to 5 seconds (and never failing).  In another case, when translating CSV files to a database, I had an issue with a multi query that hung on large files, but once I broke it down into smaller, specific queries it was able to execute up to 10000 queries in a few seconds (no, I’m not exaggerating).
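In PHP, this kind of breakup can be as simple as looping a single prepared statement (the $conn connection, the $updates array, and the table and column names are examples, not the actual project code):

```php
// $conn is an open mysqli connection; $updates maps row ids to text to append.
// One prepared statement, executed once per row, replaces the giant multi-update.
$stmt = $conn->prepare('UPDATE notes SET body = CONCAT(body, ?) WHERE id = ?');
foreach ($updates as $id => $text) {
    $stmt->bind_param('si', $text, $id);
    $stmt->execute();
}
$stmt->close();
```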

In addition to these tidbits, here are some other findings… MySQL lets you set the priority level of queries, and by default update queries have a higher priority than select queries, so in theory it isn’t possible to run a select on a table that is in the middle of an update transaction if autocommit is being used (it is on by default). MySQL locks the table when performing the update, and the select has to wait until it is unlocked. This can become cumbersome and unwieldy when performing large complex queries, and so far I haven’t had much success with them.

When it comes to complex sql statements with mysqli achieving granularity through smaller interrelated specific sql statements seems to get far greater results for speed and efficiency.

May the source be with you

– Joel

PHP and curl for remote site data extraction

Sometimes you’ll need to get data from another site on the internet.  In this article we’ll go over a quick and easy how-to for using PHP and cURL for remote site data extraction.

To get things started we’ll need to initialize a curl object and set some parameters so it can act as a web browser and log into the targeted remote site.

        // Initialize cURL
        $Curl_Obj = curl_init(); 

        // Enable Posting.
        curl_setopt($Curl_Obj, CURLOPT_POST, 1);

        // Enable Cookies
        curl_setopt ($Curl_Obj, CURLOPT_COOKIEJAR, 'cookie.txt'); 

        // Set the browser you will emulate
        $userAgent = 'Mozilla/5.0 (X11; Linux i686; rv:2.0.1) Gecko/20100101 Firefox/4.0.1';
        curl_setopt($Curl_Obj, CURLOPT_USERAGENT, $userAgent);

        // Don't include the header in the output.
        curl_setopt ($Curl_Obj, CURLOPT_HEADER, 0);

        // Allow referer field when following Location redirects.
        curl_setopt($Curl_Obj, CURLOPT_AUTOREFERER, TRUE);

        // Follow server redirects.
        curl_setopt($Curl_Obj, CURLOPT_FOLLOWLOCATION, 1);

        // Return output as string.
        curl_setopt ($Curl_Obj, CURLOPT_RETURNTRANSFER, 1);

You can find a complete list of PHP cURL options in the PHP manual’s curl_setopt documentation if you need more details.

Now that we’ve got our curl object set up, let’s post some login credentials to a login script.  To do this we’ll need to get the input ids from the login form, supply their values, and set up the post with the curl object.  Then we’ll post the credentials to the script named in the action attribute of the login form.  Here’s how we do it:

       // Set up post fields from login form.
       curl_setopt($Curl_Obj, CURLOPT_POSTFIELDS, "$UserFieldId=$UserName&$PasswordFieldId=$Password&$AnotherFieldId=$AnotherValue");

       // Set the url to which the data will be posted.
       curl_setopt ($Curl_Obj, CURLOPT_URL, '');

       // Execute the post and get the output.
       $output = curl_exec ($Curl_Obj);

       // Empty the post fields so you don't re-post on the next request.
       curl_setopt($Curl_Obj, CURLOPT_POSTFIELDS, "");

At this point we’ve obtained the output of the login attempt, and we can parse it to see if the login was successful using PHP string manipulation functions.  For the sake of simplicity* I’ll leave that part to you.  Once you’ve ensured that you’re logged in, you can proceed to navigate to the next url using the steps below:

       // Set curl object url option.
       curl_setopt ($Curl_Obj, CURLOPT_URL, $Get_This_Url);
       // Execute query and obtain content.
       $output = curl_exec($Curl_Obj);

Finally you’ve downloaded the content from the page that was not accessible prior to your virtual login.  You can parse it with php’s string functions or use xpath to read its content if you’d like.  Don’t forget to close the curl object before you exit your script!!!

// Close curl object.
curl_close ($Curl_Obj);


Happy coding and please use responsibly.

– Joel


* Here’s a simple string extraction function you can use when using PHP and cURL for remote site data extraction.

function extract_snippet($html, $Start_Term, $End_Term, $PrintOutput = false)
{
    $start = strpos($html, $Start_Term);
    if ($start !== false) {
        $T_Sel = $start + strlen($Start_Term);
        $end = strpos($html, $End_Term, $T_Sel);
        if ($end === false) {
            return false;
        }
        $snippet = substr($html, $T_Sel, $end - $T_Sel);
        if ($PrintOutput) {
            print "\n<br>$snippet<br>";
        }
        return $snippet;
    }
    return false;
}

A quick php data sanitation guide

PHP data sanitation is the practice of testing inputs and outputs against acceptable ranges of data to ensure that a script will produce the desired result.  Data sanitation is sometimes referred to as a sanity check, because insane things tend to happen when a script gets values it was never intended to process, or renders values to the user’s browser that break the HTML document or the JavaScript functionality of the page, or worse yet cause the page to work in a manner it was never intended to.  PHP data sanitation for input passed in by the user can be accomplished in three steps, and data sanitation for output to the user is even simpler.

Three steps for a PHP script’s data sanitation on input:

Step 1: Check for the existence of the required inputs from the four input sources you can use: $_REQUEST[‘var_name’], $_POST[‘var_name’], $_GET[‘var_name’], and $_COOKIE[‘var_name’]. Use the function isset() to determine whether the variable is set and doesn’t have a null value.  Use the function empty() to do the same type of test while also rejecting values that evaluate to empty, such as 0, '', and null.  Tip: don’t use $_REQUEST, as you cannot validate where the data came from.

Step 2: Check inputs for approved data types. If the type doesn’t match, reject the input value as tainted.  Functions for checking data types include:

  • is_int — Check for integer values
  • is_float — Check for floating point values
  • is_string — Check for string values
  • is_bool — Check for boolean values
  • is_array — Check for arrays
  • is_numeric — Check for numeric strings and numbers

Step 3: Whitelist the input values by checking them against acceptable values, and reject inputs that do not meet your specified input parameters.  Check strings for acceptable values using the character type functions and numeric values using the comparison operators.   Ctype functions include:

  • ctype_alnum — Check for alphanumeric character(s)
  • ctype_alpha — Check for alphabetic character(s)
  • ctype_cntrl — Check for control character(s)
  • ctype_digit — Check for numeric character(s)
  • ctype_graph — Check for any printable character(s) except space
  • ctype_lower — Check for lowercase character(s)
  • ctype_print — Check for printable character(s)
  • ctype_punct — Check for any printable character which is not whitespace or an alphanumeric character
  • ctype_space — Check for whitespace character(s)
  • ctype_upper — Check for uppercase character(s)
  • ctype_xdigit — Check for character(s) representing a hexadecimal digit

Comparison operators include:

  • $a == $b — Equal: TRUE if $a is equal to $b after type juggling.
  • $a === $b — Identical: TRUE if $a is equal to $b, and they are of the same type.
  • $a != $b — Not equal: TRUE if $a is not equal to $b after type juggling.
  • $a <> $b — Not equal: TRUE if $a is not equal to $b after type juggling.
  • $a !== $b — Not identical: TRUE if $a is not equal to $b, or they are not of the same type.
  • $a < $b — Less than: TRUE if $a is strictly less than $b.
  • $a > $b — Greater than: TRUE if $a is strictly greater than $b.
  • $a <= $b — Less than or equal to: TRUE if $a is less than or equal to $b.
  • $a >= $b — Greater than or equal to: TRUE if $a is greater than or equal to $b.

Pulling it all together with some pseudo code, encapsulated in a function: IF isset(YourPostVar) AND ctype_digit(YourPostVar) AND YourPostVar >= 0 THEN return YourPostVar ELSE return false.  As you can see, we combined all the steps here to ensure that the numeric value we received was set and was a digit with a value of zero or greater.
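A minimal PHP sketch of that pseudo code (the field name, and passing the input array in as a parameter for testability, are my own choices):

```php
// Return the validated non-negative integer, or false if the input is
// missing or fails the whitelist. ctype_digit() only accepts strings made
// entirely of digits, so negatives and non-numeric strings are rejected.
function get_nonnegative_int($inputs, $key)
{
    if (isset($inputs[$key]) && ctype_digit((string) $inputs[$key])) {
        return (int) $inputs[$key];
    }
    return false;
}

// Typical call site: $qty = get_nonnegative_int($_POST, 'quantity');
```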

Note: When inserting data into databases, use the DBMS’s escape-string functionality to ensure that variables are properly escaped.  If the DBMS doesn’t have an available escape-string function you can use addslashes().

Output sanitation is very simple: always wrap output to the user in htmlentities() or htmlspecialchars() when sending data straight from inputs or a database.
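For example (the comment string is made up), htmlspecialchars() turns markup characters into entities so the browser renders them as text instead of executing them:

```php
// Raw input that would execute as script if echoed directly.
$comment = '<script>alert("gotcha")</script>';

// ENT_QUOTES converts < > & plus both double and single quotes.
$safe = htmlspecialchars($comment, ENT_QUOTES);
echo $safe;
```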

Hope this helps.

Joel Caton

A simple ajax page update example using jquery

An ajax example using jquery


Download the latest version of jquery and load it onto your server. Create an html file or PHP script and output the html detailed below:
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
  "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<title>Quick Ajax with jquery Demo</title>
<script type="text/javascript" src=""></script>
<script type="text/javascript">
function AjaxUpdate()
{
    /* Load the responder script's output into the div (script name is an example). */
    $("#MyDiv").load("responder.php");
}
</script>
</head>
<body>
<div id="MyDiv">
<img src="" alt="Loading content..." style="display:block; margin-left:auto; margin-right:auto; vertical-align:middle;"/>
</div>

<!-- Use this method for user event loading. -->
<button onclick="AjaxUpdate();">Update Page</button>

<script type="text/javascript">
// Use this for delayed loading
setTimeout("AjaxUpdate()", 30000);
</script>
</body>
</html>

Now create a PHP script to send the information to the page. The data you send to the page will go into the div with id "MyDiv".

In this example, when the user clicks the Update button the div contents will refresh. I’ve also included an example on the page that will update the div 30 seconds after the page loads. You can write JavaScript back to the page with the PHP script, so you could cause it to load again if you wanted to.

Loading graphics for your ajax powered web applications are freely available from online generators.

Hope this helps.

– Joel

A guide to PHP Data Sanitation – Sanity Checking practices to ensure that your scripts are doing what you’ve designed them to do.

What is Data Sanitation / Sanity Checking?

Data sanitation / sanity checking is the process of filtering the inputs to each script, function, and SQL statement to ensure that each is within a tolerable, expected range that will let the application perform as planned. All inputs that originate from a foreign source must be treated with a degree of scepticism and filtered to ensure that the values are not tainted. By performing data sanitation, your scripts and web applications should be highly resistant to misuse or exploitation from the following types of attack.

  • SQL Injection Attacks
  • Cross Site Scripting
  • Form Spoofing
  • Cross Site Request Forgery
  • Session Fixation

Blacklist and Whitelist filtering

The two common methodologies which can be used to filter input data are blacklisting and whitelisting. Blacklisting involves checking inputs for prohibited values, and whitelisting is accomplished by checking for allowed values. A good example of a blacklist approach is a profanity filter. There will be times when you need to use blacklisting, but in general you will want to use whitelisting to ensure that you receive values that are acceptable to the components of your application.

Client Side and Server Side filtering

We use client side filtering to make the application easy to use. By filtering the data on the client side you can ensure that the user does not have to resubmit a form to make a correction. Client side filtering is susceptible to input tainting and must be used in conjunction with server side filtering. Here are some suggested client side filtering methods to use when passing data to the server from an HTML form.

Client Side:

  • Design html forms using selects to limit value selection to acceptable values where appropriate by using pre-populated selects.
  • Use check boxes for yes, no, undecided situations.
  • Limit the text field lengths to the maximum value the various components of your script will allow.
  • Pass a script-generated token, also stored in the session, in a hidden field in the form to ensure that the user came from your site.
  • In general, when sending HTML output to the user from any script, use htmlentities() to escape the data you are sending. This will keep values from your script from being used out of context and corrupting your intentions, protecting you from mangled documents and, in the worst case, a cross site scripting attack.
  • Use javascript to validate the fields before the user is allowed to submit the data.

Once your script receives the data, you will need to filter it again to ensure that it is not tainted. Here is where we use the whitelisting method mentioned earlier. The following methods can be used to construct data validation functions that you can store in a file and require in each script you write, for ease of use.

Server Side:

  • Use functions such as isset and empty with conditional statements to ensure that all required variables for your script or function are set. If they are not, you may provide an arbitrary value if suitable, or exit the function or script.
  • Use built-in php functions such as the ctype or is_ family of functions with conditional statements to validate data types.
  • Use comparison operators (==, ===, !=, !==, <, <=, >, >=) to filter numerical values and string functions such as strlen, strcmp, strcasecmp, strpos, strstr, and strspn to filter string values.

Advanced methods to control your data inputs and insure their integrity:

  • Control the method by which your script receives data by limiting inputs directly to their source. Avoid using $_REQUEST, as it allows you to pull a copy of the input from $_POST, $_GET, and $_COOKIE without specifying where it came from. By using $_POST, $_GET, and $_COOKIE directly, you force the input to come from the source you specify and block any efforts to cross these methods up and feed your application tainted data.
  • If you are receiving the inputs from a form, ensure that you dynamically generate the form and include a hidden token that is stored in the session, passed back via the form, and validated. Dynamically generate the token using md5 or sha1 and a time-stamp concatenated with a hash string. Example: $_SESSION['Token'] = md5(time() . "YourHashString"). This will ensure that you are receiving the data from your form and help protect you from form spoofing and cross site request forgery attacks. Also mentioned above in client side filtering.
  • Filter output intended for your database with your database driver specific escape_string function or use prepared statements. This protects against sql injection attacks.
  • Use session_regenerate_id to ensure that the user’s session id is set by your script and not by an outside source. Do this every time your user changes authentication levels or your session accesses sensitive resources. This will protect against a session fixation attack. While this is not directly a sanitation topic, you do want to be sure that the data you are receiving is from your user and not a third party.
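A small sketch of the token technique from the list above (the function names and the secret string are examples; in a real script you would call session_start() before using $_SESSION):

```php
// Generate a token, store it in the session, and return it for embedding
// in a hidden form field.
function make_form_token($secret)
{
    $token = md5(time() . $secret);
    $_SESSION['Token'] = $token;
    return $token;
}

// On submission, compare the posted token against the stored one.
function check_form_token($submitted)
{
    return isset($_SESSION['Token']) && $submitted === $_SESSION['Token'];
}
```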


We’ve covered a lot of information in a short time. Feel free to bookmark this post and come back to it. Please comment if you feel I’ve skipped over anything or if you found this guide helpful. I’d like to publish this as a static page with examples in the near future.

– Joel

Passing PHP arrays between scripts using urls

Your quick and easy guide to passing php arrays via url

I spent more time than I care to mention on this problem, and I’d like to save you some time and hassle.

You will run into situations where passing variables via the $_SESSION really doesn’t make much sense. In cases where you will be running multiple instances of the same script on the same site, but passing them different data sets from the same user, it is easier to pass the array directly via the url, unless you want to monkey around with multiple data sets in the session. Here’s a quick example:

Script1 needs to pass the following array to script2. We do this by running the array through the serialize function and then running the product of that through the base64_encode function to package it. We then pass it to script2 and run it through the base64_decode and unserialize functions to unpackage it. We can also use this method to store arrays in a database and retrieve them.

An Example

Here’s some user preference example data generated by script1:

$UserArray['LikesCoffee'] = true; $UserArray['LikesDonuts'] = true; $UserArray['ListensToOpera'] = false; etc…

Now to pass this data to script2 we assemble the url:

$url = "" . base64_encode(serialize($UserArray));

You can use this url for any user event driven action such as a link, form submission, or ajax post. Once the data has been passed to script2

  1. Test for it: isset($_GET['UserArray'])
  2. Unpack it: $UserArray = unserialize(base64_decode($_GET['UserArray']));
  3. Sanitize it: data passed in this manner is not really secure.
  4. This type of data packing is also great for storing arrays in your database.
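Putting the round trip together (the URL is a placeholder; note that I also urlencode the packed string, since base64 output can contain + and = characters that have special meaning in urls):

```php
// script1: serialize, base64-encode, then urlencode the array for the url.
$UserArray = array('LikesCoffee' => true, 'LikesDonuts' => true, 'ListensToOpera' => false);
$packed = urlencode(base64_encode(serialize($UserArray)));
$url = 'http://example.com/script2.php?UserArray=' . $packed;

// script2: PHP urldecodes $_GET values automatically, so the script just
// base64-decodes and unserializes. $received stands in for $_GET['UserArray'].
$received = urldecode($packed);
$unpacked = unserialize(base64_decode($received));
```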


This method is quick and easy and can be used for routine data such as visitor preferences. Be forewarned that most servers and browsers have limits on the url length that they will accept, and base64 encoding increases the size of your data by about 33%. Just keep it under 2000 characters and you’ll be ok. I wouldn’t suggest using this method for user authentication or credential passing without further protecting the data with encryption, but that’s another topic.

Comments are appreciated.

– Joel