How to use Behat for BDD in Windows with Selenium in Docker

A few definitions before diving in (the title is a mouthful):

  • BDD = Behavior Driven Development, a more top-down approach than traditional Test Driven Development (TDD)
  • Behat = a tool for implementing BDD, written in PHP
  • Selenium = an open source browser automation tool, which Behat can drive through the Mink extension
  • Docker = an open source container platform, a lighter-weight alternative to a full virtual machine for running other environments
  • XAMPP = open source web hosting software for any (X) operating system, containing Apache, MySQL (switched to MariaDB now, but at least it still starts with M), Perl, and PHP

Nobody wants to be surprised by bugs after launch and manual testing is a painfully tedious process. Fully automated testing of web applications is like the holy grail. Unit testing can be useful, but ideally integration and even performance and user testing are automated as much as possible. I was excited to learn about BDD as a top down approach to test development as opposed to traditional TDD which is more of a bottom up approach.

I found Behat while looking for a way to do BDD in PHP. I first tried following the Behat quick start, but I had to modify some steps. First, I already had Composer installed on my PC (via the Windows installer), so I had to change step 1 from:

php composer.phar require --dev behat/behat  #maybe this works in linux but not on my pc

to:

#make sure composer is in your path so this will find it - the windows installer does that for you
composer require --dev behat/behat

Wherever I saw “php composer.phar” in the quick start, I replaced it with “composer” and it worked for me.

The basic quick start example got me started with Behat, and I found the PHPUnit appendix on assertions helpful for creating my own tests. But I wanted to control a web application, so I then reviewed the documentation on using Mink with Behat.
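To make this concrete, here is a minimal sketch of the kind of feature file Behat runs (the feature and steps here are hypothetical examples, not from the quick start; the Given/When/Then steps shown are ones MinkContext provides once the Mink extension is configured):

```gherkin
# features/search.feature - a hypothetical example
Feature: Product search
  In order to find products quickly
  As a site visitor
  I need to be able to search the catalog

  Scenario: Searching for an existing product
    Given I am on "/search"
    When I fill in "query" with "widgets"
    And I press "Search"
    Then I should see "Results for widgets"
```

Each step maps to a method in a context class; MinkContext supplies these browser-interaction steps so you only write custom ones for application-specific logic.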

I was able to get Behat's feature/scenario driven testing working with the Goutte driver. However, for some reason (actually several errors I could not get past easily) I could not get Selenium to connect properly on my PC using the latest jar file with the Chrome or Firefox webdrivers, even after trying several suggestions on Stack Overflow from people with similar problems. That meant I could only test using the headless driver. I needed both browser and headless browser testing, because there are always JavaScript events and actions occurring on the page, and the root of the problem seemed to be that I was running Windows locally instead of Linux.

While researching how to get the chrome driver to work with selenium I saw a mention of using docker to avoid environment and software version mismatch issues. That led me to an article explaining how to use docker with selenium. This article was very valuable for me (I just followed the windows instructions, ignoring the ubuntu steps), as it gave me a way to get past the confusing driver downloads and behat configuration to try and get them working with selenium in order to test a website with Chrome or Firefox.
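For reference, the Docker side boils down to running one of the Selenium images with the right ports mapped; a minimal sketch (the 4445 mapping follows the article I used, and the -debug image variant is the one that includes the VNC server you connect to with TightVNC):

```shell
# WebDriver listens on 4444 inside the container; map it to 4445 on the host
# port 5900 is the VNC server, for watching the browser run
docker run -d -p 4445:4444 -p 5900:5900 selenium/standalone-chrome-debug
```

With that container running, Behat's selenium2 driver just needs its wd_host pointed at port 4445 on the docker machine's IP.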

In order to control the browser in the Docker container (and watch it using TightVNC, as mentioned in the Docker article) I set up the following in the MinkExtension section of my behat.yml file (port 4445 is mapped into Docker per the article's instructions, and the IP comes from the docker-machine ip command mentioned in the article):

    wd_host: 'http://<docker-machine-ip>:4445/wd/hub'  #placeholder: substitute the output of docker-machine ip

The next challenge was how to connect to localhost (I do my testing in XAMPP on my PC first, because that's where I develop) from within Docker. Using ipconfig on my PC I found the internal IPv4 address for my machine on my wifi network (if you are hard wired, look for the ethernet adapter's address instead). Then I updated this line in my behat.yml file to use it for testing within Docker:

base_url: http://<your-wifi-ipv4>/dsdistribution/  #placeholder: the wifi ipv4 address from ipconfig

My full behat.yml file then became this (I initially had SSL connection issues with Goutte, so I had to turn off certificate verification):

default:
    extensions:
        Behat\MinkExtension:
            browser_name: chrome  #I want to test with chrome for now, I'm using that docker image
#            base_url: http://<some-external-site>/  #this is an example of an external address to test behat with
#            base_url: http://localhost/dsdistribution/  #this doesn't work from within docker container
            base_url: http://<your-wifi-ipv4>/dsdistribution/  #this ip is the wifi ipv4 address from ipconfig on my PC
            goutte:
                guzzle_parameters:
                    verify: false  #turn off ssl certificate verification for goutte
            selenium2:
                wd_host: 'http://<docker-machine-ip>:4445/wd/hub'  #placeholder for the docker-machine ip

The dsdistribution folder is the webroot of the new application I was developing in my local environment.

My next challenge was to connect to the database on localhost from within Docker. I first updated the host in my connection settings to my PC's LAN IP, but then my root user (which has no password) did not have permission to connect. I found a recommendation to uncomment a line in c:/xampp/mysql/bin/my.ini and set it to allow remote connections:

# Change here for bind listening
bind-address=  #this allows any address to connect to mysql, only safe if network is internal

I also confirmed that my public IP does not have port 3306 open, using an online port forwarding tester; closed ports are standard for home networks, but it's good to check anyway.

The new bind address alone did not fix my connection to the database on localhost from within Docker (even after restarting Apache, MySQL, and XAMPP itself). The mysql.user table had host-specific rows for the root user, so access from the ‘remote’ Docker host was being denied. The next step was to create a new user via the mysql command line interface (after logging in as the admin user, which I named root), like this:

CREATE USER 'behat'@'%' IDENTIFIED BY 'behat';
GRANT ALL PRIVILEGES ON *.* TO 'behat'@'%';

I planned to use this new behat user for testing all my applications locally so I gave it all privileges on every database. Once I followed that last step I just had to update my db connection file in my web application to have a clause like this:

if ($_SERVER['HTTP_HOST']=='<your-wifi-ipv4>'){ //placeholder: your PC's LAN IP
    $servername = '<your-wifi-ipv4>';
    $username = 'behat';
    $password = 'behat';
}

Then I was able to open TightVNC (as directed in the Docker article) and watch my Behat test scenarios execute after running a command like this in a command prompt (run from the directory where I installed Behat, pointing at a specific feature file to test):

c:\xampp\htdocs\behat>vendor\bin\behat features\productType.feature

I was doing this with @javascript added above my rather large test scenario (which logs in and interacts with a form). I don't think Goutte will work within Docker (at least not with any image I have found), so I'll need to change the base_url parameter back to localhost for headless testing to work again. I don't yet know whether base_url can be set within a driver section; I will update this post when I figure that out, but please leave a comment if you already know.

How to Add a Form in a WordPress Post

This is a simple step by step guide to creating a form, from within a WordPress post, that captures data and stores it in a MySQL database on your server.

Note: the example below is a live, working form on this page

Step 1:

Create a simple form for a petition, contact request, or registration:


Here is the html code for it:

<table>
<tr><th>Name:</th><td><input id="userName" /></td></tr>
<tr><th>Address:</th><td><input id="userAddress" /></td></tr>
<tr><th>City:</th><td><input id="userCity" /></td></tr>
<tr><th>State:</th><td><input id="userState" /></td></tr>
<tr><th>Zip:</th><td><input id="userZip" /></td></tr>
<tr><th>Email:</th><td><input id="userEmail" /></td></tr>
<tr><th>Comment:</th><td><textarea id="userComment"></textarea></td></tr>
<tr><td colspan="2"><button type="button" onclick="saveUser();">Save</button></td></tr>
</table>

Step 2:

Create a JavaScript function to process the form and send it to the server. The following function was added to the page using the CSS & Javascript Toolbox plugin (it uses jQuery because WordPress already has that available):

//Process the form and save the data record
function saveUser() {
  //First gather the form parameters and make sure name and email at least are populated
  var name = jQuery("#userName").val();
  var address = jQuery("#userAddress").val();
  var city = jQuery("#userCity").val();
  var state = jQuery("#userState").val();
  var zip = jQuery("#userZip").val();
  var email = jQuery("#userEmail").val();
  var comment = jQuery("#userComment").val();
  if (name.length<1 || email.length<1) {
    alert("Please at least enter your name and email.");
    return false;
  } else {
    //Now send the data to a server side function to really validate it and save it.
    jQuery.ajax({
      type: "POST",
      url: "/ajax/saveUser.php",
      //note: these keys match what saveUser.php reads from $_POST
      data: { userName:name, address:address, city:city, state:state, zip:zip, email:email, comment:comment }
    }).done(function( results ) {
      if(results.length<1){ // network error
        alert("There was a network error, please try again or contact support and tell them what you are trying to do.");
      } else { // this is a successful interaction
        var resultObj = jQuery.parseJSON(results);
        if (resultObj.errorMsg.length>0) {
          alert(resultObj.errorMsg); //show the server side validation error
        } else {
    	  //Record save successful
          alert("Thanks for your information, it was saved successfully!");
          //Show the user what they have entered (userList is the id of the results table body):
          jQuery("#userList").html(resultObj.userList);
        }
      }
    });
  }
}

Step 3:

Make sure you have a database table ready to store the information. Below is a simple table used to store the info in this example:

CREATE TABLE `user_info` (
  `userID` int(11) NOT NULL AUTO_INCREMENT,
  `userName` varchar(80) DEFAULT NULL,
  `address` varchar(100) DEFAULT NULL,
  `city` varchar(80) DEFAULT NULL,
  `state` varchar(40) DEFAULT NULL,
  `zip` varchar(5) DEFAULT NULL,
  `email` varchar(80) DEFAULT NULL,
  `comment` text,
  `userIP` varchar(40) DEFAULT NULL,
  PRIMARY KEY (`userID`),
  KEY `userIP` (`userIP`)
);

Step 4:

Create the server side code to accept the form data, really validate it, and put it in the database or send back an error. Below is server side code which does all that for the example data above. Since this form is “in the wild”, I'm capturing the IP address because I will only show people results that they entered themselves:

<?php
/*
 * File: saveUser.php
 * Description: This file takes the passed in user information and validates it before
 * 	saving it to the database and returning content to show on the page. 
 * Inputs (in POST): 
 * 	userName, address, city, state, zip, email, comment
 * Outputs:
 * 	json containing either an error message or the list of records entered from this IP
 */
include("db.php"); // define your database credentials in another file
$debug = false; // set to true to echo the queries being run

#connect to the database
$db = mysqli_connect($servername, $username, $password, $dbname);
mysqli_select_db($db,$dbname) or die('Could not select database');

#Get the passed in parameters into local variables, escaped for database input
$userName = empty($_POST['userName'])?'':addslashes($_POST['userName']);
$address = empty($_POST['address'])?'':addslashes($_POST['address']);
$city = empty($_POST['city'])?'':addslashes($_POST['city']);
$state = empty($_POST['state'])?'':addslashes($_POST['state']);
//only accept 5 numbers for zip
$zip = empty($_POST['zip'])?'':substr(preg_replace('/\D/','',$_POST['zip']),0,5);
$email = empty($_POST['email'])?'':addslashes($_POST['email']);
$comment = empty($_POST['comment'])?'':addslashes($_POST['comment']);
$userIP = $_SERVER['REMOTE_ADDR']; //capture the visitor's IP so we only show them their own records

#This is an array used for gathering all the outputs of this file
$jsonObj = array();
$jsonObj['errorMsg'] = "";
$jsonObj['userList'] = "";

#Validate inputs
if (empty($userName) or empty($email)) {
	$jsonObj['errorMsg'] = "Please at least enter your name and email.";
} else if (strpos($email,'@')===false) {
	//there are many more validations that can be made for emails but this is a start
	$jsonObj['errorMsg'] = "Please enter a valid email.";
}

#Enter the data record
if (empty($jsonObj['errorMsg'])) {	
	$sql = "insert into user_info (userName,address,city,state,zip,email,comment,userIP) 
		values ('$userName','$address','$city','$state','$zip','$email','$comment','$userIP')";
	if ($debug) echo "about to run $sql ";
	mysqli_query($db, $sql); $err = mysqli_error($db); if (!empty($err)) $jsonObj['errorMsg'] = $err;
}

#Now get the list of data entered by this IP, without any script tags to prevent XSS:
if (empty($jsonObj['errorMsg'])) {	
	$sql = "select userName,city,state from user_info where userIP='".$userIP."'";
	$rs = mysqli_query($db, $sql); $err = mysqli_error($db); if (!empty($err)) $jsonObj['errorMsg'] = $err;
	while($row = mysqli_fetch_assoc($rs)) {
		$jsonObj['userList'] .= "<tr><td>".strip_tags($row['userName'])."</td><td>".strip_tags($row['city']).
			"</td><td>".strip_tags($row['state'])."</td></tr>";
	}
}

#Now send back the data in json format
echo json_encode($jsonObj);

Step 5:

Do something with the results! You could export them, feed them into another system using an API, send them in an email to someone, or just display them like so:

Name City State
You have not entered any user records yet, try it out!
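For completeness, the display above can be a plain HTML table shell that the JavaScript from Step 2 fills with the userList rows returned by the server (the userList id is a hypothetical hook; the original page's display markup was not shown):

```html
<table>
  <thead><tr><th>Name</th><th>City</th><th>State</th></tr></thead>
  <tbody id="userList">
    <tr><td colspan="3">You have not entered any user records yet, try it out!</td></tr>
  </tbody>
</table>
```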

Google Maps API Version 3 Geocoding Example

Today I realized that my location markers were not being displayed in any of the Google Maps implementations I had put in place online. After digging into it (trying to figure out what a code 610 response meant), I realized I was using version 2 of the Google Maps API for retrieving latitude and longitude for a given address (i.e. “geocoding”).

My actual maps were displayed using version 3 of the API, but the geocoding example I found a couple of years ago and followed used version 2, which was deprecated and due to lose support sometime in 2013 (I've heard both March and September, but since my code stopped working in April, I suppose the March date was more accurate).

Since I wasn’t able to find an example for just what I needed (getting the latitude and longitude based on an address, so I could show markers on a google map) I read through the docs and updated my code.  See below for my implementation (I use this as an included file wherever I need to pull latitude/longitude, it could be put into a function if you want). The address, city, state, and zip should be stored in the $address, $city, $state, and $zip params, and I am also saving the latitude and longitude in both a local database and a markers array to be used in a google map later:

/*
 * File: incGetGeoCodes.php
 * Description: this file pulls the lat/long coordinates using Version 3 of the google maps api
 * 	The key parameter caused it to fail so I removed that and it works, but we may be under a 
 * 	lower daily usage limit because of that
 */
// Initialize delay in geocode speed
$delay = 0;
$error = '';
$base_url = "";

$geoaddress = $address . ", " . $city . ", " . $state . " " . $zip;
$request_url = $base_url . "&address=" . urlencode($geoaddress);
usleep($delay); //wait if we have been geocoding too fast
$resultArr = json_decode(file_get_contents($request_url), true);

$status = $resultArr['status'];
if (strcmp($status, "OK") == 0) {
	// Successful geocode
	$markers[$shop_id]['lat'] = $resultArr['results'][0]['geometry']['location']['lat'];
	$markers[$shop_id]['long'] = $resultArr['results'][0]['geometry']['location']['lng'];
	$markers[$shop_id]['html'] = $name."<br>".$address."<br>".$city." ".$state." ".$zip."<br>".$phone;
	#Now update the db so we don't have to pull this again
	$query = "update entities ".
		"set shop_latitude=".addslashes($markers[$shop_id]['lat']).", ".
		"shop_longitude=".addslashes($markers[$shop_id]['long'])." ".
		"where EntityID=".intval($EntityID);
	mysql_query($query); $err=mysql_error(); if (!empty($err)) echo "query:$query, error: ".$err."<br>";
} else if (strcmp($status, "OVER_QUERY_LIMIT") == 0) {
	// sent geocodes too fast (the v2 equivalent was the numeric status 620)
	$delay += 100000;
} else {
	// failure to geocode
	$error .= urlencode("Address " . $geoaddress . " failed to be geocoded. ");
	$error .= urlencode("Received status " . $status . "%0D%0A");
}

I was surprised to see no mention of an API key parameter; if someone knows whether that is an option (to increase the daily quota of geocoding API calls), please leave a comment and let me know. I'm just happy to get it working again for now. 🙂

How to Easily Prepare Database Import Using Perl

I'm revisiting this topic because in my last post about translating Excel data into a text file for database import, I used a PHP script and still had some manual steps (the one that bugged me most was adding custom line termination characters for multiline data). I mentioned that, with more time, I could use a Perl script to do the whole process. Well, the next set of data I needed to import was almost 700 records spanning nearly ten times as many lines in the tab delimited text file I exported from Excel (due to the description fields containing newline characters). So here's my latest solution:

  1. First I added a numeric column in the left-most position (within Excel) and filled it with incremental integers.
  2. Then I exported the spreadsheet as tab delimited text.
  3. Then I wrote a Perl script to loop through the file and add line termination characters before each new record; this time I had to use double equal signs (==) because the data contained single equal signs.
  4. I also found a Perl module similar to the PHP htmlentities() function which actually worked for me, as opposed to the PHP function, which I had abandoned in favor of hard-coded string replacement commands.
  5. Then I ran the Perl script on my tab delimited text file to create a file with double equal signs as the line terminator and all the funny characters encoded so they would show up correctly in a browser.
  6. Finally I imported my file using the load data infile option in the phpMyAdmin interface, specifying the delimiter and line termination as \t and == respectively.
  7. Then I looked at a webpage which pulled that data, and it looked just like it did in the Excel file!
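Step 6 above, written out as the statement phpMyAdmin builds from those selections (the table and file names here are hypothetical):

```sql
LOAD DATA LOCAL INFILE 'outputFileName.txt' INTO TABLE products
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '==';
```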

Here is the code for the perl script I created:

# Filename: - this script will take a list of data 
#	including multiline data where the first field is a line count.
#	It will add == before each line count to enable mysql importing and 
#	will encode html entities and convert newline characters to <br>
use HTML::Entities;

my $input = $ARGV[0]; #this is the file to parse
my $output = $ARGV[1]; #this is the file to create
my $lineTerminator = "==";
my $count=1;
my $line = '';
open (IN,$input) or die "can't open file $input\n";  
open (OUT, ">$output") or die "Can't open outfile : $output\n";
while (<IN>) {	# each line of the current input file
	$line = encode_entities($_); #This is the function which encodes html symbols
	$line =~ s/\n$/<br>/; #convert the trailing newline to <br> so it displays in a browser
	# Check if this is the beginning of a new record.  If so then add the line terminator
	if ($line =~ /^$count\t/) {
		$line = $lineTerminator.$line;
		$count++; #now look for the next record number
	}
	print OUT $line;
}
close IN;
close OUT;
$count--; #count was incremented one past the last record
print "there were $count records processed.\n";

The syntax for running the script (assuming Perl is installed on your system – I use ActivePerl on my PC, which is free; is a hypothetical name for the script):

 inputFileName.txt outputFileName.txt

How to Export One Table Structure in Mysql and Infobright

I have a database whose tables have over 300 fields each. I wanted to create a new table like the others, but I didn't have the table structure saved in a file. Due to the number of fields it was impractical to copy the screen output of a “show create table tablename” command in that database, so I had to find another way.

I found that I could use the mysqldump command to export just the table structure, and this command would probably have worked if my database engine were regular MySQL:

# /usr/local/infobright/bin/mysqldump -d -u adminUserName -p'adminPassword' dbName tableName > file.sql
mysqldump: Got error: 1031: Table storage engine for 'BRIGHTHOUSE' doesn't have this option when doing LOCK TABLES

So I looked up how to avoid the lock tables error and found the --single-transaction flag, which made it work in Infobright (which is based on MySQL but has some differences):

# /usr/local/infobright/bin/mysqldump -d -u adminUserName -p'adminPassword' --single-transaction dbName tableName > file.sql

How to Convert Excel Spreadsheets to SQL Statements

Last week I told a client he could send me an Excel file and I could import it into the MySQL database directly. When I got the file, I learned it wasn't quite that easy. I had planned to just save it as tab delimited text or CSV, but there was a “description” field whose contents had been copied from various webpages (by copying the rendered text in the browser window).

Within the description field there were trademark symbols, copyright symbols, slanted quotes, triple dots, non-standard dash marks, bullets, and even a cross symbol (or dagger) used for footnotes. In addition, there were several paragraphs (including multiple line breaks) in each description field.

The first challenge was getting the database to handle line breaks properly so I could import the data. I figured that if I added a special character before each new record, I could make the database ignore the newline characters. When I tried a pipe (|) and then three pipes (|||), neither worked (I only got one record imported each time). It finally worked when I added an equal sign (=) before each new record (after checking to make sure there were no equal signs in the text – if there had been, escaping them with a backslash and telling MySQL that is the escape character should have worked).

In order to insert the equal signs, I first saved the Excel sheet as tab delimited text, then opened it in Notepad++ and used its regular expression search and replace tool to do the insertions quickly. Since I had numbers in the first field of my data, I searched for ^(\d) and replaced it with =\1. If I had wanted to spend more time on it, I could have written a Perl script to go straight from the .txt file to SQL statements, but I already had a PHP file to create SQL statements so I wanted to use that.

Once I had the equal signs in place, I used a load data infile command (actually created by phpMyAdmin, I just had to make the selections – the local keyword was necessary for permission requirements) like this:


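Based on that description, the generated command would have looked roughly like this sketch (the table and file names are hypothetical; treating the equal sign as the line terminator is what lets the embedded newlines pass through as field data):

```sql
LOAD DATA LOCAL INFILE 'C:/path/to/data.txt' INTO TABLE my_table
  FIELDS TERMINATED BY '\t'
  LINES TERMINATED BY '=';
```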
I was happy to get the lines imported, but the line breaks were lost and special characters showed up as question marks on the webpage where I displayed the data. So I went back to my PHP script for going from Access to MySQL from a few years ago and made a few modifications. I first tried using the PHP htmlentities() function to convert the symbols back to HTML code, but for some reason that didn't work for me. So through an iterative process I added replace commands for each symbol I found. I wasn't trying to convert every symbol, just to preserve the essence of the text. If you want to convert all the symbols you find, you can look at the w3schools symbol reference page for a list – there is also an entity list at w3schools, if you don't find your symbol in the first list. I also wanted to preserve the newlines, so I used the PHP function nl2br() and that worked great. Here is the final code I added for processing each field:

$data_value = trim($data_value);
$data_value = trim($data_value,'"');
if (empty($data_value)) continue;
$data_value = str_replace('™','&trade;',$data_value);
$data_value = str_replace('®','&reg;',$data_value);
$data_value = str_replace('’','\'',$data_value);
$data_value = str_replace('…','...',$data_value);
$data_value = str_replace('•','-',$data_value);
$data_value = str_replace('–','-',$data_value);
$data_value = str_replace('†','*',$data_value);
$data_value = str_replace('“','"',$data_value);
$data_value = str_replace('”','"',$data_value);
$data_value = str_replace('—','-',$data_value);
$data_value = nl2br($data_value);

I also couldn't just copy the SQL queries from the webpage, because all the HTML code was converted back to symbols, so I added a line to write the queries directly to a local file:


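The line I added was along these lines (a sketch: queries.txt is the file named below, but the exact call is an assumption – fwrite on an opened handle works just as well):

```php
<?php
// append each generated SQL statement to a local queries.txt file
$query = "INSERT INTO my_table (description) VALUES ('example');"; // stand-in for a generated query
file_put_contents('queries.txt', $query . "\n", FILE_APPEND);
```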
Then I could open the queries.txt file and have a bunch of SQL queries to run to populate my database table in MySQL. My updated text-to-SQL PHP script can be downloaded and used freely.

Update: I put the time into creating a perl script to make this easier, so take a look at that if the above process is too much work for you. 🙂

How to add special shipping by product in Zen Cart

originally posted at
I am in the process of migrating a static site ( to a database driven site “powered by” Zen Cart. One of the features requested was the ability to add special shipping charges for odd shaped items, like posters.

After doing some research I discovered a page with instructions, but it took me a while to figure out how to implement them for my specific situation. After reading the instructions a few times, I realized that I had to create an extra read-only text attribute to make my “special shipping” attribute be selected (as a single radio button) by default. Here's what it looks like in the admin panel:

zencart attribute controller legend
zencart attribute controller, showing the options to add a required attribute to a product

And here’s what it looks like on the product page (the radio button is selected by default now):

zencart shipping example

Though I just realized that the order is the opposite of what I want, even though I specifically set the order as noted above. Does anyone know why I can’t set the order in this way?

Update: I found a thread which helped me get the attributes sorted at The specific change I made which worked was to set the sort order for the attribute names, under Catalog -> Option Name Manager (they were all set to sort order 0 on that page before).

How to import and export large datasets in mysql

originally posted at
I've seen many questions around importing and exporting large tables and/or databases in MySQL, so I'd like to share how I've learned to manage those processes. Using phpMyAdmin for these tasks is not suitable, because web servers are not designed for running large scripts or uploading/downloading large files. You can change the limits in phpMyAdmin, but there are several different limitations in that environment which you may run into.

For exporting large databases you can use the mysqldump command from a prompt (unix shell or DOS prompt). For exporting large tables, I login to mysql and use a query with the syntax "SELECT column1name, column2name, column3name, ... FROM myDatabaseName.myTableName INTO OUTFILE '/path/to/outfilename.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';" – this gives you a csv file in the location you specify with the fields surrounded by single quotes, separated by commas, and escaped by a backslash. The only thing that needs to be escaped in this context is a single quote.

For importing large databases created from mysqldump or phpmyadmin, I save the sql dump file into a local directory, cd to that directory (in unix shell or windows dos), login to mysql from the command line, then type the following command: source dbdump.sql (or whatever the dump file name is with your db data).

For importing large tables that were exported using the export syntax I mentioned above, I login to mysql and then type something like "LOAD DATA INFILE '/path/to/outfile.csv' INTO TABLE myDatabaseName.myTableName FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';"
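For copy-paste convenience, the export/import pair above as plain statements (the database, table, and path names are the same placeholders used in the text):

```sql
-- export a large table to CSV
SELECT column1name, column2name, column3name
  FROM myDatabaseName.myTableName
  INTO OUTFILE '/path/to/outfilename.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';

-- import it back into a table with the same structure
LOAD DATA INFILE '/path/to/outfilename.csv'
  INTO TABLE myDatabaseName.myTableName
  FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';
```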

Free MDB file to MySQL Conversion

originally posted at
This week I created a pretty useful tool I’d like to share. I was given an MS Access file (.mdb) and I needed a way to transfer the structure and the data into a MySQL database. I don’t have MS Access and I have no plans to purchase it, so I did a lot of searching to find free tools for getting at the data.

I found a few ways that worked OK: two that did what I needed, and one that I liked best. The one I liked best was MDB Viewer Plus, an exe file that doesn't need to be installed – you can get the latest version from the program's home page. I originally got the 1.5.1 version from another site found through Google, but later discovered that I couldn't open an mdb file that required a password using that version. Once I went to the source of the program and got the 1.5.3 version, the password authentication worked great. The other program that worked fairly well was Open Office Base, but exporting the data proved a bit tricky, as I had to first create reports that queried the tables before I could save them as text. I liked MDB Viewer Plus better because exporting a table is very easy: click the tab with the name of the table you want to view, then click the “Export Table” button to export it in one of nine different formats.

So once I had the tables exported from the mdb file into text files, I looked around for an easy way to get them into MySQL. Unfortunately phpMyAdmin requires more than just field delimiters to import CSV files; some character also has to “enclose” every field. I did some more searching and found a simple PHP script posted on the codewalkers site that converted a type of CSV file into a bunch of SQL insert queries. This looked promising, so I modified it to ignore empty fields, convert dates into valid MySQL date format, and take tab delimited text as the input instead of semicolon delimited fields (I don't know where you would get that kind of CSV file anyway), and I added a table creation query with a user definable field type. I had a table with over 100 fields in it that I didn't want to type out, so I used this script to generate the table creation query as well as the insert statements.

Below you will see a screenshot of the script with a small example tab delimited text file that I pasted into the main input box:
Example usage of the conversion script
Download the zipped php file here

So to summarize, here are the steps to follow for this MDB file to MySQL transfer or migration:

  1. Use one of the free programs mentioned above, like MDB Viewer Plus, to open the Access file and see what's in it.
  2. Save each table as a TXT file (which happens to be tab delimited when you use MDB Viewer Plus).
  3. Download the PHP file I put together and shared above, and put it in the webroot of either a web hosting account you have on a remote server (like Blue Host, which I use for this site) or your local webserver root. If you are interested in installing a free webserver easily, you can try XAMPP, which I use on my PC.
  4. For each table you want to add to MySQL, type in the name and default field type (the type most of your fields will be, so you have to manually change as few of them as possible afterwards), and paste the tab delimited text of your table (with column names on the first line) into the big input box.
  5. Push the button at the bottom, then use the first generated SQL statement to create your table.
  6. Change the field types that don't match the default (like an autoincrement integer field for the index, or a few date fields).
  7. Then run the rest of the SQL statements to populate your table, and you're done. :)

Update: I just figured out how to do something that makes the whole process much easier for Windows users. Using PHP you can connect to just about any database that exists and query it directly – and an Access .mdb file counts as a database. So here's the code to connect to a local mdb file using PHP on a Windows machine:

$db_connection = new COM("ADODB.Connection", NULL, 1251);
$db_connstr = "DRIVER={Microsoft Access Driver (*.mdb)};" .
	" DBQ=C:\Path to mdb file\MyAccessFileName.mdb; Password=MyPasswordHere; DefaultDir=C:\DataDir";
$db_connection->Open($db_connstr); //open the connection before querying
$rs = $db_connection->execute("SELECT * FROM TableName");
$rs_fld0 = $rs->Fields(0);
$rs_fld1 = $rs->Fields(1);
while (!$rs->EOF) { // loop through the result set from the mdb file
	$field1Value = $rs_fld0->value;
	$field2Value = $rs_fld1->value;
	// Do something with the data here, like put it into mysql, put it into a file, or print it to the screen
	echo "the first field value is $field1Value, the second field value is $field2Value<br>";
	$rs->MoveNext(); //advance to the next record
}
$rs->Close();

You can read more about it in the PHP manual.