Free MDB file to MySQL Conversion

originally posted at
This week I created a pretty useful tool I’d like to share. I was given an MS Access file (.mdb) and I needed a way to transfer the structure and the data into a MySQL database. I don’t have MS Access and I have no plans to purchase it, so I did a lot of searching to find free tools for getting at the data.

I found a few approaches that worked OK, two that did what I needed, and one that I liked best. The one I liked best was MDB Viewer Plus, a standalone exe that doesn’t need to be installed; you can get the latest version from the program’s own site. I originally got the 1.5.1 version from another site found through Google, but later discovered that it couldn’t open an mdb file that required a password. Once I went to the source of the program and got the 1.5.3 version, the password authentication worked great. The other program that worked fairly well was OpenOffice Base, but exporting the data proved a bit tricky, as I first had to create reports that queried the tables before I could save them as text. I liked MDB Viewer Plus better because exporting a table is very easy: click the tab with the name of the table you want to view, then click the “Export Table” button to export it in one of nine different formats.

Once I had the tables exported from the mdb file into text files, I looked around for an easy way to get them into MySQL. Unfortunately, phpMyAdmin requires more than just field delimiters to import CSV files; some character also has to “enclose” every field. Some more searching turned up a simple PHP script posted on the Codewalkers site that converted a semicolon-delimited CSV file into a series of SQL INSERT queries. This looked promising, so I modified it to ignore empty fields, convert dates into valid MySQL date format, and take tab-delimited text as input instead of semicolon-delimited fields (I don’t know where you would get that kind of CSV file anyway), and I added a table-creation query with a user-definable default field type. I had a table with over 100 fields in it that I didn’t want to type out, so this was the script I used to generate the table-creation query as well as the INSERT statements.
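For anyone who wants the gist of that conversion without downloading the script, here is a minimal sketch of the idea. The function and parameter names are my own (not the original script’s), and the date handling assumes mm/dd/yyyy values like Access typically exports:

```php
<?php
// Sketch only: names and details are illustrative, not the original script.

// Convert an mm/dd/yyyy date (as Access often exports) to yyyy-mm-dd.
function to_mysql_date($value) {
    if (preg_match('#^(\d{1,2})/(\d{1,2})/(\d{4})$#', $value, $m)) {
        return sprintf('%04d-%02d-%02d', $m[3], $m[1], $m[2]);
    }
    return $value; // not a date, pass it through unchanged
}

// Turn tab-delimited text (column names on the first line) into a
// CREATE TABLE statement plus one INSERT per data row.
function tab_text_to_sql($text, $table, $default_type = 'VARCHAR(255)') {
    $lines  = array_filter(explode("\n", trim($text)), 'strlen');
    $header = explode("\t", array_shift($lines));

    // Every column gets the user-definable default type; odd ones
    // (auto-increment ids, dates) get fixed by hand afterwards.
    $cols = array_map(function ($c) use ($default_type) {
        return "`" . trim($c) . "` $default_type";
    }, $header);
    $sql = array("CREATE TABLE `$table` (" . implode(', ', $cols) . ");");

    foreach ($lines as $line) {
        $fields = explode("\t", $line);
        $names  = $values = array();
        foreach ($fields as $i => $value) {
            $value = trim($value);
            if ($value === '') continue; // ignore empty fields
            $names[]  = "`" . trim($header[$i]) . "`";
            $values[] = "'" . addslashes(to_mysql_date($value)) . "'";
        }
        $sql[] = "INSERT INTO `$table` (" . implode(', ', $names) .
                 ") VALUES (" . implode(', ', $values) . ");";
    }
    return $sql;
}
```

For example, `tab_text_to_sql("id\tname\tjoined\n1\tAlice\t3/14/2008", 'people')` returns a CREATE TABLE statement followed by one INSERT with the date rewritten as 2008-03-14.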

Below you will see a screenshot of the script with a small example tab delimited text file that I pasted into the main input box:
Example usage of the conversion script
Download the zipped php file here

To summarize, here are the steps to follow for this MDB file to MySQL transfer or migration:

  1. Use one of the free programs mentioned above, like MDB Viewer Plus, to open the Access file and see what’s in it.
  2. Save each table as a TXT file (which happens to be tab-delimited when you use MDB Viewer Plus).
  3. Download the PHP file I put together and shared above, and put it in the webroot of either a web hosting account on a remote server (like Blue Host, which I use for this site) or your local webserver. If you want to install a free webserver easily, you can try XAMPP, which I use on my PC.
  4. For each table you want to add to MySQL, type in the name and the default field type (the type that most of your fields will be, so you have to manually change as few of them as possible afterwards), and paste the tab-delimited text of your table (with column names on the first line) into the big input box.
  5. Push the button at the bottom, then use the first generated SQL statement to create your table.
  6. Change the field types that don’t match the default (like an auto-increment integer field for the index, or a few date fields).
  7. Run the rest of the SQL statements to populate your table, and you’re done. :)

Update: I just figured out how to do something that makes the whole process much easier for Windows users. Using PHP you can connect to just about any database that exists and query it directly, and an Access .mdb file counts as a database. So here’s the code to connect to a local mdb file using PHP on a Windows machine:

$db_connection = new COM("ADODB.Connection", NULL, 1251); // 1251 is the code page
$db_connstr = "DRIVER={Microsoft Access Driver (*.mdb)};" .
    " DBQ=C:\\Path to mdb file\\MyAccessFileName.mdb; Password=MyPasswordHere; DefaultDir=C:\\DataDir";
$db_connection->Open($db_connstr); // the connection must be opened before querying
$rs = $db_connection->Execute("SELECT * FROM TableName");
$rs_fld0 = $rs->Fields(0);
$rs_fld1 = $rs->Fields(1);
while (!$rs->EOF) { // loop through the result set from the mdb file
    $field1Value = $rs_fld0->value;
    $field2Value = $rs_fld1->value;
    // Do something with the data here, like put it into MySQL, write it to a file, or print it to the screen
    echo "the first field value is $field1Value, the second field value is $field2Value<br>";
    $rs->MoveNext(); // advance to the next record, or the loop never ends
}
$rs->Close();
$db_connection->Close();

You can read more about the COM class in the PHP manual.
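Inside that loop, “do something with the data” could mean building an INSERT statement for MySQL. Here is a minimal sketch; the helper name is my own, and for production use you would want prepared statements rather than string escaping:

```php
<?php
// Build an INSERT statement from an associative array of column => value.
// Sketch only: for real use, prefer mysqli/PDO prepared statements.
function row_to_insert($table, array $row) {
    $cols = array();
    $vals = array();
    foreach ($row as $col => $val) {
        $cols[] = "`$col`";
        $vals[] = "'" . addslashes($val) . "'"; // naive escaping for the sketch
    }
    return "INSERT INTO `$table` (" . implode(', ', $cols) .
           ") VALUES (" . implode(', ', $vals) . ");";
}

// Inside the COM loop above you could then do something like:
// $sql = row_to_insert('TableName', array('f1' => $field1Value, 'f2' => $field2Value));
// and either run $sql against MySQL or append it to a .sql file.
```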

Join the Conversation


  1. Robert, I apologize for calling you Chris. The notes you wrote above were a great relief for me. Thanks a lot.

  2. Hi,

    I was looking for the best way to transfer data from a large MDB file (~2GB) to MySQL. The reason was the poor full-text search speed of MS Access, obviously. Your script helped me a lot; I probably would not have found a way to connect to an MDB using a COM object.

    A few notes (please correct me if I am wrong or inaccurate):
    1) ‘DefaultDir=C:\DataDir’ in the connection string is redundant,
    2) ‘$rs->Fields(0)’ can be written as ‘$rs->Fields(“FieldName”)’, which is a clearer way of specifying the source field,
    3) anyone processing such a file should cancel the script’s time limit, since it is likely to run for more than 30 sec.: ini_set(“max_execution_time”, 0); set_time_limit(0);
    4) when inserting data into MySQL, it should be escaped with mysql_real_escape_string,
    5) if records in the MDB are huge (>1MB) you will probably hit a problem when inserting them into MySQL. You can get around it with mysql_query(“SET GLOBAL max_allowed_packet=1073741824”); which sets the allowed packet size to the maximum.

    Again, many thanks for the script.

    1. Hello RP,
      I’m glad you found this useful. Your comments will also be helpful for others dealing with very large data files; thanks for sharing them. I’ve found that as the volume of data you work with increases, many of the easy ways of dealing with it (through phpMyAdmin, for example) have to be abandoned in favor of more robust methods. I wrote another article about importing and exporting large datasets which discusses this idea.
