Back to basics

A few days ago, someone on the chat commented that Pentaho Data Integration (Kettle) was a bit too hard to use. He had tried to load a text file into a database table and was having a hard time doing just that.

So let’s go back to basics in this blog post and load a delimited text file into a MySQL table.

If you want to see how it’s done, click the link below to watch a real-time (unedited) Flash movie. It’s an 11 MB download and about 2-3 minutes long.

Load customers flash demo
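The movie shows the steps in the Kettle GUI; as a rough textual equivalent, here is a minimal sketch of the same load in plain Python. This is not how Kettle does it, just an illustration of the idea — sqlite3 stands in for MySQL so the sketch is self-contained, and the table and file names are made up.

```python
import csv
import sqlite3

def load_delimited_file(db, table, path, delimiter=";"):
    """Read a delimited text file (first row = header) and load it
    into a database table, creating the table if it does not exist.
    All columns are loaded as TEXT for simplicity."""
    with open(path, newline="") as f:
        reader = csv.reader(f, delimiter=delimiter)
        header = next(reader)
        cols = ", ".join(f'"{c}" TEXT' for c in header)
        db.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        placeholders = ", ".join("?" for _ in header)
        db.executemany(
            f'INSERT INTO "{table}" VALUES ({placeholders})', reader
        )
    db.commit()
```

With a real MySQL target you would connect through a MySQL driver instead of sqlite3, but the shape of the job — read the header, create the table, stream the rows in — is the same as what the demo clicks together.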

Until next time!



  • Abhijit Mapgaonkar


I tried the same thing and I am happy to see that PDI (Kettle) performance is on par with commercial ETL tools like Informatica.

    Thanks for creating such a nice tool.

  • This transformation is typically I/O bound on the MySQL side. That is the big difference from previous versions.

    However, we’re working with the MySQL JDBC team to make this transformation run even faster. The new strategy combines the new “CSV Input” step with lazy conversion for faster results in those cases where no character set conversion needs to take place.

    Thanks for the kind remarks,


  • Hello, thanks for developing such an amazing tool.
    I’m stuck with a migration of 293 tables from one database to another (Oracle to PostgreSQL). I was wondering if there’s a way to tell Kettle to take their names from a catalog, and then do something like:

    select * from ?

    and then create the table structure on the fly (without clicking the SQL button) on the other database (because the tables don’t exist yet)? Kind of recreating table structures in batch mode. It would be awesome to achieve this in an automated fashion.
    Thanks a lot.
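The batch approach the commenter describes — take table names from a catalog, recreate each structure on the target, then copy the rows — can be sketched outside Kettle. The illustration below uses sqlite3 on both sides purely as a stand-in; with Oracle you would build the DDL from the data dictionary instead, and mapping Oracle types to PostgreSQL types is the hard part this sketch glosses over.

```python
import sqlite3

def copy_tables(src, dst, table_names):
    """For each table name taken from a catalog, recreate the table's
    structure on the target database and copy its rows - a batch
    version of 'select * from ?' plus on-the-fly table creation."""
    for name in table_names:
        # Read the CREATE TABLE statement from the source catalog.
        # sqlite stores it verbatim in sqlite_master; on Oracle you
        # would assemble it from the data dictionary views.
        ddl, = src.execute(
            "SELECT sql FROM sqlite_master WHERE type='table' AND name=?",
            (name,),
        ).fetchone()
        dst.execute(ddl)
        rows = src.execute(f'SELECT * FROM "{name}"').fetchall()
        if rows:
            placeholders = ", ".join("?" for _ in rows[0])
            dst.executemany(
                f'INSERT INTO "{name}" VALUES ({placeholders})', rows
            )
    dst.commit()
```

For 293 tables you would feed `table_names` from a query against the source catalog rather than a hand-written list; the loop itself does not change.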