Dear data integration fans,
I’m a big fan of “appropriate” data modeling prior to doing any data integration work. For a number of folks out there that means the creation of an Enterprise Data Warehouse model in classical Bill Inmon style. Others prefer to use modern modeling techniques like Data Vault, created by Dan Linstedt. However, the largest group of data warehouse architects uses a technique called dimensional modeling, championed by Ralph Kimball.
Using a modeling technique is very important since it brings structure to your data warehouse. These techniques, when applied correctly of course, go a long way toward helping you avoid all sorts of pitfalls in the design of a data warehouse.
From my own experience and from what I see in my own Kettle community, dimensional modeling is by far the most popular technique used to create data warehouses. For that reason (and the fact that I’m a huge fan of Kimball) I’ve always made sure to properly support the most complex part of the technique: the slowly changing dimension. For the most part, that has made Kettle an excellent choice when it comes to easily translating your dimensional model to ETL.
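For readers less familiar with slowly changing dimensions, here is a minimal sketch of the Type 2 variant (the one that keeps history by versioning rows). The function, column names, and in-memory list of dicts are all illustrative assumptions for this post, not Kettle’s actual implementation:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, natural_key, tracked_attrs, today):
    """Apply one incoming record to a Type 2 slowly changing dimension.

    dim_rows: list of dicts, each carrying 'valid_from', 'valid_to' and
    'version' columns (hypothetical names). The current version of a
    business entity is the row whose 'valid_to' is None.
    Returns the updated list of dimension rows.
    """
    current = [r for r in dim_rows
               if r[natural_key] == incoming[natural_key]
               and r["valid_to"] is None]
    if current:
        row = current[0]
        # No tracked attribute changed: the dimension stays untouched.
        if all(row[a] == incoming[a] for a in tracked_attrs):
            return dim_rows
        # An attribute changed: close the current version...
        row["valid_to"] = today
        new_version = row["version"] + 1
    else:
        new_version = 1
    # ...and append a new version that is valid from today onward.
    new_row = dict(incoming)
    new_row.update(valid_from=today, valid_to=None, version=new_version)
    dim_rows.append(new_row)
    return dim_rows
```

The complexity Kettle hides from you is exactly this close-old-row/insert-new-row dance, plus the surrogate key handling and lookups against the physical table.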
So a few weeks ago I was doing some basic modeling for a new Pentaho logging data mart for PDI 4.3 EE. This data mart will be responsible for the delivery of easy-to-digest reports, analysis views and dashboards on the subjects of monitoring and logging of Pentaho servers. Initially I started doing this in a nice Eclipse plugin called UMLet, which resulted in a data model like this:
While this result isn’t the worst diagram you can possibly imagine, there are a number of problems with the approach:
- The information about dimensions, attributes, relationships, … is not captured in a structured way.
- Exporting the metadata is not possible in any usable format except PDF and images.
- UMLet, like so many UML and modeling tools, is a generic tool that also supports many other features that I’m not interested in when I’m doing dimensional modeling. As a result, creating a model takes time and real effort.
- The work needs to be used in your favorite ETL tool, so it makes sense to have it handy there instead of having to use a third-party tool.
Wouldn’t it be great if I could create a new metadata domain to hold all the star models for a certain data mart?
Then wouldn’t it be great if you could edit your star models in there?
The graphics don’t have to be anything fancy, I thought. It just needs to automatically position the fact table in the middle and the dimensions around it…
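The automatic positioning mentioned above really can be that simple. As a sketch (the function name, canvas coordinates, and radius are my own assumptions, not the plugin’s actual layout code), placing the dimensions evenly on a circle around the fact table looks like this:

```python
import math

def star_layout(n_dims, center=(400, 300), radius=200):
    """Place n dimension tables evenly on a circle around the fact table.

    The fact table sits at 'center'; each dimension gets an (x, y)
    position on a circle of the given radius around it.
    """
    positions = []
    for i in range(n_dims):
        angle = 2 * math.pi * i / n_dims
        positions.append((center[0] + radius * math.cos(angle),
                          center[1] + radius * math.sin(angle)))
    return positions
```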
Obviously, I would like to be able to edit the name, description and type of the dimension …
and depending on the type of dimension I would like to insert a bunch of default attributes…
Using the standard Kettle data grid, I should be able to copy attributes and other metadata back and forth between dimension dialogs and a spreadsheet as well.
In the fact table definition it would be cool if we could not only specify the facts but also the relationships to the dimensions…
Because that way we wouldn’t have to worry about how to draw the star model, and we would already know everything we need to know.
If we had a tool like that, we would be able to generate the SQL to create the physical tables against a certain target database…
Because with all that knowledge about the dimensions captured in metadata, we could generate all the required data types, indexes and so on.
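To make the metadata-to-DDL idea concrete, here is a minimal sketch. The metadata shape (a list of name/logical-type pairs) and the type mapping are hypothetical stand-ins for what the plugin actually stores, and a real generator would pick the physical types per target database:

```python
def generate_ddl(table_name, columns):
    """Generate a CREATE TABLE statement from simple column metadata.

    columns: list of (name, logical_type) tuples. Logical types are
    mapped to physical types for an assumed generic SQL target.
    """
    type_map = {
        "key": "BIGINT NOT NULL",       # technical/surrogate keys
        "string": "VARCHAR(255)",       # descriptive attributes
        "date": "DATE",
        "number": "DOUBLE PRECISION",   # facts / measures
    }
    col_lines = ",\n".join("  %s %s" % (name, type_map[ltype])
                           for name, ltype in columns)
    return "CREATE TABLE %s (\n%s\n);" % (table_name, col_lines)
```

Once both the dimensions and the fact-to-dimension relationships live in metadata, the same structure can drive index and foreign-key generation in exactly the same way.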
And then it would be cool to also generate a template transformation to update the dimension and fact tables in the models…
Well, I thought it would be nice to have that sort of functionality.
Perhaps we could also create a physical Pentaho metadata domain (XMI) from the star domain, as well as Mondrian schemas and a PDF with documentation.
OK, so this is coming to a PDI release near you in the short term. I’ve only been working on it for a few weeks on and off, but you can try an early version here. Simply unzip it in the plugins folder of a PDI 4.3 build. The plugin needs 4.3 since that version already includes a lot of libraries, like Pentaho metadata and reporting, so I don’t need to package all those libraries with it. We can see later how we can deploy on 4.2 as well.
Please provide feedback here or in PDI-6890.
Until next time,