Data lasts longer than applications, but will your next system upgrade reuse existing databases? If not, do you have exit strategies?
For the last few years I have worked on rebuilding existing mainframe systems on the Java platform with an Oracle backend. The migrations typically involved developing a mainframe program to export all data to files, and Java programs to import the files into our data model in Oracle. Luckily we had a lot of mainframe competence, and I think the export programs usually took less time to build than our import programs. But what happens in 10-15 years, when new upgrades and migrations become necessary?
Relational databases are considered safe, and this is often used as an argument when these questions come up. Competence on relational databases and Oracle will still be around in 10-15 years, but is that enough? Your data model often needs to be understood on a higher level than the database: some data models are easy to understand from the db schema, some are not. Although it is possible to rely on this as a strategy, I would choose a better one:
Implement a data exit strategy from the start of the project. Take the time to build a dump feature that exports the data to file in a well-defined format, and an import feature that can load the data back in. Of course the developers of the successor system have to understand the format, and one can argue that this is no better than reading the database, but I think it's easier.
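As a minimal sketch of what "a well-defined format" could mean in practice: one record per line, fields in a fixed order, with a symmetric serialize/parse pair so the dump can be verified round-trip. The `Customer` record and the pipe-delimited layout are illustrative assumptions, not a prescribed format.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of a dump/load pair over a simple line-based format.
// Customer and the pipe-delimited layout are illustrative only.
public class DataDump {

    public record Customer(long id, String name, String email) {}

    // Serialize one record to a single line of the dump file.
    public static String toLine(Customer c) {
        return c.id() + "|" + c.name() + "|" + c.email();
    }

    // Parse one line of the dump file back into a record.
    public static Customer fromLine(String line) {
        String[] f = line.split("\\|", -1);
        return new Customer(Long.parseLong(f[0]), f[1], f[2]);
    }

    // Dump a whole table, one record per line.
    public static String dump(List<Customer> rows) {
        return rows.stream()
                   .map(DataDump::toLine)
                   .collect(Collectors.joining("\n"));
    }
}
```

The point is less the concrete format than the symmetry: because `fromLine(toLine(c))` gives back an equal record, both the exporting and the importing system can test the format independently.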
This will also give you a few other positive effects:
- It could be your backup/restore solution
- It could help you with testing, since it will be easy to set up a copy of your production system with data
- It could help you with partitioning for availability
- It could help with database lock-in
A popular pattern for developers today is to hide the database behind a repository abstraction. I think a good idea would be to give the repository the export and import features as well.
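A sketch of that idea: the repository interface carries `exportAll`/`importAll` alongside the usual operations, so the exit strategy lives next to the data access code. The names, the in-memory implementation, and the line format below are my assumptions for illustration; a real implementation would sit on Oracle.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.Writer;
import java.util.Map;
import java.util.Optional;
import java.util.TreeMap;

record Customer(long id, String name) {}

// Repository abstraction that also owns the data exit strategy.
interface CustomerRepository {
    void save(Customer c);
    Optional<Customer> findById(long id);
    void exportAll(Writer out) throws IOException; // the dump feature
    void importAll(Reader in) throws IOException;  // the load feature
}

// Illustrative in-memory implementation; a production one would be
// backed by the database, but the dump format stays the same.
class InMemoryCustomerRepository implements CustomerRepository {
    private final Map<Long, Customer> rows = new TreeMap<>();

    public void save(Customer c) { rows.put(c.id(), c); }

    public Optional<Customer> findById(long id) {
        return Optional.ofNullable(rows.get(id));
    }

    public void exportAll(Writer out) throws IOException {
        for (Customer c : rows.values()) {
            out.write(c.id() + "|" + c.name() + "\n");
        }
    }

    public void importAll(Reader in) throws IOException {
        BufferedReader r = new BufferedReader(in);
        for (String line; (line = r.readLine()) != null; ) {
            String[] f = line.split("\\|", -1);
            save(new Customer(Long.parseLong(f[0]), f[1]));
        }
    }
}
```

Because every implementation of the interface reads and writes the same format, migrating to a new backend is just `exportAll` on the old repository and `importAll` on the new one, which is also exactly what the backup/restore and test-data use cases above need.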