One of the hats that I wear at QUADROtech is testing one of our top products, Archive Shuttle. To test it, and indeed to test many aspects of Enterprise Vault, I often need repeatable testing.
Repeatable in terms of:
– Being able to export the same archived data to PST from a mailbox archive
– Being able to ingest the same data over and over to multiple archives in multiple vault stores, across multiple test systems
I’ve found so far that the easiest way to achieve this is to build up a mailbox how I want it to be in terms of:
– The folder structure (number of folders, and depth)
– The names of folders (simple folder names, long folder names, foreign character folder names)
– The items within each folder (lots of small items, a few very large items)
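The folder structure described above can be sketched as a small script. This is purely illustrative: it lays the test "mailbox" out as folders of `.eml` files on disk (folder names, depths, and item sizes are all assumptions, not a QUADROtech tool), and a real test would still need to import the result into an Exchange mailbox before archiving.

```python
# Sketch: generate a repeatable test "mailbox" on disk as nested folders of
# .eml files. Folder names cover the simple / long / foreign-character cases;
# each folder gets several small items and one large one. All values here are
# illustrative assumptions.
import os
from email.message import EmailMessage

def build_test_mailbox(root, depth=3, folders_per_level=2):
    """Create a nested folder tree populated with messages of varying sizes."""
    names = ["Inbox", "A" * 60, "Ordner-Prüfung"]  # simple, long, non-ASCII

    def populate(path, level):
        os.makedirs(path, exist_ok=True)
        sizes = [200] * 5 + [200_000]  # lots of small items, one large item
        for i, size in enumerate(sizes):
            msg = EmailMessage()
            msg["Subject"] = f"Test item {i} at level {level}"
            msg["From"] = "tester@example.com"
            msg["To"] = "archive@example.com"
            msg.set_content("x" * size)
            with open(os.path.join(path, f"item{i:03d}.eml"), "wb") as fh:
                fh.write(bytes(msg))
        if level < depth:
            for name in names[:folders_per_level]:
                populate(os.path.join(path, name), level + 1)

    populate(root, 1)

build_test_mailbox("test_mailbox")
```

With the defaults above (depth 3, two subfolders per level) this produces seven folders and six items per folder; tweaking `depth`, `folders_per_level`, and `sizes` gives the different shapes of archive you might want to test against.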
Once you’ve got the mailbox set up like that you can then archive it. In fact, when I’m setting up a particular mailbox archive and ingesting data from multiple sources, eg mail generation tools, I often archive the mailbox every few hours just to keep the archive up to date.
Once the archiving is completed you then have your ‘single’ nice archive.
How do you get that to work on another system, or another archive, or another vault store?
Well… the easiest way that I have found so far is to export that archive to PST. With the changes in the Enterprise Vault 9 world you can now get sizeable PST files (eg more than 2 GB). Once that PST has been created it can be copied to other environments as needed, and used as the source for ingesting into other archives. Of course, if you ingest it into the same vault store you’re likely to get Single Instance Storage coming into play, which means the ingest doesn’t massively increase the footprint of partition-stored data. (This depends on your sharing settings.)
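When copying a multi-gigabyte PST between test environments, it’s worth confirming the copy is byte-identical before ingesting from it, so that every test run really does start from the same data. This isn’t part of the Enterprise Vault workflow itself, just a generic check; the file paths shown are hypothetical examples.

```python
# Sketch: verify a PST copied to another environment matches the original by
# comparing SHA-256 checksums. Hashing in chunks keeps memory use flat even
# for multi-GB files. Paths below are hypothetical.
import hashlib

def file_sha256(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Example (hypothetical paths): compare source and copied PST.
# assert file_sha256(r"\\lab1\pst\archive.pst") == file_sha256(r"\\lab2\pst\archive.pst")
```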
In the end it’s quite a neat, simple way of making sure that the work you do, if it needs to be, can be easily repeated.