Recently I had an interesting situation where I needed to load an account hierarchy daily from a single file. The tricky bit about this was that I couldn’t make any changes to the input source file as it came from a system that could not be changed in the timescales of my project.
My usual solution to this sort of data load would be a custom process built in Talend. However, this time I could not move the load mechanism away from the Salesforce Data Loader.
The approach I took was to implement an extract, load, transform (ELT) process. First I pointed the data load at a staging object that replicated the schema of the load file. Then I wrote a schedulable, batchable Apex class to transform the staging records into account records.
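The skeleton of such a class might look something like the sketch below. The class and field names (Account_Staging__c, Processed__c and so on) are my assumptions for illustration, not the actual schema:

```apex
// Hypothetical sketch: a batchable, schedulable class that reads the staging
// object and transforms its rows into Accounts. All custom object and field
// names here are assumed, not taken from the real implementation.
global class AccountStagingTransform implements Database.Batchable<SObject>, Schedulable {

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Only pick up staging rows that have not been processed yet
        return Database.getQueryLocator(
            'SELECT Id, Grandparent_Key__c, Parent_Key__c, Account_Key__c, Account_Name__c ' +
            'FROM Account_Staging__c WHERE Processed__c = false');
    }

    global void execute(Database.BatchableContext bc, List<Account_Staging__c> scope) {
        // Transform staging rows into Account records, level by level:
        // grandparents first, then parents, then the base level.
    }

    global void finish(Database.BatchableContext bc) {}

    // Schedulable entry point so the transform can run daily after the file load
    global void execute(SchedulableContext sc) {
        Database.executeBatch(new AccountStagingTransform());
    }
}
```

Implementing both Database.Batchable and Schedulable in one class keeps the daily schedule and the chunked processing together.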
The transformation had to work at three levels of grain, one for each level in the account hierarchy. I used aggregate queries over the top two layers to find the grandparent and parent accounts that needed to be inserted or updated. The base level, corresponding to each row in the staging table, could then be inserted or updated and linked to the correct parent at the same time.
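For the grandparent level, the pattern could be sketched as below: aggregate the staging rows on the top-level key, then upsert one account per distinct grandparent against an external ID field so inserts and updates share a single call. The field names here (External_Key__c and the staging fields) are assumptions:

```apex
// Hypothetical sketch of the top (grandparent) level. One Account per distinct
// grandparent key; the external ID field External_Key__c (assumed) lets a
// single upsert handle both inserts and updates.
List<Account> grandparents = new List<Account>();
for (AggregateResult ar : [
        SELECT Grandparent_Key__c gpKey, MAX(Grandparent_Name__c) gpName
        FROM Account_Staging__c
        WHERE Processed__c = false
        GROUP BY Grandparent_Key__c]) {
    grandparents.add(new Account(
        External_Key__c = (String) ar.get('gpKey'),
        Name = (String) ar.get('gpName')));
}
upsert grandparents External_Key__c;
```

The parent level follows the same shape grouped on the second key, and at the base level each account's parent lookup can be set by external ID (for example `acc.Parent = new Account(External_Key__c = parentKey)`) so the link is made in the same DML call.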
I added some metadata to the staging records so an administrator could review any errors and fix them before using a custom button to reprocess them.
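One way to capture that metadata is to run the DML in partial-success mode and write each failure message back onto the staging row, roughly as sketched below. The field names (Processed__c, Error_Message__c) are again assumptions:

```apex
// Hypothetical sketch: upsert with allOrNone = false, then record success or
// the first error message on each staging row so an admin can review and
// reprocess. Assumes accounts[i] was built from stagingRows[i].
Database.UpsertResult[] results =
    Database.upsert(accounts, Account.External_Key__c, false);
for (Integer i = 0; i < results.size(); i++) {
    if (results[i].isSuccess()) {
        stagingRows[i].Processed__c = true;
        stagingRows[i].Error_Message__c = null;
    } else {
        stagingRows[i].Error_Message__c = results[i].getErrors()[0].getMessage();
    }
}
update stagingRows;
```

The custom button can then simply clear Processed__c on the corrected rows and kick off the batch again.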
Using Apex instead of the visual programming of Talend was a bit of a challenge: I had to translate each operation I wanted into code.
A future improvement to my solution would be to use a custom setting to hold the field mappings and key definitions. To meet my time constraints I had to hard-code these in the Apex class, but I would have preferred to make the field mappings easier to tweak and the code easier to reuse.
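That improvement might be sketched as follows, using a list custom setting with one row per mapped field and applying the mappings dynamically via the generic SObject get/put methods. The setting name (Field_Mapping__c) and its fields are hypothetical:

```apex
// Hypothetical sketch: a list custom setting Field_Mapping__c with assumed
// fields Staging_Field__c and Target_Field__c, one row per mapped field, so
// an admin can adjust mappings without a code deployment.
Map<String, String> mappings = new Map<String, String>();
for (Field_Mapping__c m : Field_Mapping__c.getAll().values()) {
    mappings.put(m.Staging_Field__c, m.Target_Field__c);
}

// Apply the mappings dynamically to one staging row (stagingRow assumed in scope)
Account acc = new Account();
for (String stagingField : mappings.keySet()) {
    acc.put(mappings.get(stagingField), stagingRow.get(stagingField));
}
```

This keeps the transform generic, so the same class could be reused for a different staging object by changing configuration rather than code.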