T-SQL Tuesday #98
This month's T-SQL Tuesday is hosted by Arun Sirpal (blog) with the theme "Your Technical Challenges Conquered." I love how Arun wants to jump in at the start of the new year and talk about something we have conquered... it really sets the tone for the year!
While this isn't the most challenging thing I have ever done, it was (and still is) something I am proud of. For my post, I am writing about a scenario that stretched my boundaries on planning things out well in order to come up with a robust solution.
My company provides visits to patients in their homes, in hospice care, and in long-term care in acute care settings. We have multiple facilities throughout the USA. The vendor of one of our EMR systems creates a separate database for each facility, so the vendor currently hosts 40+ databases for our company. For reporting and analysis needs, we needed to bring a copy of that data into our data center, and of course when the business asks for something, they want it right away. The easiest and quickest way was to set up log shipping. Discovery led us to realize that the end result would be used by the ETL team for importing into the data warehouse, the reporting team for ad-hoc reporting, and the application development team for various custom-built in-house applications. Therefore, on top of the log shipping, I also needed to make it easy for them to pull the data in a consistent way, without having to keep up with new agencies, change connection strings, etc.
When thinking about the project as a whole, I knew I had the following requirements:
- agencies would be added as we transition them from other software to this EMR
- agencies would be added as we acquire new companies
- the vendor's folder structure holding the backup files was a root folder with a separate sub-folder for each database
- issues that arise for one or more databases must be handled without interfering with the other databases' restore processes
- the "up-to-date" date for each database must be reportable when users ask (there will always be someone who doubts the data is up to date because they don't see something they expected)
- all data must be combined/aggregated into a single database so it can be queried in a consistent manner
- an encryption process needs to be included so that the files are encrypted in transit (even though we use SFTP) and at rest
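As a side note on the "up-to-date" requirement: SQL Server already records every restore in msdb, so the last-restore date per database can be reported without any custom tracking. A minimal sketch (the aliases are mine; `msdb.dbo.restorehistory` is a standard system table):

```sql
-- Latest restore time per database, straight from SQL Server's built-in history.
SELECT
    rh.destination_database_name AS DatabaseName,
    MAX(rh.restore_date)         AS LastRestoreDate
FROM msdb.dbo.restorehistory AS rh
GROUP BY rh.destination_database_name
ORDER BY DatabaseName;
```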
After planning, then polishing, then throwing that away, then planning some more, then polishing some more, etc., my end result is something that flows fairly well. I have a table which lists all of the databases, the folder where the log backups land, a flag to indicate "PerformLogBackups", and a flag to indicate "AggregateData". I have a decent-sized script which queries the list of databases that need to be updated, copies the encrypted files off the SFTP server onto a file share (just in case...), and then moves them to the SQL Server's local drive. It then decrypts all of the files and loops through each database, restoring each file in order. Once all of the restores are complete, another job runs which aggregates the data into a single database for reporting needs. Lastly, it cleans up the files and sends the appropriate notifications.
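To give a feel for the restore loop, here is a heavily simplified sketch. The table and column names (`dbo.AgencyDatabase`, `PerformLogBackups`) and the standby file path are stand-ins I'm using for illustration, not the actual schema from my environment, and the file-enumeration logic is elided:

```sql
DECLARE @db   sysname,
        @file nvarchar(260),
        @sql  nvarchar(max);

-- Only databases whose flag is on participate in this run.
DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT DatabaseName
    FROM dbo.AgencyDatabase
    WHERE PerformLogBackups = 1;

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @db;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- (enumerate this database's decrypted log backups in order here,
    --  setting @file for each one before restoring it)
    SET @sql = N'RESTORE LOG ' + QUOTENAME(@db)
             + N' FROM DISK = @file'
             + N' WITH STANDBY = N''D:\Standby\' + @db + N'.tuf'';';  -- assumed path; STANDBY keeps the database readable between restores

    EXEC sys.sp_executesql @sql, N'@file nvarchar(260)', @file = @file;

    FETCH NEXT FROM db_cursor INTO @db;
END

CLOSE db_cursor;
DEALLOCATE db_cursor;
```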
The table which holds the list of databases solves the requirement of easily adding new agencies. It also handles the scenario where issues arise with some, but not all, databases: if a certain database is having an issue, toggling its flag will stop it from being restored while the others continue as normal. Easy peezy... lemon squeezy...
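For illustration, the control table could look something like the following; every name here is a made-up stand-in, not the actual table from my environment:

```sql
CREATE TABLE dbo.AgencyDatabase
(
    DatabaseName      sysname       NOT NULL PRIMARY KEY,
    LogBackupFolder   nvarchar(260) NOT NULL,  -- vendor sub-folder for this database's backups
    PerformLogBackups bit           NOT NULL DEFAULT (1),
    AggregateData     bit           NOT NULL DEFAULT (1)
);

-- Pause a problem database without touching the rest:
UPDATE dbo.AgencyDatabase
SET    PerformLogBackups = 0
WHERE  DatabaseName = N'Agency042';  -- hypothetical name
```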
I have checkpoints in the process, which will be covered as a topic in an upcoming post and presentation. I still need to do more to make it even more well-rounded and robust, but that is another reason I am writing about it today - it is still ongoing! I love the idea of having something that works, but that I can tweak and polish to make better and better. After all... is anything ever really done? :)