2020/07/09

Light Weight Reporting for B&R

Currently I have two customers where existing solutions like Veeam Service Provider Console (ex-VAC) or Veeam Enterprise Manager do not really work. One of the customers has a highly secure environment and allows only two protocols in and out of each environment: FTP and email (everything entering or leaving those servers is monitored). The second one has extremely low bandwidth (satellite links) where every byte counts. They resorted to parsing the daily report emails, but those contain a lot of styling.

So unless you have a very specific use case like this, you probably don't want to use this project. VSPC makes more sense if you have a distributed multi-tenant environment, while Enterprise Manager makes more sense if you are a single-team shop.

So what is LWR? In reality it is just two PowerShell scripts you can find on GitHub: one that generates a (compressed) JSON report and one that reads that report (potentially for multiple servers). The most important point is that LWR doesn't define how you transfer these files; they just need to get from one server to another via some mechanism and be dumped in a folder. For this blog post I used FTP (sample scripts included), but those are just samples, not part of the actual definition.
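
To make that concrete, a minimal FTP push could look like the sketch below. This is not the actual ftpsync-sitelocal sample from GitHub, just a bare-bones stand-in; the host, credentials, and paths are placeholders:

# Minimal FTP push sketch (placeholder host/credentials); the real
# ftpsync-sitelocal sample in the repository is the reference implementation
$file = Get-ChildItem "C:\veeamlwr\99570f44-c050-11ea-b3de-0242ac13000x" -Filter *.lwr |
        Sort-Object Name | Select-Object -Last 1

$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("lwruser", "lwrpassword")
try {
    # Upload under the same name so the central side can match on uniqueid + timestamp
    $client.UploadFile("ftp://ftp.example.com/lwr/$($file.Name)", $file.FullName)
} finally {
    $client.Dispose()
}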

So how does it work? Run the site-local script on each remote branch on a regular basis. Edit the script to change the uniqueid per site, or use the param statement to modify it on the fly. Every run produces a file in a directory like:
"C:\veeamlwr\99570f44-c050-11ea-b3de-0242ac13000x\99570f44-c050-11ea-b3de-0242ac13000x_1594283630.lwr"

The first directory is the repository, the second directory is the unique id, and the file itself is the <uniqueid>_<timestamp>.lwr data file. This is a gzipped JSON file. You can use -zip $false to disable compression, but then you need to be consistent across all the sites and the main site. The uncompressed mode is mainly there for troubleshooting or if you want to easily parse the result yourself. Every run of the script creates a new version, which you must transfer to the main site. For this blog article, I just made a Windows Task Scheduler task that runs the sitelocal script and then ftpsync-sitelocal (the FTP sample) to transfer the data to the central site.
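
If you want to peek inside a .lwr file without rerunning the site-local script with -zip $false, a few lines of PowerShell can gunzip and parse it. This is just an inspection sketch; the properties you get back are whatever the sitelocal script put in the JSON:

# Decompress a .lwr (gzipped JSON) file and parse it; the path is the example from above
$path = "C:\veeamlwr\99570f44-c050-11ea-b3de-0242ac13000x\99570f44-c050-11ea-b3de-0242ac13000x_1594283630.lwr"

$stream = [System.IO.File]::OpenRead($path)
$gzip   = New-Object System.IO.Compression.GZipStream($stream, [System.IO.Compression.CompressionMode]::Decompress)
$reader = New-Object System.IO.StreamReader($gzip)
try {
    $report = $reader.ReadToEnd() | ConvertFrom-Json
} finally {
    # Disposing the reader also disposes the gzip stream and the file handle
    $reader.Dispose()
}

# Dump the top-level structure of the report
$report | Format-List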


On the central site, you might want to run the sync script for the central site manually just to get an update, and then run the central script to get the output.
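
That boils down to two commands on the central server. The names below are placeholders (ftpsync-central.ps1 as a central-side twin of the ftpsync-sitelocal sample, central.ps1 for the central script); check the GitHub repository for the real file names:

# Pull the newest .lwr files from the FTP server into the central repository
# (hypothetical name, mirroring the ftpsync-sitelocal sample)
.\ftpsync-central.ps1

# Parse the repository and print the consolidated report
.\central.ps1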


So here you see the sync plus the first run. The script downloaded multiple files, but central will only use the latest version per site (this is why the uniqueid and the timestamp in the filename are important). Cleanup at the central site is something you should do yourself, as I can imagine use cases where you want to keep those files for a longer history (for example, to see how license usage expanded over time).
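
If you do decide to prune, something like the sketch below keeps only the newest few files per site; the repository path and the retention count are assumptions you should adapt:

# Keep only the 5 newest .lwr files per uniqueid directory; delete the rest
$repository = "C:\veeamlwr"
$keep = 5

Get-ChildItem $repository -Directory | ForEach-Object {
    Get-ChildItem $_.FullName -Filter *.lwr |
        Sort-Object Name -Descending |    # the timestamp suffix sorts newest first
        Select-Object -Skip $keep |
        Remove-Item -WhatIf               # drop -WhatIf once you trust the output
}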

Let's fail a job, run a new upload from the site-local script, and resync the latest .lwr to central.



Now you can see there are two failed jobs, reflecting the latest status.

If you want to verify your license usage, you can also check the licenses with the -mode parameter.
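
For example (the -mode parameter comes from the script itself, but the value below is a guess; check the script's param block for the accepted values):

# Hypothetical invocation: -mode is real, the value "license" is an assumption
.\central.ps1 -mode license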


That's it, there is nothing more to show or to do. Again, you probably want to use alternatives like VSPC or Enterprise Manager, but in case neither is feasible due to network or security restrictions, feel free to use and extend this to your liking!
