Log details

Oct 2, 2012 at 1:48 PM

In your professional use of Wolfpack, how do you deal with ScalarCheck results? I understand that limiting the check to a Count(*) is important for performance, but the result is essentially binary: there is an error or there isn't.

Do you manually follow up whenever the result is an error, or do you keep the check criteria so specific that you understand the impact of any failed check without further details? Have you ever explored returning record details?

For example: I'd like to check a dozen IIS sites on a cluster of servers for any 500 errors. I could do that with a single LogParser SELECT, but then I wouldn't know which site on which server the errors are on. I could create 50+ conditions to cover every possible combination of sources, but that seems inefficient.
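
To make that concrete, here's roughly the kind of query I mean, run from PowerShell with LogParser on the path (the log path is a placeholder, and it assumes W3C logging with the s-sitename and s-computername fields enabled):

    # Scalar form a ScalarCheck could consume - a single count, no detail:
    logparser.exe "SELECT COUNT(*) FROM C:\inetpub\logs\LogFiles\*\*.log WHERE sc-status >= 500" -i:IISW3C

    # Per-server/per-site breakdown that the scalar result throws away:
    logparser.exe "SELECT s-computername, s-sitename, COUNT(*) AS Errors FROM C:\inetpub\logs\LogFiles\*\*.log WHERE sc-status >= 500 GROUP BY s-computername, s-sitename" -i:IISW3C -o:CSV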

Thanks.

Coordinator
Oct 2, 2012 at 2:49 PM

There was a similar question posted earlier in the year here: http://wolfpack.codeplex.com/discussions/389588 ...have a read...

The problem is that the LogParser query can generate a lot of rows of data and there is no practical way of transporting that around in memory. I quite like the idea of dumping the actual report data that tripped the alert to disk and then having the email publisher automagically attach it to the email containing the alert - the other publishers could take the appropriate action too, e.g.:

Growl publisher: include a fileshare link to the report file.

SQL publisher: hoover the report into a TEXT blob, etc.

The alternative suggested to the other user was to use the beta Powershell check, which gives you ultimate flexibility in what you do and how you package the results, but I think auto-attachment of the results is a great solution too and I'd be keen to implement it - would that meet your requirements?
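
To give a feel for the Powershell route, it would be something along these lines - purely a rough sketch of the idea rather than the actual check plumbing, and the paths, addresses and SMTP server are all placeholders:

    # Run the detailed query and park the results on disk...
    $report = "C:\Wolfpack\Reports\iis-500s-$(Get-Date -Format yyyyMMdd-HHmm).csv"
    logparser.exe "SELECT s-computername, s-sitename, COUNT(*) AS Errors INTO $report FROM C:\inetpub\logs\LogFiles\*\*.log WHERE sc-status >= 500 GROUP BY s-computername, s-sitename" -i:IISW3C -o:CSV

    # ...then only raise the alert (with the detail attached) when something actually tripped.
    $failures = if (Test-Path $report) { @(Import-Csv $report).Count } else { 0 }
    if ($failures -gt 0) {
        Send-MailMessage -To "ops@example.com" -From "wolfpack@example.com" `
            -Subject "IIS 500 errors on $failures site/server combinations" `
            -Body "Details attached." -Attachments $report -SmtpServer "smtp.example.com"
    }

The auto-attachment idea is really just baking that last step into the email publisher, so the check itself can stay a simple one-liner.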

Cheers,

James

Oct 2, 2012 at 4:25 PM

I'm actually looking at Wolfpack to replace several custom Powershell scripts I have running now, so I might be able to rewrite them to work with the framework. I've already configured my Powershell instance to target .NET 4, which is a requirement for some of my scripts.

But I also think the auto-attachment option would be a very useful feature that would probably reduce the amount of custom code anyone has to write.

What about making the text blob available via the WCF ServiceHost instead of creating a dependency on a fileshare?

Coordinator
Oct 2, 2012 at 9:01 PM

Ok, so saving the report data looks like the best way to do this, and exposing it via WCF is a very good idea too. I'm on the verge of writing a new web API to replace the WCF one, as the current one has robustness and security issues...the new version will include a method to return/stream the report data associated with an alert.

Thanks for your input in shaping this!

J