
Statistics Publishing and Reporting Part Three: Using Micrometer

Part One: Domino and Statistics
Part Two: Prometheus
Part Three: Micrometer and Prometheus
Part Four: Micrometer and Composite Registries

The Easy Way

In the previous two parts I gave some background on metrics and then described my first pass at coding metrics, quite frankly the hard way. The elephant in the room should be apparent: the metrics output was manually coded for a specific reporting tool.
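To make that elephant concrete, the hard way can be sketched as follows (hypothetical names, not the actual code from Part Two): every line below is tied to Prometheus's text exposition format, so supporting a different reporting tool would mean writing a parallel formatter from scratch.

```java
// Hypothetical sketch of hand-formatting one Domino statistic into
// Prometheus exposition format. The HELP/TYPE comment lines and the
// "name value" sample line are all Prometheus-specific conventions.
public class ManualPrometheusOutput {
    static String format(String name, String help, double value) {
        StringBuilder sb = new StringBuilder();
        sb.append("# HELP ").append(name).append(' ').append(help).append('\n');
        sb.append("# TYPE ").append(name).append(" gauge\n");
        sb.append(name).append(' ').append(value).append('\n');
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.print(format("domino_server_users", "Current sessions", 12));
    }
}
```

Micrometer's value proposition is exactly that this formatting is handled by a pluggable registry, so the instrumentation code stays backend-neutral.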

Statistics Publishing and Reporting Part Two: Statistics for Prometheus

Part One: Domino and Statistics
Part Two: Prometheus
Part Three: Micrometer and Prometheus
Part Four: Micrometer and Composite Registries

In the last blog post I covered the differences between various statistical reporting tools. I can't speak for Panopta, which was covered in the recent HCL webinar. I suspect it uses "pull" (an agent is installed on the Domino server) and I suspect it uses one of the traditional reporting databases: its monitoring covers time periods, so there must be a database behind it, and its offering is built around pre-built dashboards for various products, so there is no need for its USP to include building a bespoke reporting repository.

Statistics Publishing and Reporting Part One

Part One: Domino and Statistics
Part Two: Prometheus
Part Three: Micrometer and Prometheus
Part Four: Micrometer and Composite Registries

Domino V10 and Statistics Publishing

One of the big additions in Domino V10 was statistics publishing, initially focused on New Relic. But as Daniel Nashed showed, this can easily be re-routed to other locations, for example a Domino database. When I worked at Intec I tried the New Relic reporting on Domino early on and was very impressed with what was provided. My response wasn't about which statistics were delivered: what is output is not the important factor, because it can easily be changed. My opinion came from how easy it was to set up. New Relic itself is straightforward, but what needed to be done on the Domino side was even easier: a few Notes.ini settings, a restart, and the statistics flowed. Since the days of Embedded Experiences I have been convinced that ease of implementation is critical for adoption, and adoption is key to value for effort.
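For reference, the settings involved look roughly like the following. This is a hedged sketch from memory rather than authoritative documentation, so check the HCL documentation for the full `STATPUB_*` list; `<account id>` and `<API key>` are placeholders you must supply from your own New Relic account.

```ini
STATPUB_ENABLE=1
STATPUB_URI=https://insights-collector.newrelic.com/v1/accounts/<account id>/events
STATPUB_SECURITY_HEADER=X-Insert-Key: <API key>
```

Further `STATPUB_*` settings control how each metric is formatted in the request body, which is what makes re-routing to other targets possible.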

Lessons Learned from JUnit and Refactoring

JUnit testing just makes sense. But writing tests is certainly a skill and your tests can have a big impact on how you structure your code. Sometimes a sensible bit of refactoring can have a large impact, particularly if the code or unit tests were not written in the best way.

It is inevitable that some code will need to interact with a database, and that database will not be available when the unit tests run. There are multiple approaches for handling code that cannot run in a test.
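One common approach can be sketched with a hand-rolled test double (all names below are hypothetical; in practice the test class would be a JUnit test method, but a plain `main` keeps the sketch self-contained). The database access goes behind an interface, and the unit test supplies a fake implementation, so the code under test never touches a real database.

```java
// Production code depends on this abstraction, not on a database API.
interface DocumentStore {
    String getFieldValue(String docId, String field);
}

// The class under test works with whatever DocumentStore it is given.
class GreetingService {
    private final DocumentStore store;

    GreetingService(DocumentStore store) {
        this.store = store;
    }

    String greet(String docId) {
        return "Hello, " + store.getFieldValue(docId, "FirstName");
    }
}

// In a unit test, a fake store stands in for the real database.
class FakeStore implements DocumentStore {
    public String getFieldValue(String docId, String field) {
        return "Paul"; // canned value; no database connection required
    }
}

public class GreetingServiceTest {
    public static void main(String[] args) {
        GreetingService service = new GreetingService(new FakeStore());
        if (!service.greet("doc-1").equals("Hello, Paul")) {
            throw new AssertionError("unexpected greeting");
        }
        System.out.println("test passed");
    }
}
```

The refactoring cost is constructor injection of the interface, which is also what makes mocking frameworks usable later if the fakes grow unwieldy.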

Developing RunJava Addins for Domino

Most Domino developers use Windows because that's the only platform Domino Designer runs on. For most of my application development life, my main device has been a Dell laptop of some variety for this reason. For almost a decade now I've also been running a Windows Domino server, because Domino Designer's local preview is not an effective test environment for a Domino web application. If you're using source control you are also usually testing locally, unless you're developing cloud functions. So for development you typically want a Domino server, and if you're using Domino Designer, the easiest server install to develop against is a Windows Domino server. If you want Linux on Windows and you're using Windows Professional, Docker is a sensible approach, provided you take some time to understand how ports are exposed from Docker.
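On the port question, the key is publishing the ports Domino listens on to the Windows host. The following is a hedged sketch only: the image tag `domino:latest` and the data volume are hypothetical stand-ins for whatever your HCL container build actually provides, and the exact ports depend on which tasks your server runs.

```
# -p maps host:container; 80 is HTTP, 443 HTTPS, 1352 NRPC for Notes clients
docker run -d --name domino \
  -p 80:80 -p 443:443 -p 1352:1352 \
  -v domino_data:/local/notesdata \
  domino:latest
```

Without the `-p` mappings the server starts happily inside the container but is unreachable from Designer or a browser on the host, which is the usual stumbling block.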