Over the weekend we lost another long-standing member of the Domino community, Nathan T Freeman. Nathan was outgoing, often controversial, but passionate about open source and helping others. Everyone who met him will have stories about him. But I know he is one of the individuals I have to thank for being where I am today.
In my last blog post I talked about challenges we had to overcome as a team with regard to caching of constants. But a bigger challenge we hit was caching of design elements.
Part of the solution we built required copying design elements from one database to another. Part of the beauty of Domino is that everything is a Note, including design elements: design elements are just Notes with a special flag. So just as you can copy a document from one database to another by getting a handle on the note, you can copy a design element by getting a handle on the design note. The API is exactly the same: call NotesDocument.CopyToDatabase(targetDb).
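As a minimal sketch of what that looks like in LotusScript (not the project's actual code; sourceDb and targetDb are assumed to be already-open NotesDatabase objects, and agents are collected purely as an example):

```lotusscript
' Sketch: copy agent design notes from sourceDb to targetDb using the
' ordinary document API. sourceDb and targetDb are assumed NotesDatabase objects.
Dim nc As NotesNoteCollection
Dim designDoc As NotesDocument
Dim noteId As String
Dim i As Long

' Collect just the agent design notes from the source database
Set nc = sourceDb.CreateNoteCollection(False)
nc.SelectAgents = True
Call nc.BuildCollection()

' Each design note can be handled as a NotesDocument and copied like any other note
noteId = nc.GetFirstNoteID()
For i = 1 To nc.Count
    Set designDoc = sourceDb.GetDocumentByID(noteId)
    Call designDoc.CopyToDatabase(targetDb)
    noteId = nc.GetNextNoteID(noteId)
Next
```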
Recently I've been involved in a project with a lot of LotusScript. As a team our approach has been to structure the code according to best practices, leveraging what we've learned from other languages. Standards are always good, but there are always peculiarities that affect what you do. The crucial skill is being able to work out what is happening when the standard approaches don't produce the expected results and, most importantly, how to work around them.
There are a number of challenges when it comes to two-way REST and Domino. But one of the biggest challenges when converting between NotesDateTime objects and JSON is timezone handling. There is a Product Ideas request to provide serialization / deserialization between Domino objects and JSON strings, which surprisingly has only 31 votes, but it's not there yet. So for the Volt MX LotusScript Toolkit, this needs to be handled within the toolkit itself.
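To illustrate one common hedge (a sketch only, not the toolkit's actual implementation): serialise every NotesDateTime as UTC, so the JSON carries an unambiguous ISO 8601 timestamp regardless of the server's timezone.

```lotusscript
' Sketch only, not the toolkit's implementation: serialise a NotesDateTime
' as an ISO 8601 string in UTC, so the JSON value is unambiguous about timezone.
Function ToIsoUtc(dt As NotesDateTime) As String
    Dim gmt As Variant
    gmt = dt.LSGMTTime    ' the date/time value converted to Greenwich Mean Time
    ToIsoUtc = Format$(gmt, "yyyy-mm-dd") & "T" & Format$(gmt, "hh:nn:ss") & "Z"
End Function
```

Deserialisation is the harder half: an incoming JSON string can carry any offset, or none at all, and that ambiguity is exactly the kind of thing the toolkit needs to handle.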
Earlier this week Jason Roy Gary announced the Volt MX LotusScript Toolkit. It's important to provide some background to manage expectations. There will be an OpenNTF webinar on December 17th where we will explain more about our aims for the project and issue a call-to-arms to the community to join us in driving this forward. I encourage everyone to attend if you're interested in using Agents outside the Notes Client or a Form's WebQueryOpen and WebQuerySave events. But in advance, let's cover some questions I expect people to have.
If you're developing an API, the best tool to test with is Postman. When I initially used Postman, I only used it for basic individual tests. If you followed my blog at Intec, you'll know that also led me to recommend and use Node-RED for creating a test flow. However, over the last few months I've learned significantly more about Postman's functionality, which becomes extremely relevant when using Postman collections for other purposes.
Since joining HCL Labs my focus for Domino development has been Domino APIs, with Project Keep. Obviously there is little point using a REST API against a Domino database in an XPages application or LotusScript agent. Consequently, application development has been almost exclusively outside of XPages. This has reinforced key differences between Domino development and other application development frameworks. Now seems a good time to cover this, particularly following Chris Toohey's excellent blog post "Was XPages a waste of time?". I've always been vocal that my experience of XPages was not a waste of time. But I'm very much one of those who went a step further, taking a lot of time to understand very deeply how XPages worked and how other frameworks were similar or different.
In the last part I covered outputting metrics. If you have an existing HTTP server or wish to create a standalone server, it's easy to output the metrics, as shown in the Vert.x samples.
In the previous two parts I gave some background on metrics and then described my first pass at coding metrics, quite frankly the hard way. The elephant in the room should be apparent: the metrics output was manually coded for a specific reporting tool.
In the last blog post I covered the differences between various statistical reporting tools. I can't speak for Panopta, which was covered in the recent HCL webinar. I suspect they're using "pull" (they install an agent on the Domino server), and I suspect they're using one of the traditional reporting databases: their monitoring covers time periods, so there must be a database, and their offering is based around pre-built dashboards for various products, so there's no need for them to build their own reporting repository as a USP.