
Adventures in AI

Recent Scenarios

Those who were at Engage will have seen some of the experimentation I've been doing with AI in the context of VoltScript. In the OGS, Jason demoed how I used it to provide code for a loop, correcting it with information about APIs specific to VoltScript. Before my VoltScript session, I showed two videos demonstrating how I've used AI to add value to the VoltScript coding experience, first by checking unit tests for code coverage and then by checking the code complexity of functions in VoltScript Collections. These are code quality features that are provided out-of-the-box for more mainstream languages like JavaScript and Java, features that have long been on my wishlist for VoltScript but will not be available in the near future. GitHub Copilot filled that gap, providing the required information as a stop-gap.

But what was probably not as apparent was another use case where I leveraged GitHub Copilot. In the session with Stephan Wissel, we showed how the fetch JavaScript API can be used in more modern browsers in place of libraries like Axios. I've used both fetch and Axios a fair amount, often within Node.js. But running a script doesn't necessarily make for the best demo, and while a simple JavaScript web app is hardly rocket science, it's not something I've been used to building regularly. So I leveraged GitHub Copilot to create that boilerplate and make a better demo, while manually writing the code that was the main focus of the demo.
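To give a flavour of the fetch side of that demo, here is a minimal sketch of the kind of call involved, written against a hypothetical JSON endpoint; the URL and the response shape are placeholders for illustration, not the actual demo code:

```typescript
// Minimal sketch of an Axios-style GET request done with the browser fetch API.
// The endpoint and the Todo shape are illustrative placeholders only.
interface Todo {
  id: number;
  title: string;
  completed: boolean;
}

async function loadTodo(id: number): Promise<Todo> {
  const response = await fetch(`https://example.com/api/todos/${id}`);
  // Unlike Axios, fetch does not reject the promise on HTTP error statuses,
  // so the status has to be checked explicitly.
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  // Axios parses JSON automatically; with fetch it is an explicit step.
  return response.json() as Promise<Todo>;
}
```

The practical differences are exactly the ones in the comments: fetch only rejects on network failures and leaves JSON parsing to you, which is most of what you need to keep in mind when swapping Axios for fetch in a modern browser.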

Another scenario where I used GitHub Copilot in recent weeks was troubleshooting Java code. The specific code is not important; the use case is. Some rather complex code spread across multiple Java classes was not producing the required output. Traditional debugging narrowed the problem down to a specific chunk of code, and looking at that code, a potential hypothesis sprang to mind. Typically, at this point I would resort to Google and search for blog posts. In this case, however, I asked GitHub Copilot a targeted question about what the default behaviour of the code would be. This not only quickly verified that my hypothesis was correct; GitHub Copilot also added value by recommending potential solutions.

Lessons Learned

Those who have attended my sessions and looked at the session materials will know that I'm not interested just in surface details, but in what's under the surface. The lessons I took from my experience with GitHub Copilot also went below the surface. When I presented what I'd learned internally, I drew upon the two pictures below:

Lateran Obelisk
Unfinished Obelisk

Those who were in my session at Engage will recognise the second image. It is the unfinished obelisk, quarried in Aswan in Upper Egypt during the reign of the pharaoh Hatshepsut. It would probably have stood alongside the first obelisk, the Lateran Obelisk, which was in Karnak before being moved to Rome.

So what do obelisks shaped over 3500 years ago have to do with AI?

Well, if you just use AI to generate code that you copy and paste, then move on to the next job, what you get is the Lateran Obelisk: something that looks pretty and makes you look impressive. But all you've done is produce something that looks good while adding little real value.

The unfinished obelisk is not so pretty, but because of where it lies, we know obelisks were quarried and shaped where the stone was originally found. We can also see the tool marks, which tell us a lot about how they were shaped. It gives us key insight into how the obelisks were made and, if we wanted to create an obelisk as the ancient Egyptians did, it contributes the knowledge needed to achieve that. Using AI in the same way not only achieves an outcome but can help a developer understand how to achieve the required outcome.

The unfinished obelisk was also a picture I used in my session with Stephan at Engage. One of the things I've come to realise is that there are different kinds of IT professionals: those who just want to complete a task and those who want to learn. The former are likely to leverage AI to "do their job for them". That is the attitude of a junior programmer who will only ever be a junior programmer, and these are the kinds of employees that AI could simply replace. But the journey I have been on has been driven by understanding how and why things work the way they do. That is what's required to progress from a junior programmer to a senior developer. It's not just about generating code; it's about architecting solutions and being able to adapt. And this kind of person is more likely to add value, will know when to use AI, and will be able to combine traditional and novel techniques to achieve a better outcome.

It's an interesting time with AI, and one which will create transformations in business, just as the previous "artificial intelligence" innovation, the chatbot, has. The key to being able to adapt is the same as it always has been: understanding.