Monday, February 23, 2026

Ready to try AI for HCL Verse? Our Chrome Extension is Live!

I previously blogged about our mission to bring integrated AI to HCL Verse via browser extensions. Today, I am thrilled to announce the official approval and availability of our Chrome extension.

This extension will be a core component of the upcoming Preemptive AI for Domino 3.0 release later this week. To celebrate the launch, we are inviting you to trial it free of charge to experience firsthand how it transforms your Verse workflow.

The Demo Environment

To make testing as seamless as possible, we have established a dedicated sandbox environment. You can continue using your standard Verse client while our backend manages the AI requests.

For this demo, our AI Server is powered by a Mac Mini M4 Pro running LM Studio, using OpenAI's GPT-OSS 20B model (MLX build). This setup ensures high-performance processing right from our lab to your browser.

How to Get Started

Setting up the trial takes only a few minutes:

- Download & Configure: Follow these detailed instructions to install the extension for Google Chrome.

- Connect to the Service: When prompted for the Remote Service URL, please enter:

https://skynet.preemptive.com.au/ai/ai-proxy.nsf/ProcessWebRequest?OpenAgent

Important Privacy Note

Since this is a demo/test service, please keep the following in mind:

  • Data Privacy: Any data sent to the service is processed on our server. Please avoid sending sensitive, confidential, or personal information.
  • Logging: Requests are logged temporarily to help us track performance and troubleshoot issues; all logs are purged every 24–48 hours.

Feedback & Support

We are eager to hear about your experience! If you encounter any bugs or have suggestions for improvement, please create a support ticket at preemptive.freshdesk.com so our team can track and resolve them.

Let us know how it goes!

Tuesday, February 17, 2026

Want a stunningly good LLM service for only USD $10 a month?

ChatLLM by Abacus.AI


It's unreal.



If you use NGINX, you might want to take a look at Caddy

If you need a reverse proxy that is simple to set up and 'just works', check out Caddy.
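To give you an idea of how simple it is: a basic HTTPS reverse proxy in Caddy can be a two-line Caddyfile (the hostname and backend port below are placeholders; Caddy obtains and renews the TLS certificate for you automatically):

```
example.com {
    reverse_proxy localhost:8080
}
```

Compare that with the equivalent NGINX server block plus certbot setup and you'll see why we like it.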


I think you'll like it. 


Behind the scenes of an AI request

Yesterday, we posted a video demonstrating our new integration with Verse.

https://pecdm3.preemptive.com.au/videos/verse-demo-1.mp4

In the video, we demonstrate the process of proofreading, translating, and asking questions.

When Preemptive AI for Domino is 'asked to do something', it performs the following steps:

- Creates an AI-Request document

- The AI-Proxy database then processes the request. It, in turn, performs the following steps:

  - Looks up the instruction type for the request, for example, AI-Proofread. This contains the prompt that is sent to the LLM; for example, our default prompt for proofreading is:



  - Looks up details for the service the instruction should be sent to, e.g., Domino IQ, LM Studio, OpenAI, etc.

  - Sends the request

- The system processes the response and sends it back to the request document.
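The steps above can be sketched in a few lines of Python. Everything here is illustrative only: the dictionary contents, function names, and the stubbed `send` callable are our own placeholders (the real lookups happen in Domino databases, not Python), but the shape of the flow is the same: instruction lookup, service lookup, send, then return the reply to the request document.

```python
# Hypothetical sketch of the AI-Proxy flow; names and values are placeholders.

INSTRUCTIONS = {
    # Instruction type -> prompt sent to the LLM (illustrative text only)
    "AI-Proofread": "Proofread the following text and correct any errors.",
    "AI-Translate": "Translate the following text into the requested language.",
}

SERVICES = {
    # Service name -> endpoint details the proxy looks up
    "LM Studio": {"url": "http://localhost:1234/v1/chat/completions",
                  "model": "openai/gpt-oss-20b"},
}

def process_ai_request(instruction_type, service_name, user_text, send=None):
    """Mimic the proxy steps: look up the instruction's prompt, look up the
    target service, send the request, and return the LLM's reply."""
    prompt = INSTRUCTIONS[instruction_type]      # step: instruction lookup
    service = SERVICES[service_name]             # step: service lookup
    payload = {
        "model": service["model"],
        "messages": [{"role": "system", "content": prompt},
                     {"role": "user", "content": user_text}],
    }
    # step: send the request (stubbed so this sketch runs offline)
    send = send or (lambda url, body:
                    {"choices": [{"message": {"content": "(LLM reply)"}}]})
    response = send(service["url"], payload)
    # step: extract the reply and hand it back to the request document
    return response["choices"][0]["message"]["content"]

print(process_ai_request("AI-Proofread", "LM Studio", "Helo world"))
```

In the real system the stubbed `send` is an HTTP POST to whichever backend the service lookup returned, which is why swapping between Domino IQ, LM Studio, Ollama, or OpenAI is just a configuration change.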

Here is a full example of the Demo’s request for proofreading:



Monday, February 16, 2026

This is a ripper - We’re bringing AI to HCL Verse!

We are absolutely stoked to announce that we're finally bringing our Preemptive AI for Domino functionality to HCL Verse on-premises.



Why? Well, let’s be honest, with all that has happened in the world of AI, Verse is currently sitting on the sideline like a cheerleader with a bunch of busted pom-poms.

In this upcoming release, we’re packing in the "Big Four": Proofread, Enhance, Translate, and Ask. It’s all powered by Preemptive AI for Domino and plays nice with whatever you’re running—Domino IQ, Ollama, LM Studio, OpenAI, local or remote. You name it, it’ll probably work.

So how does it work?

We did try to go down the official Verse API route, but yeah, nah, no can do. It was just too limiting for what we wanted to achieve. So, we took the "scenic route" and built it via browser extensions for Chrome, Edge, and Firefox. Sorry to the Safari users out there, not for now (send money and it can happen!). 

The Firefox version is already done and dusted, and now we’re just cooling our heels waiting for Google to do its thing and send back our signed package... fingers crossed it lands any day now. While we’re waiting for the bigwigs to sign off, we’re busy stress-testing the life out of it.

Here is a video of it in action --> https://pecdm3.preemptive.com.au/videos/verse-demo-1.mp4

While we complete final testing, we’d love for a few legends to give it a burl and provide some feedback before we ship it for real.

We have set up a sandboxed environment that will do all the processing; you use your Verse client, and our server will do the AI work for you. The AI Server component for this Demo is a Mac Mini M4 Pro, running LM Studio, with the OpenAI GPT-OSS 20 billion model (mlx) loaded.

There is no easier way to try it out. 

If you’re keen to have a go, create a ticket at https://preemptive.freshdesk.com, let us know what browser you are using, and we’ll sort out the details.

When it is all ready, we will make an official announcement and make it available to everyone.

Until then, stay frosty.

Sunday, January 25, 2026

Summer holidays done and dusted

It was a pretty hot start to the summer holidays of 2025, hitting 43 °C (110 °F), so camping wasn’t ideal.



But once we got everything set up, it was all good. We had a couple of wet days, but we still managed to hit the beach or play pickleball almost every day. 




The locals were friendly.  




and there was plenty of time to relax and think, well, about not much at all… a fantastic reset.






Already looking forward to the next time we get away. 




Friday, December 19, 2025

Season's Greetings from Downunder!


Wishing you and your loved ones a Happy and Safe Christmas and a wonderful New Year!!! 

Have a good one!


Image by Richard Galbraith https://www.dustydog.com.au