We are absolutely stoked to announce that we're finally bringing our Preemptive AI for Domino functionality to HCL Verse on-premises.
Why? Well, let’s be honest, with all that has happened in the world of AI, Verse is currently sitting on the sideline like a cheerleader with a bunch of busted pom-poms.
In this upcoming release, we’re packing in the "Big Four": Proofread, Enhance, Translate, and Ask. It’s all powered by Preemptive AI for Domino and plays nice with whatever you’re running—Domino IQ, Ollama, LM Studio, OpenAI, local or remote. You name it, it’ll probably work.
So how does it work?
We did try to go down the official Verse API route, but yeah, nah, no can do. It was just too limiting for what we wanted to achieve. So we took the "scenic route" and built it via browser extensions for Chrome, Edge, and Firefox. Sorry, Safari users: not for now (send money and it can happen!).
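To give you a feel for the shape of it (this is a minimal sketch, not the shipping extension code), here's roughly how a browser extension can grab the text you've selected in Verse and hand it to an OpenAI-compatible backend such as Ollama, LM Studio or OpenAI for a Proofread pass. The endpoint URL, model name and prompt below are placeholders, not our actual configuration.

```typescript
// Minimal sketch only - not the real extension internals.
// Assumes an OpenAI-compatible /v1/chat/completions endpoint
// (Ollama, LM Studio and OpenAI all expose one); the URL, model
// name and prompt are placeholders.
const API_URL = "http://localhost:11434/v1/chat/completions"; // e.g. Ollama's default port
const MODEL = "llama3.1"; // whatever model you have loaded

async function proofread(selectedText: string): Promise<string> {
  const response = await fetch(API_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: MODEL,
      messages: [
        { role: "system", content: "Proofread the following text. Return only the corrected text." },
        { role: "user", content: selectedText },
      ],
    }),
  });
  const data = await response.json();
  // OpenAI-compatible servers return the result in choices[0].message.content
  return data.choices[0].message.content;
}

// In a content script you'd feed it whatever is selected in the Verse compose pane:
const selection = window.getSelection()?.toString() ?? "";
if (selection) {
  proofread(selection).then((fixed) => console.log(fixed));
}
```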
The Firefox version is already done and dusted, and now we're just cooling our heels waiting for Google to do its thing and send back our signed package... fingers crossed it lands any day now. While we wait for the bigwigs to sign off, we're busy stress-testing the life out of it.
Here is a video of it in action --> https://pecdm3.preemptive.com.au/videos/verse-demo-1.mp4
While we complete final testing, we'd love for a few legends to give it a burl and provide some feedback before we ship it for real.
We have set up a sandboxed environment that will do all the processing: you use your Verse client, and our server does the AI work for you. The AI server component for this demo is a Mac Mini M4 Pro running LM Studio, with the OpenAI GPT-OSS 20B model (MLX) loaded.
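If you'd rather poke at a similar setup on your own gear, LM Studio's built-in server speaks the same OpenAI-compatible API on http://localhost:1234/v1 by default. A tiny sketch, assuming the model identifier for the GPT-OSS 20B MLX build is "openai/gpt-oss-20b" (use whatever ID LM Studio actually shows for the model you loaded):

```typescript
// Sketch only: the same OpenAI-compatible call, pointed at LM Studio's local server.
// LM Studio serves http://localhost:1234/v1 by default; the model ID below is an
// assumption - use the ID LM Studio shows for the GPT-OSS 20B (MLX) build you loaded.
async function pingLmStudio(): Promise<void> {
  const res = await fetch("http://localhost:1234/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "openai/gpt-oss-20b",
      messages: [{ role: "user", content: "Say hello from Verse." }],
    }),
  });
  const data = await res.json();
  console.log(data.choices[0].message.content);
}

pingLmStudio();
```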
There is no easier way to try it out.
If you're keen to have a go, create a ticket at https://preemptive.freshdesk.com, let us know what browser you're using, and we'll sort out the details.
When it's all ready, we'll make an official announcement and open it up to everyone.
Until then, stay frosty.
