Tuesday, July 1, 2025

Getting Domino IQ to use LM Studio

I really like LM Studio – it's absolutely fantastic for running LLMs locally, its performance is top-notch, and it costs very little. If you're new to it, I highly recommend giving it a try and seeing all the cool things it can do. You won't be disappointed.

I'm also a huge fan of the Mac mini M4 Pro. It's an amazing machine for running LM Studio at the department level. For $2,200 USD, you get an incredible combination of power and performance.

So what about Domino IQ? It runs LLMs locally, so why not use it? Let's set some expectations.

If you want to run a large, complex model, you're going to need some serious horsepower. In my experience, most Domino servers don't have GPUs; they are normally underpowered for what they are asked to do.

Good news time - with EA3 of Domino 14.5, HCL lets you use external LLM servers - bingo! So, let's connect Domino IQ to LM Studio, which may or may not be running on a Mac mini M4 Pro :)
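On the LM Studio side, all you need is its built-in local server, which exposes an OpenAI-compatible API (port 1234 by default). As a quick sanity check from the Domino box – the host name here is just a placeholder for wherever LM Studio is running – something like this should return your loaded models as JSON:

    curl http://lmstudio.local:1234/v1/models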

Almost perfect, right? Well, there's one inconvenience you'll need to overcome – LM Studio doesn't natively support HTTPS, while Domino IQ requires it out of the box. It's a mismatch.

There are ways around this. You can put a reverse proxy, like nginx, in front of LM Studio. It does require some additional configuration and complexity, but it is certainly a viable option.
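For reference, a minimal nginx front end for LM Studio might look roughly like this – the host name, certificate paths, and LM Studio's default port of 1234 are all assumptions you'd adjust for your own setup:

    server {
        listen 443 ssl;
        server_name lmstudio.example.com;                    # placeholder host name

        ssl_certificate     /etc/nginx/certs/lmstudio.crt;   # your certificate
        ssl_certificate_key /etc/nginx/certs/lmstudio.key;   # and its private key

        location / {
            # pass everything through to LM Studio's local server
            proxy_pass http://127.0.0.1:1234;
            proxy_set_header Host $host;
        }
    }

Domino IQ then talks HTTPS to nginx, and nginx talks plain HTTP to LM Studio on the same machine.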

But what if you didn't even need to do that? Domino IQ allows you to just disable the TLS requirement. Set the Notes.ini parameter DOMIQ_DISABLE_EXTERNAL_TLS=1 and you're done.
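If you want to try it, the parameter can go straight into the server's Notes.ini, or you can set it from the server console:

    set config DOMIQ_DISABLE_EXTERNAL_TLS=1

Depending on your build, you may need to restart the Domino IQ task (or the server) for the change to take effect.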

Integrating with LM Studio is now super simple – no need to configure credstore, trust roots, DominoMicroCA, blah blah blah. Just point and shoot.

Obviously, disabling TLS means that your comms between Domino IQ and LM Studio won't be encrypted. This might not be a problem for some, but it's definitely something to consider. 

One last thing - once TLS is disabled, Domino IQ doesn't attempt to use TLS for any connections, even if specifically requested. It's all or nothing.

More soon.


2 comments:

Daniel Nashed said...

TLS is really a strong requirement for any data that should be protected when you are connecting outside the local machine. The parameter is really only intended for test environments or for localhost (127.0.0.1) connections.

Please don't turn off TLS. You can run a simple NGINX configuration.
The import of the trusted root should be very straightforward.
You could also run with an exportable MicroCA Cert and there is a simple button to trust CertMgr's own MicroCA root.

I think I will come up with an NGINX configuration that works in front of Ollama and other LLMs that don't provide TLS.

Disabling TLS would be perfectly OK for:

- A local Ollama or LM Studio instance
- Docker Model Runner

and similar local services.
Domino is known for its excellent security. You have all the options out of the box, and the team made it easy to use TLS.

Adam Osborne said...

Thanks for your response, Daniel.

I completely agree with your recommendations and concerns around TLS. Your suggestions for handling certificates are spot on (you're the guy that built so much of this stuff for Domino - thank you!) and definitely the best practice for most scenarios.

I just want to clarify for readers that the main purpose of my original post was to highlight that users do have choices when integrating Domino IQ with LM Studio.

However, with those choices comes responsibility – I sound like Spiderman!

Disabling TLS can simplify things, but it’s not the right solution for every situation. Appreciate your input!