Hello, sorry if the question seems a little silly; I'm not too well versed in this. Is it possible to expose the Ollama instance on the studio to the web, or to connect to it through SSH, so that I can use the server from another project locally as a code assistant with the VS Code Continue extension?
Having the dev project and the LLM framework on the same studio seems a little clunky to me. Or do you think there is a better solution for this?
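To be concrete, what I have in mind is roughly this: open an SSH tunnel from my dev machine to the studio and then point Continue at the forwarded port. I'm not sure this is the right approach, so here is just a rough sketch of the idea, assuming Ollama's default port 11434 and a made-up host name "studio-box":

```python
# Idea: forward the Ollama port through SSH first, e.g. something like
#   ssh -N -L 11434:localhost:11434 me@studio-box
# and then, from the dev machine, check that the API is reachable
# through the tunnel before wiring it into Continue.

import json
import urllib.request

# Local end of the tunnel; requests here end up at Ollama on the studio.
OLLAMA_URL = "http://localhost:11434"

# /api/tags lists the models that the Ollama instance has pulled.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    data = json.load(resp)

print([m["name"] for m in data.get("models", [])])
```

If that check works, I assume Continue could then be pointed at http://localhost:11434 as its Ollama endpoint, but I'd appreciate confirmation that this is a sane setup.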