<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	
	>
<channel>
	<title>
	Comments on: How to Enable AI Document Editing on Ubuntu with ONLYOFFICE and Ollama	</title>
	<atom:link href="https://www.tecmint.com/ubuntu-ai-document-editing-onlyoffice-ollama/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.tecmint.com/ubuntu-ai-document-editing-onlyoffice-ollama/</link>
	<description>Tecmint - Linux Howtos, Tutorials, Guides, News, Tips and Tricks.</description>
	<lastBuildDate>Tue, 27 Jan 2026 06:05:51 +0000</lastBuildDate>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	
	<item>
		<title>
		By: Ravi Saive		</title>
		<link>https://www.tecmint.com/ubuntu-ai-document-editing-onlyoffice-ollama/comment-page-1/#comment-2383996</link>

		<dc:creator><![CDATA[Ravi Saive]]></dc:creator>
		<pubDate>Tue, 27 Jan 2026 06:05:51 +0000</pubDate>
		<guid isPermaLink="false">https://www.tecmint.com/?p=61468#comment-2383996</guid>

					<description><![CDATA[In reply to &lt;a href=&quot;https://www.tecmint.com/ubuntu-ai-document-editing-onlyoffice-ollama/comment-page-1/#comment-2382894&quot;&gt;Norm Green&lt;/a&gt;.

@Norm,

That error usually means &lt;strong&gt;ONLYOFFICE&lt;/strong&gt; can’t communicate with &lt;strong&gt;Ollama&lt;/strong&gt;, not that the model is broken.

Since the &lt;strong&gt;LLM&lt;/strong&gt; works in the terminal, check the &lt;code&gt;API URL&lt;/code&gt; in your &lt;strong&gt;ONLYOFFICE AI&lt;/strong&gt; settings and make sure it points to &lt;code&gt;http://SERVER-IP:11434&lt;/code&gt; (not HTTPS, and not &lt;code&gt;127.0.0.1&lt;/code&gt; if Docker is used).

Next, test the Ollama API:
&lt;pre&gt;
curl http://localhost:11434/api/tags
&lt;/pre&gt;
If this fails, the connection might be blocked by the firewall or SELinux:
&lt;pre&gt;
sudo firewall-cmd --add-port=11434/tcp --permanent
sudo firewall-cmd --reload
sudo setenforce 0
&lt;/pre&gt;
If it works after running these commands, SELinux was blocking the connection.]]></description>
			<content:encoded><![CDATA[<p>In reply to <a target="_blank" href="https://www.tecmint.com/ubuntu-ai-document-editing-onlyoffice-ollama/comment-page-1/#comment-2382894">Norm Green</a>.</p>
<p>@Norm,</p>
<p>That error usually means <strong>ONLYOFFICE</strong> can’t communicate with <strong>Ollama</strong>, not that the model is broken.</p>
<p>Since the <strong>LLM</strong> works in the terminal, check the <code>API URL</code> in your <strong>ONLYOFFICE AI</strong> settings and make sure it points to <code>http://SERVER-IP:11434</code> (not HTTPS, and not <code>127.0.0.1</code> if Docker is used).</p>
<p>Next, test the Ollama API:</p>
<pre>
curl http://localhost:11434/api/tags
</pre>
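<p>A healthy Ollama instance replies with a JSON object listing the installed models (the exact model names will depend on what you have pulled), something like:</p>
<pre>
{"models":[{"name":"llama3:latest", ...}]}
</pre>
<p>If you get a connection refused or timeout instead, the problem is network-level rather than the model itself.</p>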
<p>If this fails, the connection might be blocked by the firewall or SELinux:</p>
<pre>
sudo firewall-cmd --add-port=11434/tcp --permanent
sudo firewall-cmd --reload
sudo setenforce 0
</pre>
<p>If it works after running these commands, SELinux was blocking the connection.</p>
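<p>Note that <code>setenforce 0</code> only disables SELinux until the next reboot, so it is best used as a test. Assuming the denial comes from SELinux blocking outbound network connections from the web server process hosting ONLYOFFICE, a more durable fix is to re-enable enforcing mode and set the relevant boolean instead:</p>
<pre>
# Re-enable SELinux enforcing mode after testing.
sudo setenforce 1
# Persistently allow web server processes to make outbound network connections.
sudo setsebool -P httpd_can_network_connect 1
</pre>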
]]></content:encoded>
		
			</item>
		<item>
		<title>
		By: Norm Green		</title>
		<link>https://www.tecmint.com/ubuntu-ai-document-editing-onlyoffice-ollama/comment-page-1/#comment-2382894</link>

		<dc:creator><![CDATA[Norm Green]]></dc:creator>
		<pubDate>Fri, 23 Jan 2026 18:24:08 +0000</pubDate>
		<guid isPermaLink="false">https://www.tecmint.com/?p=61468#comment-2382894</guid>

					<description><![CDATA[Hi Ravi,

I used your article to do the same on Fedora 42 using a local &lt;strong&gt;LLM&lt;/strong&gt; and &lt;strong&gt;Ollama&lt;/strong&gt;, but when I query the chatbot I get &quot;&lt;strong&gt;Something went wrong, please try reloading the conversation&lt;/strong&gt;&quot;.

The &lt;strong&gt;LLM&lt;/strong&gt; is running fine on the command line; do you have any suggestions?]]></description>
			<content:encoded><![CDATA[<p>Hi Ravi,</p>
<p>I used your article to do the same on Fedora 42 using a local <strong>LLM</strong> and <strong>Ollama</strong>, but when I query the chatbot I get &#8220;<strong>Something went wrong, please try reloading the conversation</strong>&#8221;.</p>
<p>The <strong>LLM</strong> is running fine on the command line; do you have any suggestions?</p>
]]></content:encoded>
		
			</item>
	</channel>
</rss>
