Private LLM - Local AI Chat
Numen Technologies Limited
Recent Snapshots
| Date | Version | Rating | Ratings # |
|---|---|---|---|
| 2026-04-19 | 1.9.12 | 4.2 | 637 |
| 2026-04-18 | 1.9.12 | 4.2 | 635 |
| 2026-04-17 | 1.9.12 | 4.2 | 634 |
| 2026-04-17 | 1.9.12 | 4.2 | 633 |
Ranking History
| Date | Rank | Category | Feed | Country |
|---|---|---|---|---|
| 2026-04-20 00:00 UTC | 90 | overall | top-paid | us |
| 2026-04-19 18:00 UTC | 94 | overall | top-paid | us |
| 2026-04-18 18:00 UTC | 72 | overall | top-paid | us |
| 2026-04-18 12:00 UTC | 70 | overall | top-paid | us |
| 2026-04-18 06:00 UTC | 73 | overall | top-paid | us |
| 2026-04-18 00:00 UTC | 73 | overall | top-paid | us |
| 2026-04-17 18:00 UTC | 73 | overall | top-paid | us |
| 2026-04-17 12:00 UTC | 73 | overall | top-paid | us |
| 2026-04-17 06:00 UTC | 71 | overall | top-paid | us |
| 2026-04-17 02:46 UTC | 71 | overall | top-paid | us |
Reviews
The models in this app are all idiotic; their answers have nothing to do with the questions.
What is the point of a local model on an iPad if I can’t add local files to the session? Uploading to Private LLM servers goes against the whole point of a local LLM.
Excellent app! The only thing I’d like to see is to be able to paste in links to other ollama models, you can include the same disclaimer about memory limits and just add a “+ Custom Model” option under manage models, it would prevent you from having to update the app monthly to incorporate new models. It would really expand the usability of the app. Great work!
Running this on my MacBook Pro, the font is tiny and there’s no way to increase its size. App seems to function OK but it’s not terribly useful like this.
They take a long time to add new models
Good app, but it has a major flaw: chat history gets lost if the app crashes, which happens almost daily. Also the models’ token context is super small, around 8k tokens max, but even when way under the limit it still crashes and I lose all chat history. AND YES IT GETS HOT, and after about 15 seconds it bogs down in speed drastically. (The heat part is not the app’s fault, just hardware limitations.) If they fix this, the app will be worth the money.
Waste of money.
The AI used in this app is only up to date as of September 2021.
I figured this would be a little toy that I could use when I’m offline, but I’m honestly shocked by the performance and how consistently usable it is. I could be totally offline and get results as good as some of the lower tiers of the major AI providers. A++
I saw this application in one of my AI newsletters I receive in my email and thought I’d give it a try. Not impressed. Waste of money. If you have used ANY of the mainstream, up to date, Chat AI platforms, you’ll be disappointed in this application. You get constant repetitive responses, very little memory of conversations and data.
Needs to support offline datasets and RAG via the web or the Shortcuts app for multi-step generation. The select-text feature’s implementation is very bad.
None of the models can be downloaded.
I asked about the assurances in the privacy policy. The response to whether personal data is collected, “Yes, personal data may be collected in certain circumstances . . . where it is necessary to protect the interests of another individual.” It does collect your data (but may not retain it — but who knows). If it wasn’t collecting your personal identifying data, it couldn’t report you to the “authorities”.
I was never able to converse with AI as every time I went to submit a question, the app crashed my phone back to the lock screen. Downloading AI models from huggingface was clunky and slow.
It literally does nothing I want it to. I’ve asked detailed questions asking for step by step and still nothing correct
App crashes and wanders a lot and speaks in gibberish. Lost hours of conversation in a crash. Would not use this app. Very disappointing.
If this is real AI, then as of 12-12-25 it is unable to answer basic questions correctly or clearly. It follows almost no direction, and even when it seems correct it blurs the answers with so much CYA that the answers are of almost no value. Save your money.
How can you not support multiple conversations? Who works on one project at a time? It’s an absolute necessity. You need to explicitly note this before people spend money on this app. Also, be aware the LLMs are curated by the company that produces the app; you cannot sideload LLMs that aren’t listed. Additionally, there are models listed as supported on your website that are not supported in the app.
Cool app, but it’s incredibly slow and buggy for an app that cannot store any previous chats and that doesn’t allow most modern small LLMs. Also, the advertisement pictures of using custom LLMs to produce uncensored chatbots or celebrities are dangerous to youth and also tasteless.
The only thing that I thought I might like is their Shortcuts support. Today I found that it didn’t work as expected.