Wondering about services to test on either a 16 GB RAM “AI capable” arm64 board or on a laptop with a modern RTX card. Only looking for open-source options, but curious to hear what people say. Cheers!

  • @couch1potato@lemmy.dbzer0.com · 4 days ago

    I spun up ollama and paperless-gpt to add an AI OCR sidecar to paperless-ngx. It’s okay. It can read handwritten stuff okayish, which is better than tesseract (which doesn’t read handwriting at all), so I throw handwritten stuff at it, but on typed text the difference is marginal, at least in the single day I spent testing 3 different models on a few typed receipts.
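
    For anyone who wants to poke at the same thing outside of paperless-gpt, here’s a minimal sketch of a vision-OCR call against a local ollama instance, roughly the sort of request paperless-gpt makes under the hood (the model name, prompt, and file path are just placeholder assumptions):

    ```python
    import base64
    import requests

    # Assumes a local ollama on the default port with a vision-capable model
    # already pulled (e.g. `ollama pull minicpm-v`).
    OLLAMA_URL = "http://localhost:11434/api/generate"

    def ocr_image(path: str, model: str = "minicpm-v") -> str:
        """Send one image to ollama and return the transcribed text."""
        with open(path, "rb") as f:
            image_b64 = base64.b64encode(f.read()).decode()
        resp = requests.post(OLLAMA_URL, json={
            "model": model,
            "prompt": "Transcribe all text in this image verbatim.",
            "images": [image_b64],  # ollama takes base64-encoded images
            "stream": False,
        }, timeout=300)
        resp.raise_for_status()
        return resp.json()["response"]

    if __name__ == "__main__":
        print(ocr_image("receipt.jpg"))
    ```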

      • @couch1potato@lemmy.dbzer0.com · 3 days ago

        I tried minicpm-v, granite3.2-vision, and mistral.

        Granite didn’t work with paperless-gpt at all. Mistral worked sometimes, but other times it just kept running and didn’t finish within a reasonable time (15 minutes for 2 pages). minicpm-v finishes every time, but I just looked at some of the results and it seems like it’s not even worth keeping it running either. I suppose the first result that gave me a good impression was a fluke.

        To be fair, I’m a noob at local AI, and I also don’t have a good GPU (GTX 1650), so these failures could all be self-induced. I like the idea of AI-powered OCR, so I’ll probably try again in the future…
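
        If anyone wants to reproduce that kind of side-by-side test outside of paperless-gpt, here’s a rough harness, assuming a local ollama with those models already pulled (the model names are the ones above; the image path and prompt are placeholders, and timings will obviously depend on the GPU):

        ```python
        import base64
        import time
        import requests

        # Candidate models to compare on the same receipt scan.
        MODELS = ["minicpm-v", "granite3.2-vision", "mistral"]
        OLLAMA_URL = "http://localhost:11434/api/generate"

        def transcribe(model: str, image_b64: str) -> str:
            """Ask one model to transcribe a base64-encoded image via ollama."""
            resp = requests.post(OLLAMA_URL, json={
                "model": model,
                "prompt": "Transcribe all text in this receipt verbatim.",
                "images": [image_b64],
                "stream": False,
            }, timeout=900)  # generous cap, since some models run very long
            resp.raise_for_status()
            return resp.json()["response"]

        if __name__ == "__main__":
            with open("receipt.jpg", "rb") as f:
                image_b64 = base64.b64encode(f.read()).decode()
            for model in MODELS:
                start = time.time()
                try:
                    text = transcribe(model, image_b64)
                    print(f"{model}: {time.time() - start:.1f}s, {len(text)} chars")
                except requests.RequestException as exc:
                    print(f"{model}: failed ({exc})")
        ```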

        • @kiol@lemmy.world (OP) · 22 hours ago

          I find your experiments inspired. Thank you! I’m learning about this myself on an RTX and I’m excited to discuss it on my little podcast, podcast.james.network, one of these days. I’ve been using paperless minus the AI functionality so far, and I’m about to start testing different AI services on an arm64 device with 16 GB RAM that claims some level of AI support; we’ll see how that goes. Let me know if there are any other specific services/models you’d recommend or are curious about.

          • @couch1potato@lemmy.dbzer0.com · 17 hours ago

            Sure, and let me know how it goes for you. I’m on a Dell R720xd, about to upgrade my RAM from 128 to 296 GB… I don’t want to spend the money on a new GPU right now.

            I’ll report back after I try again.