I’m on GNU/Linux and I have a second graphics card, an Nvidia one, that isn’t plugged into anything other than the motherboard (no monitor attached). Can I still use it for CUDA-based AI stuff?
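
For example, would a quick check like this be enough to confirm the card is usable? (A minimal sketch, assuming the proprietary Nvidia driver and a CUDA-enabled build of PyTorch are already installed.)

```python
import torch  # assumes a CUDA-enabled PyTorch build is installed

# CUDA doesn't need a monitor attached to the card, only a working Nvidia driver.
if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"cuda:{i} -> {torch.cuda.get_device_name(i)}")
else:
    print("No CUDA device visible - check the Nvidia driver installation")
```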

  • PeterPoopshit@lemmy.world · edited 7 months ago

    Actually, I got an Nvidia card working with Easy Diffusion on Debian. The real barrier to getting a text-chat AI running with GPU acceleration is that I don’t have the patience to deal with all that Python venv nonsense, so I use llama.cpp instead. It’s written in C++, which means no Python dependencies to fuck you with, at the cost of slower CPU-only generation.
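
    That said, if you can stomach one Python package, the llama-cpp-python bindings wrap the same library, and pushing the work onto the Nvidia card is a single argument. Rough sketch only; the model path and prompt are placeholders, and the bindings have to be installed with CUDA support compiled in for the offload to do anything.

```python
# Rough sketch using the llama-cpp-python bindings; paths and prompt are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-7b-chat.gguf",  # any GGUF model file you have
    n_gpu_layers=-1,  # offload all layers to the GPU (0 = CPU only)
)

out = llm("Q: What is llama.cpp? A:", max_tokens=64)
print(out["choices"][0]["text"])
```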

    Easy Diffusion just happens to be simple enough that I could actually figure out how to get it working (it’s in Python and needs a virtual environment), but it’s a different story for the text AIs.
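
    For reference, the venv dance those Python projects expect looks roughly like this. A minimal sketch only: the environment directory and requirements.txt are placeholders for whatever the project actually ships.

```python
# Minimal sketch: create a virtual environment and install a project's
# dependencies into it. "easy-diffusion-env" and requirements.txt are placeholders.
import subprocess
import venv
from pathlib import Path

env_dir = Path("easy-diffusion-env")
venv.EnvBuilder(with_pip=True).create(env_dir)

# Call the venv's own interpreter so packages land inside the environment,
# not in the distro-managed system Python.
env_python = env_dir / "bin" / "python"
subprocess.check_call([str(env_python), "-m", "pip", "install", "--upgrade", "pip"])
subprocess.check_call([str(env_python), "-m", "pip", "install", "-r", "requirements.txt"])
```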

    If you have the patience and knowledge to deal with all the Python issues, and/or a distro that makes it easy (different distros handle pip differently), I don’t doubt you could get Nvidia GPU acceleration working on some text-chat AI.