yeah- Mistral and Llama are the ones you want to look at- Grok is too big to run even on enterprise cards (and performs worse than a model my Pi can run)
don’t let this turn into r/196
3ds my beloved :3
i think that's transmascs
-> i'm in this rule and I don't like it
why yes, I do enjoy being bottomless around the house
80% of r/thinkpad
(the other 20% are femboys, catgirls, enby or trans)
have you heard of ultrakill?
ok, fair; but do consider the context that the models are open weight. You can download them and use them for free.
There is a slight catch though, which I'm very annoyed at: it's not actually Apache. It's this weird license where you can use the model commercially up until you have 700M monthly users, at which point you have to request a custom license from Meta. OK, I kind of understand them not wanting companies like ByteDance or Google to use their models just like that, but Mistral releases their models as open weights under Apache-2.0, so the context should definitely be reconsidered, especially for Llama 3.
It's kind of a thing right now- publishers don't want models trained on their books, "because it breaks copyright," even though the model doesn't actually remember copyrighted passages from the book. Many arguments hinge on the publishers being mad that you can prompt the model to repeat a copyrighted passage, which it can do. IMO this is a bullshit reason.
anyway, will be an interesting two years as (hopefully) copyright will get turned inside out :)
ohno my copyright!!! How will the publisher megacorps now make a record quarter??? Think of the shareholders!
overseers??? omg SCP reference?!??
How to make money: Step 1. Make 50 billion different pins (based) Step 2. ??? Step 3. Profit
says who? This isn’t reddit.