Apple debuts OpenELM, a family of language models with 270M, 450M, 1.1B, and 3B parameters, designed to run on-device, pre-trained and fine-tuned on public data (Shubham Sharma/VentureBeat)
Shubham Sharma / VentureBeat:
Just as Google, Samsung and Microsoft continue to push their efforts with generative AI on PCs and mobile devices …
from Techmeme https://ift.tt/Ll3Ow47