Llama 3 API

Hi, I am starting to see people using Llama 3 to generate articles. It seems to be better than any model out there at the moment. It generates far better content than GPT-4 and follows instructions better than other LLMs.

Would you look into implementing it?

It's accessible via meta-llama/Meta-Llama-3-70B-Instruct - Demo - DeepInfra

Inside SCM: (screenshot)
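Just to show how simple the hookup would be: DeepInfra exposes an OpenAI-compatible endpoint, so calling the model from Node.js looks roughly like this (a rough sketch only, the base URL and env var name are assumptions on my side, double-check DeepInfra's docs):

```ts
import OpenAI from "openai";

// DeepInfra's OpenAI-compatible endpoint (assumed base URL, see their docs).
// DEEPINFRA_API_KEY is just my own env var name.
const client = new OpenAI({
  baseURL: "https://api.deepinfra.com/v1/openai",
  apiKey: process.env.DEEPINFRA_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "meta-llama/Meta-Llama-3-70B-Instruct",
  messages: [
    { role: "system", content: "You write clear, factual articles." },
    { role: "user", content: "Write a 100-word intro about home solar panels." },
  ],
});

console.log(completion.choices[0].message.content);
```

Since it's the same chat-completions shape as OpenAI, the existing OpenAI integration should only need a different base URL and model name.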

If you have a Mac you can install the models on your computer and access them using SCM's built-in OpenAI API support.

I also see that it's available for Node.js, which means it should be possible for SCM to download and run it on your machine for free.
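To be concrete, the Node.js route would presumably go through something like node-llama-cpp with a GGUF build of the model. A rough sketch of what that could look like (the package API, paths and file names here are my assumptions, not something SCM does today):

```ts
import path from "path";
import { getLlama, LlamaChatSession } from "node-llama-cpp";

// Placeholder path/file name -- you'd download a GGUF build of Llama 3 yourself.
// The 70B model is the one with the ~64GB RAM requirement; 8B is much lighter.
const llama = await getLlama();
const model = await llama.loadModel({
  modelPath: path.join(process.cwd(), "models", "Meta-Llama-3-8B-Instruct.Q4_K_M.gguf"),
});

const context = await model.createContext();
const session = new LlamaChatSession({ contextSequence: context.getSequence() });

const answer = await session.prompt("Give me three blog post ideas about gardening.");
console.log(answer);
```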

However, there are some steep memory requirements, i.e. 64GB of RAM to run it locally.

It's already possible to use the OpenAI alternative setting together with LM Studio to run Llama on your computer.
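Rough idea of how that works: LM Studio's built-in local server speaks the OpenAI API, so any OpenAI-style client can be pointed at it. The port and model name below are just from my local setup, yours may differ:

```ts
import OpenAI from "openai";

// LM Studio's local server is OpenAI-compatible; 1234 is its default port.
// The API key is ignored by LM Studio, but the client library requires one.
const client = new OpenAI({
  baseURL: "http://localhost:1234/v1",
  apiKey: "lm-studio",
});

const completion = await client.chat.completions.create({
  model: "meta-llama-3-8b-instruct", // whatever model you have loaded in LM Studio
  messages: [{ role: "user", content: "Outline a short article about composting." }],
});

console.log(completion.choices[0].message.content);
```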

How to connect SCM to LM Studio (llama3 etc) to run offline AI models for FREE