Error: Request too large for model `llama3-70b-8192`

This is a limitation of the Groq-hosted models: the input token limit (the model's context window) is too small to let you pass in 5 full articles' worth of context.
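The `llama3-70b-8192` model has an 8,192-token window, which full article bodies fill up quickly. A minimal sketch of a pre-flight size check, assuming roughly 4 characters per token (a rough English-text heuristic, not Groq's actual tokenizer):

```python
MODEL_CONTEXT_TOKENS = 8192  # context window of llama3-70b-8192


def estimate_tokens(text: str) -> int:
    """Crude heuristic: ~4 characters per token of English text."""
    return len(text) // 4


# Placeholder bodies standing in for the 5 retrieved articles.
articles = ["<full text of article %d>" % i for i in range(1, 6)]
prompt = "\n\n".join(articles)

if estimate_tokens(prompt) > MODEL_CONTEXT_TOKENS:
    print("Prompt will likely be rejected with 'Request too large'")
```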

Solution 1

Use the headings of the articles instead of their full text.
Load only the headings of the top 5 RAG articles.
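A minimal sketch of what this could look like, assuming each RAG hit is a dict with hypothetical `heading` and `body` fields:

```python
# Hypothetical shape: each RAG hit carries a short "heading" and a long "body".
top_hits = [
    {"heading": "Resetting your password", "body": "<long article text>"},
    {"heading": "Two-factor authentication", "body": "<long article text>"},
    {"heading": "Account recovery options", "body": "<long article text>"},
    {"heading": "Changing your email address", "body": "<long article text>"},
    {"heading": "Deleting your account", "body": "<long article text>"},
]

# Build the prompt context from headings only, skipping the full bodies.
context = "\n".join(hit["heading"] for hit in top_hits[:5])
print(context)
```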

Solution 2

Use just the top 1 article, not the top 5.
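A sketch of the request with only the single best hit, using the `groq` Python SDK; the article text and user question are placeholders:

```python
from groq import Groq  # pip install groq

client = Groq()  # reads GROQ_API_KEY from the environment

# Keep only the single best RAG hit instead of all five.
top_article = "<full text of the best-matching article>"

response = client.chat.completions.create(
    model="llama3-70b-8192",
    messages=[
        {"role": "system", "content": f"Answer using this article:\n\n{top_article}"},
        {"role": "user", "content": "<the user's question>"},
    ],
)
print(response.choices[0].message.content)
```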

Solution 3

Switch to a model with a larger context window.
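This only requires changing the `model` parameter in the same call. The model ID below is an example, not a guarantee: Groq's available models and their context sizes change over time, so check the current model list for one with a larger window.

```python
from groq import Groq

client = Groq()

response = client.chat.completions.create(
    # Example ID only: verify against Groq's current model list.
    model="llama-3.1-70b-versatile",
    messages=[{"role": "user", "content": "<prompt with all 5 articles>"}],
)
print(response.choices[0].message.content)
```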