AI prompt threads

1. The problem

AI prompt threads is set to 10, but only 1 thread is actually running.

2. Screenshot or task log of the problem

Threading currently only applies to the prompts within 1 article.

How did you set up your tasks?

E.g.
10 keywords
1 article per keyword
1 prompt per article?

The working environment is set to generate 10 articles per keyword.

I have been using 10 threads for article writing without any issues until now. However, after the recent update, it is no longer working. Please check. The AI provider I am using is OpenAI.

Send me the task export. I need to look at the settings.

My Settings Screen

Based on the above settings, the previous working method was as follows:

  • 10 projects would run simultaneously.
  • If I set it to write 10 articles per keyword, each project would send its requests in parallel to generate the articles.

However, after the update, when writing with the same keyword, it no longer processes requests in parallel.
Even though it’s the same keyword, the system waits for one request to finish before starting the next one.

OK understood.

Let me make some adjustments.

Added 2 new settings to control the speed of AI article generation.

AI article threads = How many AI articles to start simultaneously.

AI prompt threads = How many prompts to start simultaneously.

How much you benefit from these depends on 2 other settings.

AI article count

The AI article count setting determines how useful AI article threads will be.

If your article count is 1, you will never need to multi-thread.

AI prompt count

The number of prompts in your AI article template will change how threading works.

If you only have one prompt, then no threading will be used.

If you are using a template with multiple prompts, then the AI prompt thread count can be used to send all those prompt requests simultaneously.
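
To make this concrete, here is a minimal sketch of prompt-level threading, assuming a hypothetical send_prompt() stand-in for the real OpenAI request; the names and values are illustrative only, not the tool's actual code.

    from concurrent.futures import ThreadPoolExecutor

    AI_PROMPT_THREADS = 5  # illustrative value for the "AI prompt threads" setting

    def send_prompt(prompt: str) -> str:
        # Stand-in for one request to the AI model (e.g. OpenAI); it just echoes here.
        return f"response for: {prompt}"

    def write_article(prompts: list[str]) -> list[str]:
        # Every prompt in the article template is submitted at once,
        # up to AI_PROMPT_THREADS requests in flight at a time.
        with ThreadPoolExecutor(max_workers=AI_PROMPT_THREADS) as pool:
            return list(pool.map(send_prompt, prompts))  # map() keeps template order

    # A template with 3 prompts: all 3 requests go out together.
    print(write_article(["intro prompt", "body prompt", "conclusion prompt"]))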

AI total thread count

The 2 settings interact with each other by multiplication.

E.g.
AI article threads = 5
AI prompt threads = 5
Max threads used = 5 x 5 = 25

So 25 requests will be made to your AI model at the same time.
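
As a rough illustration of how the two settings multiply, the pools can be pictured as nested; again this is only a sketch with made-up names (send_prompt, write_article), not the tool's internal code.

    from concurrent.futures import ThreadPoolExecutor

    AI_ARTICLE_THREADS = 5  # articles written at the same time
    AI_PROMPT_THREADS = 5   # prompts sent at the same time within each article

    def send_prompt(prompt: str) -> str:
        return f"response for: {prompt}"  # stand-in for the real API call

    def write_article(prompts: list[str]) -> list[str]:
        # Inner pool: one per running article, AI_PROMPT_THREADS wide.
        with ThreadPoolExecutor(max_workers=AI_PROMPT_THREADS) as pool:
            return list(pool.map(send_prompt, prompts))

    queued_articles = [["intro", "body", "faq", "conclusion", "meta"]] * 10

    # Outer pool: AI_ARTICLE_THREADS articles run at once, each firing up to
    # AI_PROMPT_THREADS requests, so at most 5 x 5 = 25 requests are in flight.
    with ThreadPoolExecutor(max_workers=AI_ARTICLE_THREADS) as outer:
        results = list(outer.map(write_article, queued_articles))
    print(len(results), "articles,", sum(len(r) for r in results), "prompt responses")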

This is good if your AI model supports multiple concurrent requests, but other models might start to rate-limit you.
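
If your provider does start rate-limiting, a common client-side workaround is to retry with exponential backoff. This is a generic sketch (the function name is made up, and how rate-limit errors surface depends on your provider; OpenAI returns HTTP 429), not a setting in the tool.

    import random
    import time

    def send_with_backoff(send, prompt, max_retries=5):
        # 'send' is any function that raises an exception when the model
        # rate-limits the request.
        for attempt in range(max_retries):
            try:
                return send(prompt)
            except Exception:
                # Wait 1s, 2s, 4s, ... plus jitter before trying again.
                time.sleep(2 ** attempt + random.random())
        raise RuntimeError("gave up after repeated rate-limit errors")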

Experiment with the 2 settings to find the best setup for you.