This SAP BusinessObjects Data Services setting limits the number of al_engine processes that a single job can spawn. It is a per-job value, not a cap on the job server as a whole: if ten jobs run at the same time and the value is set to 8 (the default), there could be as many as 80 al_engine processes. Jobs that run many parallel Dataflows can benefit from changing this setting. A Workflow does not spawn an al_engine process, but a Dataflow does, and the job itself occupies one of the al_engine processes. The Workflow shown below executes eighteen child Workflows in parallel.
Each of the eighteen Workflows contains at least one Dataflow. If the Maximum Number of Engine Processes setting is at its default value of eight, only seven of the Dataflows will run at the same time: seven Dataflow al_engine processes plus one job al_engine process equals eight. Dataflows eight and beyond queue up, and as soon as one of the first seven completes, the next Dataflow in the queue runs.
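The arithmetic above can be sketched as a small illustrative model (this is plain Python, not Data Services code; the variable names are hypothetical):

```python
# Model of the per-job engine process cap described above.
MAX_ENGINE_PROCESSES = 8   # the Designer default
JOB_PROCESS = 1            # the job itself occupies one al_engine slot

dataflows = 18             # Dataflows launched in parallel by the Workflow
slots_for_dataflows = MAX_ENGINE_PROCESSES - JOB_PROCESS  # 7 remain

running = min(dataflows, slots_for_dataflows)  # Dataflows that start at once
queued = dataflows - running                   # Dataflows waiting for a slot

print(running, queued)  # → 7 11
```

As each of the seven running Dataflows finishes, one of the eleven queued Dataflows takes its slot.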
For this particular Workflow a value higher than the default would be beneficial. As long as the eighteen Dataflows do not swamp the database server or use large amounts of memory on the job server there shouldn't be a problem.
Some Dataflows execute queries that take a long time to return the first row of the result set. Configuring a larger number of al_engine processes would allow Dataflows with quick-responding result sets to begin, and perhaps finish, before the longer-running Dataflows have received the first row of theirs. However, it is unlikely that only one job is running. In the worst case, if every Dataflow within the eighteen Workflows uses the maximum amount of memory (about 4 GB per process on a 64-bit OS) and the Maximum Number of Engine Processes setting is 19, the Dataflows alone could attempt to use almost 80 GB of RAM. That would likely bring the job server operating system to its knees or cause the Dataflows to fail with memory errors.
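The worst-case figure works out as follows (a back-of-the-envelope calculation, counting all nineteen al_engine processes against the approximate 4 GB per-process ceiling the text cites):

```python
# Worst-case memory estimate for the scenario described above.
max_engine_processes = 19   # setting raised to cover 18 Dataflows + 1 job
gb_per_process = 4          # approximate 64-bit al_engine per-process ceiling

worst_case_gb = max_engine_processes * gb_per_process
print(worst_case_gb)  # → 76, i.e. "almost 80 GB" in round terms
```

And that is for a single job; several such jobs running concurrently would multiply the pressure on the job server's RAM.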
A reasonable compromise needs to be made to balance individual job execution against overall job server performance.
The Maximum Number of Engine Processes setting is dynamic: the next time a job executes, it picks up the new value, so the job server does not need to be restarted. The setting can be changed in Data Services Designer under Tools/Options/Job Server/Environment; Designer must have a valid connection to the job server. It can also be changed by editing the DSConfig.txt file. The line to edit is MAX_NO_OF_PROCESSES, found in the AL_Engine section.
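For reference, the relevant DSConfig.txt fragment would look roughly like the sketch below. This is based only on the section and key names mentioned above; the exact spelling of the section header, spacing, and surrounding keys on a given installation may differ, so check the file itself before editing.

```ini
[AL_Engine]
MAX_NO_OF_PROCESSES = 8
```

Raising the value here (for example to 20 for the eighteen-Workflow job discussed above) takes effect on the next job execution, with no job server restart required.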