Is it possible to have nested batches (e.g. batch_set()/batch_process())?
Use case is: I scan a directory for JSON files, then for each file (first batch) I read in a bunch of arrays. For each array (second batch) I read in a bunch of keys and values, some of which are further arrays. For each of those sub-arrays (third batch) I continue the process (reading in more keys/values).
Each array in the JSON files will become a node, and each sub-array will be a paragraph item in that node.
Wondering how best to handle this with Batch API...
Comments
I suspect that not every level needs to be batched. Which set is the largest, the nodes? If so, make that the batch and handle the other levels as simple loops inside each batch operation.
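To illustrate, here's a minimal sketch of that idea for Backdrop/Drupal's Batch API: one batch operation per node, with the paragraphs handled by a plain loop inside the operation. All function names (`mymodule_*`) and the `paragraphs` key are illustrative, not from the thread, and this won't run outside a Drupal/Backdrop bootstrap.

```php
<?php
/**
 * Sketch: batch over the largest set (nodes), loop over the rest.
 */
function mymodule_import_batch(array $node_arrays) {
  $operations = array();
  foreach ($node_arrays as $node_array) {
    // One batch operation per future node.
    $operations[] = array('mymodule_import_node', array($node_array));
  }
  batch_set(array(
    'title' => t('Importing nodes'),
    'operations' => $operations,
    'finished' => 'mymodule_import_finished',
  ));
  // Only needed when not inside a form submit handler.
  batch_process();
}

/**
 * Batch operation callback: creates one node and its paragraphs.
 */
function mymodule_import_node(array $node_array, &$context) {
  // ... create the node from $node_array here ...
  // Paragraphs (and their sub-arrays) are just a loop, not a nested batch:
  // each operation already runs in its own HTTP request.
  foreach ($node_array['paragraphs'] as $paragraph_data) {
    // ... create the paragraph item and attach it to the node ...
  }
}
```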
I also suspect that nesting batches is either impossible, or so hard to debug as to not be worth it.
I guess nodes would be the largest set, but it's hard to tell since they're split across the files.
I suppose I could just read each file into a master array so that all the nodes are together, then batch that, and leave the paragraphs as a loop within each node...
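That flattening step could look something like this in plain PHP. The function name `collect_node_arrays` is hypothetical, and this assumes each JSON file decodes to a list of arrays (one per future node):

```php
<?php
/**
 * Hypothetical helper: read every JSON file in a directory and merge the
 * top-level arrays into one flat "master" array, so all future nodes can
 * be batched together regardless of which file they came from.
 */
function collect_node_arrays($directory) {
  $master = array();
  // glob() returns the matching paths sorted alphabetically.
  foreach (glob($directory . '/*.json') as $path) {
    $data = json_decode(file_get_contents($path), TRUE);
    if (is_array($data)) {
      foreach ($data as $item) {
        // Each top-level array becomes one batch item (one future node).
        $master[] = $item;
      }
    }
  }
  return $master;
}
```

The master array can then feed the batch's operations list, keeping the files' original order within each file.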
Have you tried Feeds? I'm not sure if this was ported: https://www.drupal.org/project/feeds_jsonpath_parser.
By the way, Batch keeps track of the time spent in each HTTP request, so if you make each operation small enough it won't time out.
Yep, that was the first thing I tried. I even ported that (and other) modules: https://github.com/backdrop-contrib/feeds_jsonpath_parser :-)
Unfortunately I had trouble with arrays, JSON files, and Paragraphs (data got added out of order and I couldn't find a solution for it), so I figured there'd be more flexibility in parsing the files manually (and in order), hence my current situation.