xml_split - cut a big XML file into smaller chunks
xml_split takes a (presumably big) XML file and splits it into several smaller files. It can split at a given level in the tree (the default, which splits the children of the root), or on a condition, using the subset of XPath understood by XML::Twig, so "section" or "/doc/section".
In the main document, each generated file is replaced by a processing instruction that allows "xml_merge" to rebuild the original document. The processing instruction format is "<?merge subdocs = 1 :<filename>?>"
File names are <file>-<nb>.xml, with <file>-00.xml holding the main document.
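The splitting scheme described above can be sketched in a few lines of Python. This is a toy model, not xml_split itself: the "out" base name follows the default naming rule, and the merge processing instruction is approximated from the format described above.

```python
# Toy sketch of a level-1 split: each child of the root goes to its own
# numbered file, and the main document keeps a merge-style processing
# instruction per chunk (format approximated from the description above).
import xml.etree.ElementTree as ET

doc = "<doc><section>one</section><section>two</section></doc>"
root = ET.fromstring(doc)

files = {}  # filename -> content, standing in for the filesystem
main_parts = ["<doc>"]
for nb, child in enumerate(list(root), start=1):
    name = f"out-{nb:02d}.xml"           # <file>-<nb>.xml
    files[name] = ET.tostring(child, encoding="unicode")
    main_parts.append(f"<?merge subdocs = 1 :{name}?>")
main_parts.append("</doc>")
files["out-00.xml"] = "".join(main_parts)  # main document

print(sorted(files))  # ['out-00.xml', 'out-01.xml', 'out-02.xml']
```

Rebuilding the original is then just a matter of replacing each merge instruction in "out-00.xml" with the content of the named file, which is what "xml_merge" does.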
-l <level>
    level to cut at: 1 generates a file for each child of the root, 2 one for each grandchild, and so on.
    defaults to 1
-c <condition>
    generates a file for each element that passes the condition: "xml_split -c section" will put each "section" element in its own file (nested sections are handled too).
    Note that at the moment this option is a lot slower than using "-l"
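The behaviour of splitting on a simple element-name condition can be modelled with the standard library. This is a rough sketch, not what xml_split does internally (in particular, real xml_split also replaces each nested match inside its parent with a merge instruction, which this toy version skips):

```python
# Rough model of "-c section": emit one chunk per matching element,
# including nested matches, by watching "end" events from iterparse.
import io
import xml.etree.ElementTree as ET

doc = "<doc><section>a<section>b</section></section><section>c</section></doc>"
chunks = []
for event, elem in ET.iterparse(io.StringIO(doc), events=("end",)):
    if elem.tag == "section":
        chunks.append(ET.tostring(elem, encoding="unicode"))

print(len(chunks))  # 3 -- the nested section is caught too
```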
-b <name>
    base name for the output; files will be named <name>-<nb><ext>, where <nb> is a sequence number (see "--nb_digits" below) and <ext> is an extension (see "--extension" below).
    defaults to the original file name (if available) or "out" (if input comes from the standard input)
-n <nb> (--nb_digits)
    number of digits in the sequence number for each file. If more digits than <nb> are needed, they are used: if "--nb_digits 2" is used and 112 files are generated, they will be named "<file>-01.xml" through "<file>-112.xml".
    defaults to 2
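The padding rule is the usual zero-padding that widens when the sequence number overflows the requested width. A minimal model in Python (chunk_name is a hypothetical helper, shown only to illustrate the naming rule, not part of xml_split):

```python
# Sequence numbers are zero-padded to the requested number of digits,
# but simply grow wider when more digits are needed.
def chunk_name(base, nb, digits=2, ext=".xml"):
    return f"{base}-{nb:0{digits}d}{ext}"

print(chunk_name("file", 1))    # file-01.xml
print(chunk_name("file", 112))  # file-112.xml (padding width exceeded)
```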
-e <ext> (--extension)
    extension to use for generated files.
    defaults to the original file extension or ".xml"
-v (--verbose)
    verbose output.
    Note that this option can slow down processing considerably (by an order of magnitude) when generating lots of small documents
    xml_split foo.xml             # split at level 1
    xml_split -l 2 foo.xml        # split at level 2
    xml_split -c section foo.xml  # a file is generated for each section element
                                  # nested sections are split properly
It would be a good idea to first check that the whole document is not, in fact, loaded in memory!
Using entities, which would seem the natural way to do it, doesn't work: they make it impossible for both the main document and the sub-documents to be well-formed when the sub-documents themselves include sub-sub-documents (you can't have entity declarations in an entity).
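The limitation can be seen with any XML parser. A small Python illustration (using xml.etree; the sample documents are contrived): an internal entity can carry element markup, but as soon as its replacement text would need its own DOCTYPE — which a sub-document containing further entity-based inclusions would — the expansion is no longer well-formed.

```python
# An entity may expand to markup...
import xml.etree.ElementTree as ET

ok = '<!DOCTYPE doc [<!ENTITY s "<section/>">]><doc>&s;</doc>'
print(ET.fromstring(ok).find("section") is not None)  # True

# ...but its replacement text cannot contain a DOCTYPE declaration,
# so a sub-document that declares its own entities can't be included
# as an entity: the expansion is not well-formed.
bad = ('<!DOCTYPE doc [<!ENTITY s "<!DOCTYPE section []><section/>">]>'
       '<doc>&s;</doc>')
try:
    ET.fromstring(bad)
except ET.ParseError:
    print("not well-formed")
```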