[xep-support] Size issue

Nikolai Grigoriev grig at renderx.com
Tue Mar 26 11:25:09 PST 2002


Craig,

> I am trying to process a very large xml file.  Is it possible for XEP to
> handle this?  When processed, it would make about a 2000 page pdf file.
> Would using a more powerful processor help or is there a maximum limit
> that XEP will handle?

The size of the file XEP can handle is limited by the available RAM. As you
probably know, XEP builds an in-memory representation of the document, and
therefore needs large amounts of Java heap to produce large documents. For
texts of medium complexity, XEP consumes ca. 0.3-0.5 MB per page (the 400-page
XSL spec could be compiled in 160 MB of Java heap). In certain cases, however,
memory consumption is much higher than this: in particular, long tables and
fine-resolution images lead to a burst in memory usage. I believe you will
need around 1 GB of RAM to format a 2000-page document reliably.
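
As a back-of-the-envelope check, using the per-page figures above:

    2000 pages x 0.3-0.5 MB/page = 600-1000 MB of Java heap

Remember to raise the JVM heap ceiling to match, e.g. by passing -Xmx1024m
to the java command that launches XEP; the default maximum heap is far
smaller.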

There are several ways to speed up the processing and make it more
streamlined. In particular, if your document is loosely tied (that is, composed
of several chunks with no cross-references between them), it is possible to
perform most of the formatting on each chunk separately and concatenate the
results only at the generation phase. Do your 2000 pages form a single
document, or are they a print batch of several independent publications? In the
latter case, a simple optimization would suffice to make memory consumption
depend only on the size of the biggest chunk, not on the overall batch size;
a rough sketch of that pattern follows.
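
For illustration only, here is a minimal Java sketch of the
independent-publications case. It assumes each publication lives in its own
.fo file and that a wrapper script (called xep.sh below, a placeholder for
however you invoke XEP on your system) formats one file at a time; the names
are hypothetical, not part of XEP's API:

    public class BatchFormat {
        public static void main(String[] args) throws Exception {
            // Run XEP once per chunk, each in its own JVM, so the heap
            // requirement is bounded by the largest chunk rather than
            // by the whole batch.
            for (String chunk : args) {
                String out = chunk.replaceFirst("\\.fo$", ".pdf");
                Process p = new ProcessBuilder("xep.sh", chunk, out) // placeholder invocation
                        .inheritIO()
                        .start();
                if (p.waitFor() != 0)
                    throw new RuntimeException("XEP failed on " + chunk);
            }
        }
    }

Invoked as "java BatchFormat part1.fo part2.fo ...", each run needs only
enough heap for its own chunk, and the resulting PDFs can be merged or sent
to print in sequence afterwards.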

Best regards,
Nikolai Grigoriev
RenderX



