Peter B. wrote: This limit will increase in future versions, right?
It probably will.
I've experimented with archives up to 10 TB, but the performance is (currently) abysmal. The limiting factor is the de-duplication logic: the work needed to determine whether a data block is a duplicate grows steeply as the archive grows, because the index of known blocks outgrows what can be kept in RAM. You don't notice the difference up to about 1–1.5 TB, but after that things really slow down.
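To illustrate the idea (a simplified sketch only; the block size, hashing scheme, and index layout here are my assumptions, not how the product is actually implemented): block-level de-duplication typically keys each block by a content hash and consults an index before storing anything. Once that index no longer fits in RAM, lookups start hitting disk, and that's where the slowdown comes from.

```swift
import Foundation
import CryptoKit

// Illustrative sketch of block-level de-duplication.
// Block size, SHA-256 hashing, and the in-memory index are assumptions.
let blockSize = 64 * 1024  // 64 KB blocks (assumed)

// Index mapping content hash -> location of the stored block.
// When this grows past available RAM, every lookup can mean a disk seek.
var index: [Data: UInt64] = [:]
var nextOffset: UInt64 = 0

func store(block: Data) -> UInt64 {
    let digest = Data(SHA256.hash(data: block))
    if let existing = index[digest] {
        return existing            // duplicate: reference the existing copy
    }
    let offset = nextOffset        // new block: append it and remember where
    index[digest] = offset
    nextOffset += UInt64(block.count)
    return offset
}

// Two identical blocks resolve to the same stored location.
let a = Data(repeating: 1, count: blockSize)
print(store(block: a) == store(block: a))  // true
```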
Performance can be improved by caching more information in RAM, but a cache big enough to let a 10 TB archive capture smoothly would take more memory than most systems have.
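As a rough back-of-envelope estimate (my numbers, not measured figures): a 10 TB archive split into 64 KB blocks is roughly 168 million blocks. At around 40 bytes of index per block (a 32-byte hash plus an offset), that's already close to 7 GB of RAM for the index alone, before the OS and the application itself get any.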
As Macs ship with more RAM and ever-faster I/O, the usable size of an archive will increase. Until then, consider splitting your archives: store your virtual machine images or media projects in one archive, and your system and regular documents in another.