TOPIC: Memory Usage

Memory Usage 10 months 4 weeks ago #4980

typh17

Offline

Fresh Boarder

Posts: 9

Karma: 0

Actually, I think it's a consequence more than a bug, but the "issue" is there so I'm highlighting it.

YAC Reader 9.6.2 (latest release), Windows 7 SP1 x64.

The issue happens when I load a bunch of hi-res comics (several YAC Reader instances): since YAC Reader (it seems to me) caches the whole comic, I run out of memory fast.

You may say that it's uncommon to read more than one comic at a time, but this happens even when opening only one big .cbr; the worst case is when I read from a (big) folder, which used 1 GB of RAM like nothing.

So I think something is wrong; only a couple of pages should be fetched in advance, not everything...
I suppose this is linked to the "page flow" too.

Considering that the quality (resolution) of .cbr files keeps going up, I think this can be a problem.

I compared this with another reader (Honeyview): there is an option that limits how much memory is used for caching, which "solves" the issue... but it's not YAC Reader! And I'm not asking you to implement that, I just want to point out the high resource consumption in some cases.

Now, I don't know if there's a bug somewhere or it's intended to work like that, but eating all available RAM is not "nice" (in Unix terms).

Can you please fix this behaviour? Thanks.
The administrator has disabled public write access.

Memory Usage 10 months 4 weeks ago #4981

selmf

Offline

Developer

Posts: 553

Thank you received: 115

Karma: 11

I'm very well aware of the problem. YACReader always decompresses or renders the whole comic into RAM, which can be a real problem with recent HD comics in the gigabyte (compressed) range. Sadly, there is no easy fix for this as we would need to fundamentally change the way comics are processed, which would negatively impact performance.
CBx files are basically just renamed compressed archives of various formats, formats which often are not well suited for memory-friendly processing or display. We have some ideas on how to address this issue long-term, but this has to be supported by the decompression backend, and currently that support won't happen unless we implement it ourselves, which isn't trivial.

We might be able to implement a less-than-optimal solution a bit sooner, but this will always be a trade-off between performance and memory usage.

@luisangelsm, what is your take on this?
My answers are not necessarily official YACReader statements but mostly represent my own opinion in technical matters.

Memory Usage 10 months 3 weeks ago #4982

Luis Ángel

Offline

Administrator

Posts: 1563

Thank you received: 335

Karma: 30

The behaviour is by design; comic quality beyond 4K won't happen any time soon, and RAM availability will also go up, as even phones have tons of RAM nowadays.

How much RAM do you have? And how many comics do you open at once?

One scenario where this can be a problem is when running the headless server on embedded devices, so having a setting for limiting RAM usage probably makes sense, but so far I have not heard much about this being an issue.
Contribute to the project becoming a patron: www.patreon.com/yacreader
You can also donate via Pay-Pal: t.ly/ODFV

Memory Usage 10 months 3 weeks ago #4983

selmf

Offline

Developer

Posts: 553

Thank you received: 115

Karma: 11

I think this is rarely an issue for "regular" users, but I have certainly been hit by it more than once ;)

Some of the comic files I have tested are larger than 4 GB. Just a few, but the amount of RAM this can occupy when decompressed in RAM is not trivial. Add a browser and maybe a music player and you are out of RAM, and on Unix systems this can be worse than on Windows. I've had my system crash more than once.

The problem is … most comic book files (cb*) use solid compression. In simple words that means that you can't decompress the last file in a comic book archive without decompressing the files before it, and this goes all the way up to the first file in the archive. Even worse, the files in the archive probably aren't saved in the order you would read them, which means you'd have to decompress the files 'out of order'.
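To make the "solid" problem concrete, here is a small Python sketch (purely illustrative, not YACReader code) that uses zlib to contrast one solid stream over all pages with per-file compression, where any member can be decompressed on its own:

```python
import zlib

# Three "pages"; in a solid archive they are compressed as one stream.
pages = [b"PAGE0" * 1000, b"PAGE1" * 1000, b"PAGE2" * 1000]

# Solid: one compressor over the concatenation of all pages.
solid = zlib.compress(b"".join(pages))

# Non-solid: each page compressed independently.
members = [zlib.compress(p) for p in pages]

# Extracting the last page from the solid stream forces us to
# decompress pages 0 and 1 as well...
everything = zlib.decompress(solid)
offset = sum(len(p) for p in pages[:-1])
last_from_solid = everything[offset:]

# ...while the non-solid archive gives random access per member.
last_direct = zlib.decompress(members[-1])

assert last_from_solid == pages[-1] == last_direct
```

The solid stream usually compresses better (shared dictionary across pages), which is exactly the trade-off that makes it popular for cb* archives.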

YACReader decompresses into RAM, which works very well for cb* which have a size in the hundred megabyte range, but it can be a real issue in the gigabyte range.

Technically there are two ways to solve this. One way is to decompress the files onto the hard disk instead of into RAM; the other way is to keep only a few pages forward and back in memory and to decompress the rest on demand. I would very much prefer the second way, but in order to allow an acceptable access time for all pages we would need a mechanism to quickly extract pages at positions before and after the current reading position.
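A minimal sketch of the second approach, assuming a hypothetical load_page callback that decompresses one page on demand (all names here are invented for illustration, not YACReader's actual API):

```python
from collections import OrderedDict

class PageCache:
    """Keep only pages near the current reading position in memory."""

    def __init__(self, load_page, window=2):
        self.load_page = load_page    # caller-supplied on-demand decoder
        self.window = window          # pages kept ahead/behind the cursor
        self.cache = OrderedDict()    # page index -> decoded bytes

    def get(self, index):
        if index not in self.cache:
            self.cache[index] = self.load_page(index)
        # Evict anything outside the window around `index`.
        for i in [i for i in self.cache if abs(i - index) > self.window]:
            del self.cache[i]
        return self.cache[index]

# Usage with a fake loader standing in for real decompression.
def loader(i):
    return b"page-%d" % i

cache = PageCache(loader, window=1)
cache.get(0); cache.get(1); cache.get(2)
# Only pages 1 and 2 remain resident; page 0 was evicted.
assert sorted(cache.cache) == [1, 2]
```

The hard part isn't the cache itself but making load_page fast for solid archives, which is where the snapshot idea below comes in.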

This could be solved by using snapshots of the decompression state/dictionary, but no decompression library I know of supports this (yet). It is very likely we'll have to implement this ourselves.

Memory Usage 10 months 3 weeks ago #4984

typh17

Offline

Fresh Boarder

Posts: 9

Karma: 0

The high memory consumption happens when loading a folder too: in that case compression should not matter, and caching could be more flexible.

I initially loaded a big archive and noticed the memory usage was too high, then decompressed it and loaded the resulting folder. I was surprised the memory usage was just as high.
Well, I was using YAC Reader as an image viewer more than a comic viewer anyway...

I have 8 GB of RAM, so for small files it's OK; the issue happened with Firefox (150 tabs), Thunderbird, PyCharm and Excel active (my media player is negligible), and then I opened that file.
That is a common situation for me (browser, mail client, IDE, some docs) when I want to read something to relax a bit, pausing what I was doing.

I may also open 3 comics at once, like I did for "testing" this issue.

It's a shame the only two programs I know of that parse (big) archives incrementally have no sources available (ACDSee, HoneyView), as that would be helpful. HoneyView portable has a \data folder with some DLLs; I don't know if that can give any hints on how they achieve it.

Memory Usage 10 months 3 weeks ago #4985

selmf

Offline

Developer

Posts: 553

Thank you received: 115

Karma: 11

The problem in the case of folders is probably that YACReader always processes the whole comic and keeps the pages in RAM.

We probably could use a "lazy loading" mechanism for zip, pdf and directories, as these formats allow random access to the pages, but archives with solid compression (rar, 7z) don't allow this. Sequential processing would be possible, but these formats usually reorder the files in the archive during compression, so we'd have to account for that, which would be … tricky.
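As an illustration of why zip lends itself to lazy loading, this Python sketch (not YACReader code) uses the archive's central directory to read one page without decompressing the others, even though the entries are stored out of reading order:

```python
import io
import zipfile

# Build a small cbz-like archive in memory. The entries are stored
# out of reading order, as real archives often are.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("page3.jpg", b"<jpeg 3>")
    zf.writestr("page1.jpg", b"<jpeg 1>")
    zf.writestr("page2.jpg", b"<jpeg 2>")

# Zip's central directory allows random access: sort the names into
# reading order, then pull out a single page on its own. Each member
# is compressed independently, so nothing else gets decompressed.
with zipfile.ZipFile(buf) as zf:
    reading_order = sorted(zf.namelist())
    page2 = zf.read("page2.jpg")
assert reading_order == ["page1.jpg", "page2.jpg", "page3.jpg"]
assert page2 == b"<jpeg 2>"
```

Solid rar/7z archives offer no such per-member entry points, which is exactly why the same trick doesn't transfer.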

Memory Usage 10 months 3 weeks ago #4987

typh17

Offline

Fresh Boarder

Posts: 9

Karma: 0

I'm all for lazy access on folders and zip archives... not so much for pdf, since I often need to extract data from those and I feel pain just thinking about how messy the pdf format is...

Personally I don't see much space gained by compressing images with solid archives, since they're supposedly already compressed with specialized filters. I could easily make a script that decompresses and recompresses to a non-solid format.

Anyway thanks for taking your time and answering my post, I appreciate it.
Powered by Kunena Forum