Suggestion/Request: Jpeg2000 improvement

Postby HotShot on Tue Nov 15, 2011 6:58 am

Hi,

The JPEG 2000 format, although not very popular with the average user, is frequently used in specific science-related fields. In particular, the format is well suited to lightning-fast random access into very large datasets (in layman's terms, "images of several gigabytes"), and the same holds true for medical applications. It is therefore not uncommon to have JP2 files of several hundred megabytes which, once uncompressed, amount to gigabytes of data. With a state-of-the-art decoder, the user can browse to the full-scale area of interest within seconds, or just as quickly display a thumbnail of the whole GB-sized image at an arbitrary scale. This is the major strength of this wavelet-based format.
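
For the curious, here is roughly what that random access looks like with the open-source OpenJPEG 2.x library (my own quick sketch, purely illustrative - I obviously don't know what FPV's codec uses internally; the file name and window coordinates are made up):

/* Sketch: decode only a 1024x1024 window from a huge .jp2 (OpenJPEG 2.x).
   Minimal error handling, for illustration only. */
#include <stdio.h>
#include <openjpeg.h>   /* header location varies with the install, e.g. openjpeg-2.5/openjpeg.h */

int main(void)
{
    const char *path = "huge_dataset.jp2";          /* hypothetical file */
    opj_dparameters_t params;
    opj_set_default_decoder_parameters(&params);

    opj_codec_t  *codec  = opj_create_decompress(OPJ_CODEC_JP2);
    opj_stream_t *stream = opj_stream_create_default_file_stream(path, OPJ_TRUE);
    opj_image_t  *image  = NULL;

    if (!codec || !stream ||
        !opj_setup_decoder(codec, &params) ||
        !opj_read_header(stream, codec, &image)) {
        fprintf(stderr, "cannot open %s\n", path);
        return 1;
    }

    /* Restrict decoding to a window on the reference grid; tiles and
       code-blocks outside it are never decompressed. */
    if (!opj_set_decode_area(codec, image, 4096, 4096, 4096 + 1024, 4096 + 1024) ||
        !opj_decode(codec, stream, image) ||
        !opj_end_decompress(codec, stream)) {
        fprintf(stderr, "decode failed\n");
        return 1;
    }

    printf("decoded window: %u x %u\n", image->comps[0].w, image->comps[0].h);

    opj_image_destroy(image);
    opj_destroy_codec(codec);
    opj_stream_destroy(stream);
    return 0;
}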

Unfortunately, some widespread JPEG 2000 decoders take the "easy" (and, IMHO, plain wrong) approach of blindly decompressing the whole raster into memory at its maximum resolution, just as they would for other common raster formats. From an efficiency standpoint this is suboptimal in many cases, because it is extremely computation-intensive; at worst it is a total catastrophe... except for the few of us who have hundreds of GB of RAM.

To the point: the current JPEG 2000 decoder bundled with FPV [Codec Pack] appears to take this rather simplistic approach, which is truly disappointing compared to the other blazing-fast codecs. As a result, browsing any folder of "reasonably sized" JP2 files (50 Mpx to 1 Gpx in dimensions, hundreds of megabytes on disk) results in a long-lasting lockup of the computer. A hungry dllhost process and dramatic RAM consumption make the system unusable for several minutes, while the codec routines decode gigabytes of information only to compute a small thumbnail... Note that a clever codec does the same within milliseconds, by extracting only the lowest-resolution (or most appropriate) level from the file.
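
To illustrate what I mean by "extracting only the lowest-resolution level": with OpenJPEG, for instance, a single decoder parameter discards the top wavelet levels, so a gigapixel image collapses to a thumbnail-sized raster. Again just a sketch of the general idea, not FPV code, and the reduction factor assumes the file was encoded with enough resolution levels:

/* Sketch: thumbnail-sized decode by discarding the highest wavelet levels
   (OpenJPEG 2.x). cp_reduce = 6 divides each dimension by 2^6 = 64,
   assuming the file was encoded with at least 7 resolution levels. */
#include <stdio.h>
#include <openjpeg.h>

int main(void)
{
    const char *path = "huge_dataset.jp2";          /* hypothetical file */
    opj_dparameters_t params;
    opj_set_default_decoder_parameters(&params);
    params.cp_reduce = 6;   /* number of highest resolution levels to skip */

    opj_codec_t  *codec  = opj_create_decompress(OPJ_CODEC_JP2);
    opj_stream_t *stream = opj_stream_create_default_file_stream(path, OPJ_TRUE);
    opj_image_t  *image  = NULL;

    if (!codec || !stream ||
        !opj_setup_decoder(codec, &params) ||
        !opj_read_header(stream, codec, &image) ||
        !opj_decode(codec, stream, image) ||
        !opj_end_decompress(codec, stream)) {
        fprintf(stderr, "reduced decode failed\n");
        return 1;
    }

    /* A 32768 x 32768 source comes out as roughly 512 x 512 samples,
       at a fraction of the time and memory of a full decode. */
    printf("thumbnail raster: %u x %u\n", image->comps[0].w, image->comps[0].h);

    opj_image_destroy(image);
    opj_destroy_codec(codec);
    opj_stream_destroy(stream);
    return 0;
}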

Request: would it be possible to implement either a "clever" decoding algorithm for this JPEG 2000 [baseline] format, allowing fast browsing, thumbnail generation and display of files of any dimensions? Or, failing that, some decent safeguard against the processing of very large JPEG 2000 datasets? (A wild guess: limiting decoding to files under 1 GB uncompressed at maximum resolution should be user- and computer-friendly enough.)
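
To make the safeguard idea concrete, a rough sketch of the kind of check I have in mind (again using OpenJPEG purely as an illustration, with the 1 GB budget being my wild guess from above): reading the header alone is cheap and already gives the dimensions and bit depth needed to estimate the uncompressed size before committing to a full decode.

/* Sketch: estimate the uncompressed size from the header alone and skip
   files that would blow a memory budget. opj_read_header decodes no
   pixel data, so this check is cheap even on huge files. */
#include <stdio.h>
#include <stdint.h>
#include <openjpeg.h>

#define MAX_UNCOMPRESSED_BYTES (1ULL << 30)   /* 1 GB budget - a wild guess */

static int should_decode(const char *path)
{
    opj_dparameters_t params;
    opj_set_default_decoder_parameters(&params);

    opj_codec_t  *codec  = opj_create_decompress(OPJ_CODEC_JP2);
    opj_stream_t *stream = opj_stream_create_default_file_stream(path, OPJ_TRUE);
    opj_image_t  *image  = NULL;
    int ok = 0;

    if (codec && stream &&
        opj_setup_decoder(codec, &params) &&
        opj_read_header(stream, codec, &image)) {
        uint64_t bytes = 0;
        OPJ_UINT32 i;
        for (i = 0; i < image->numcomps; ++i) {
            /* Note: the decoder's own working buffers may be larger
               (OpenJPEG stores samples as 32-bit integers internally). */
            bytes += (uint64_t)image->comps[i].w * image->comps[i].h *
                     ((image->comps[i].prec + 7) / 8);
        }
        ok = (bytes <= MAX_UNCOMPRESSED_BYTES);
        printf("%s: ~%llu MB uncompressed -> %s\n", path,
               (unsigned long long)(bytes >> 20), ok ? "decode" : "skip");
    }

    if (image)  opj_image_destroy(image);
    if (codec)  opj_destroy_codec(codec);
    if (stream) opj_stream_destroy(stream);
    return ok;
}

int main(void)
{
    should_decode("huge_dataset.jp2");   /* hypothetical file */
    return 0;
}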

--

For information, such clever implementations already exist. The most feature-rich code I know of comes from ERMapper/ERDAS and is successfully used by their various geospatial products. Their SDK is available under rather permissive licenses (IIRC); however, I'm not sure how well it would fit into a WIC decoder... nor what the financial implications would be for FPV. On the other hand, other "freely available" examples of clever code may exist elsewhere.
(As a side note, I put special emphasis on this ERDAS SDK because it would conveniently allow decoding the very similar ECW format - another wavelet-based coding scheme commonly used for aerial and satellite imagery. FPV products would then likely win a few new happy customers within the GIS community... ;))

Sorry for the lengthy message - I just wanted to make things as clear as possible. Any answer, be it a simple 'No way', is welcome :D
HotShot

Re: Suggestion/Request: Jpeg2000 improvement

Postby Axel on Tue Nov 15, 2011 1:43 pm

Indeed, the JPEG2000 codec unpacks the whole image in memory, and this has implications if you have several images of "gigapixel size" in the same folder: partly because of the full in-memory decoding, but also because Explorer creates thumbnails in parallel and therefore tries to load several images concurrently.

If you have a folder full of gigapixel images, this has a big impact on memory usage, and the computer will surely crawl for a while if the amount of memory requested is larger than the physical RAM (but everything should eventually get back to normal, and since thumbnails are cached, the slowdown should only occur once).
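
To give a rough idea of the orders of magnitude: a 1-gigapixel RGB image at 8 bits per channel already represents about 1,000,000,000 x 3 bytes, i.e. roughly 3 GB of pixel data once fully decoded (often more, depending on the decoder's internal sample format), so if Explorer thumbnails, say, four such files concurrently the requests can easily reach 12 GB - well beyond the physical RAM of most machines, hence the temporary crawl.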
Axel
Site Admin

