Just wondering if anybody has done, or is aware of, encoding/compressing large
images into JPEG2000 format using Hadoop?

We have 1 TB+ raw images that need to be compressed into JPEG2000 and other
formats. On a single beefy machine the compression rate is about 2 GB/hour,
so it takes more than 500 hours to compress one image.
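As a back-of-envelope check on those numbers (the node count and the assumption of linear scaling are mine, not from the post), the single-machine figure works out to roughly 512 hours for 1 TB, and splitting the work across a cluster divides that time accordingly:

```java
// Back-of-envelope scaling estimate using the post's figures:
// ~1 TB of raw imagery at 2 GB/hour per machine. The 32-node
// cluster size is an illustrative assumption, and the estimate
// ignores job-scheduling and I/O overhead.
public class CompressionEstimate {
    public static double hoursToCompress(double totalGb, double gbPerHourPerNode, int nodes) {
        return totalGb / (gbPerHourPerNode * nodes);
    }

    public static void main(String[] args) {
        double totalGb = 1024.0;   // ~1 TB
        double ratePerNode = 2.0;  // 2 GB/hour observed on one machine
        System.out.println(hoursToCompress(totalGb, ratePerNode, 1));   // ~512 h, matching the >500 h figure
        System.out.println(hoursToCompress(totalGb, ratePerNode, 32));  // ~16 h on a hypothetical 32-node cluster
    }
}
```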

There is also the Matsu project (http://code.google.com/p/matsu-project/),
which uses MapReduce to process images (
http://www.cloudera.com/videos/hw10_video_hadoop_image_processing_for_disaster_relief).
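One way a MapReduce job could parallelize this is to exploit the fact that a JPEG2000 codestream can be built from independently encoded tiles: split the huge image into fixed-size tiles, emit one tile region per map task, and have each mapper encode its tile with a JP2 library (e.g. JJ2000 or a JAI-ImageIO writer; the tile size, encoder choice, and `TilePlanner` class below are all my assumptions, not anything from the post). A minimal sketch of the tiling step:

```java
import java.awt.Rectangle;
import java.util.ArrayList;
import java.util.List;

// Sketch of the splitting step a map-only Hadoop job could use: compute
// fixed-size tile regions for one huge image so that each tile becomes
// one map task. The mapper body (reading the tile's pixels and encoding
// it to JPEG2000) is omitted; this only shows the work division.
public class TilePlanner {
    public static List<Rectangle> planTiles(int width, int height, int tileSize) {
        List<Rectangle> tiles = new ArrayList<>();
        for (int y = 0; y < height; y += tileSize) {
            for (int x = 0; x < width; x += tileSize) {
                // Clamp edge tiles so they stay inside the image bounds.
                int w = Math.min(tileSize, width - x);
                int h = Math.min(tileSize, height - y);
                tiles.add(new Rectangle(x, y, w, h));
            }
        }
        return tiles;
    }

    public static void main(String[] args) {
        // A hypothetical 100k x 100k-pixel scene split into 8192-px tiles
        // yields 13 x 13 = 169 independent map tasks.
        List<Rectangle> tiles = planTiles(100_000, 100_000, 8_192);
        System.out.println(tiles.size());
    }
}
```

Each mapper would then need only its tile's pixel window rather than the whole 1 TB file, which is what makes the map phase embarrassingly parallel.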

Discussion Overview
group: common-user
categories: hadoop
posted: Nov 29, '10 at 5:34a
active: Nov 29, '10 at 5:34a
posts: 1
users: 1
website: hadoop.apache.org...
irc: #hadoop

1 user in discussion: PORTO aLET (1 post)
