cvWriteRawData fails for large Mat. (Bug #1439)
Description
I have a Mat with a large number of cols and rows, like 1000 cols and several million rows.
When I write this matrix to disk,
e.g.
FileStorage fs2(tmp.c_str(), FileStorage::WRITE);
fs2 << "training_descriptors" << training_descriptors;
fs2.release();
it keeps writing data to disk for a while, then fails with a segmentation fault.
gdb shows:
Program received signal SIGSEGV, Segmentation fault.
0x00007ffff761d400 in cvWriteRawData () from /usr/local/lib/libopencv_core.so.2.3
(gdb) where
#0 0x00007ffff761d400 in cvWriteRawData () from /usr/local/lib/libopencv_core.so.2.3
#1 0x00007ffff761df64 in icvWriteMat(CvFileStorage*, char const*, void const*, CvAttrList)
() from /usr/local/lib/libopencv_core.so.2.3
#2 0x00007ffff761a9b3 in cvWrite () from /usr/local/lib/libopencv_core.so.2.3
#3 0x00007ffff761b3d6 in cv::write(cv::FileStorage&, std::string const&, cv::Mat const&)
() from /usr/local/lib/libopencv_core.so.2.3
#4 0x000000000041034e in operator<< <cv::Mat> (fs=..., value=...)
at /usr/local/include/opencv2/core/operations.hpp:2628
#5 0x0000000000409d93 in make_vocabulary (cfg=0x7fffffffe3c0) at bow.cpp:411
#6 0x000000000040ffd5 in main (argc=29, argv=0x7fffffffe528) at bow.cpp:835
The file grows above 9 GB before failing.
Is this a known limitation, or simply a bug?
My machine has 12 GB of RAM.
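For reference, a minimal self-contained sketch of the failing pattern; the file name and exact matrix dimensions are illustrative, not taken from the original bow.cpp:

#include <opencv2/core/core.hpp>
#include <string>

int main()
{
    // Roughly the shape described above: ~1000 cols, several million rows.
    // With CV_32F elements this is on the order of 10 GB of raw data.
    cv::Mat training_descriptors(2500000, 1000, CV_32F, cv::Scalar(0));

    std::string tmp = "descriptors.yml";  // hypothetical output path
    cv::FileStorage fs2(tmp, cv::FileStorage::WRITE);
    fs2 << "training_descriptors" << training_descriptors;  // segfaulted in cvWriteRawData before the fix
    fs2.release();
    return 0;
}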
Associated revisions
fixed writing huge matrices (ticket #1439)
History
Updated by Alexander Shishkov about 13 years ago
- Description changed
Updated by Alexander Shishkov almost 13 years ago
- Target version deleted ()
Updated by Alexander Shishkov almost 13 years ago
- Assignee deleted (Vadim Pisarevsky)
Updated by Vadim Pisarevsky almost 13 years ago
fixed in trunk, r7682.
Note, however, that you will most likely be unable to read this matrix back, because the whole file is parsed first and each matrix element is stored in a CvFileNode structure (which takes 24 bytes on a 64-bit OS). Therefore, to read back 10^9 floats you will need about 24 GB of RAM + swap.
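As a back-of-the-envelope check of that figure (assuming the 24-byte-per-element CvFileNode overhead quoted above):

#include <cstdio>

int main()
{
    long long elements  = 1000000000LL;  // 10^9 floats, as in the example above
    long long nodeBytes = 24;            // assumed per-element CvFileNode size, 64-bit OS
    double gb = elements * nodeBytes / 1e9;
    std::printf("approx. RAM+swap needed to read back: %.0f GB\n", gb);  // prints 24 GB
    return 0;
}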
I would recommend reducing the amount of training data (by using k-means, PCA or another method) and/or splitting it across multiple files; a sketch of the latter follows.
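A sketch of the splitting approach, assuming the descriptor matrix can be partitioned by rows; the chunk size and file-naming scheme are illustrative:

#include <opencv2/core/core.hpp>
#include <algorithm>
#include <cstdio>

// Write a large descriptor matrix as a series of smaller YAML files,
// so each file stays small enough to be read back later.
void writeInChunks(const cv::Mat& descriptors, int rowsPerChunk)
{
    for (int start = 0; start < descriptors.rows; start += rowsPerChunk)
    {
        int end = std::min(start + rowsPerChunk, descriptors.rows);
        cv::Mat chunk = descriptors.rowRange(start, end);  // header only, no data copy

        char name[64];
        std::sprintf(name, "descriptors_%04d.yml", start / rowsPerChunk);  // hypothetical naming

        cv::FileStorage fs(name, cv::FileStorage::WRITE);
        fs << "descriptors" << chunk;
        fs.release();
    }
}

Reading the chunks back one at a time keeps the CvFileNode overhead bounded by the chunk size rather than by the full matrix.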
- Status changed from Open to Done
- Assignee set to Vadim Pisarevsky
Updated by Alexander Shishkov almost 13 years ago
- Target version set to 2.4.0