Topic: How to read big files quickly?
I wrote a wonderful program that reads files of ~500 MB (~10 million lines per file). Profiling showed that about 2/3 of the running time is spent on I/O. So I tried to optimize it by testing three variants of the program (runtimes for two threads are given in parentheses; simplified sketches of each variant follow the list):
1. The classic C++ approach: std::ifstream + getline() (161 505, 160 876)
2. Loading the ENTIRE file into memory using Qt: QFile + QByteArray + QBuffer + Buffer.readLine() (273 488, 296 010)
3. Memory-mapping the file using Qt: QFile + File.map() (157 117, 159 959)
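Roughly, variant 1 looks like this (a simplified sketch; the file name is a placeholder and the real per-line processing is omitted):

    #include <fstream>
    #include <string>
    #include <iostream>

    // Variant 1 (simplified): read the file line by line with std::getline().
    // "big.txt" is a placeholder; real processing of each line is omitted.
    int main()
    {
        std::ifstream in("big.txt");
        std::string line;
        long long lines = 0;
        while (std::getline(in, line))
            ++lines;                  // process the line here
        std::cout << lines << " lines\n";
    }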
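Variant 2, roughly (again a sketch: readAll() pulls the whole file into one QByteArray, and a QBuffer then serves it line by line; error handling and the real line processing are trimmed):

    #include <QFile>
    #include <QByteArray>
    #include <QBuffer>

    // Variant 2 (simplified): the entire file is read into memory first,
    // then parsed line by line through a QBuffer.
    long long countLines(const QString &path)    // illustrative helper
    {
        QFile file(path);
        if (!file.open(QIODevice::ReadOnly))
            return -1;
        QByteArray data = file.readAll();        // whole file in one QByteArray
        QBuffer buffer(&data);
        buffer.open(QIODevice::ReadOnly);
        long long lines = 0;
        while (!buffer.atEnd()) {
            QByteArray line = buffer.readLine(); // process the line here
            ++lines;
        }
        return lines;
    }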
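And variant 3, roughly (a sketch of one way to consume the mapping: the mapped bytes are scanned for '\n' in place, without copying them anywhere):

    #include <QFile>
    #include <cstring>

    // Variant 3 (simplified): map the whole file into the address space
    // with QFile::map() and walk the mapping without copying.
    long long countLinesMapped(const QString &path)   // illustrative helper
    {
        QFile file(path);
        if (!file.open(QIODevice::ReadOnly))
            return -1;
        uchar *data = file.map(0, file.size());
        if (!data)
            return -1;
        long long lines = 0;
        const uchar *p = data;
        const uchar *end = data + file.size();
        while (p < end) {
            const void *nl = std::memchr(p, '\n', end - p);
            if (!nl)
                break;
            ++lines;                 // line is [p, nl); process it here
            p = static_cast<const uchar *>(nl) + 1;
        }
        file.unmap(data);
        return lines;
    }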
A single-threaded test yields the same results (but I'm targeting multi-threaded operation from the outset).
I have a conventional hard disk (not an SSD) with a 64 MB buffer and read-ahead support. Would an SSD give a performance gain, considering that the partition has plenty of free space, so the files are not fragmented and reads therefore go through the disk's buffer?
What other clever techniques are there for reading big files quickly?