Optimum file buffer read size?

A 1K buffer does seem a bit small. In general, there is no "one size fits all" buffer size; you need to pick one that fits how your algorithm consumes the data. A really huge buffer is generally not a good idea, but one that is too small, or out of line with how you process each chunk, isn't great either.

If you are simply reading data one chunk after another, entirely into memory, before processing it, use a larger buffer: it amortizes the per-call overhead across more bytes. I would probably use 8K or 16K, but probably not larger.
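As a minimal sketch of the chunk-after-chunk case with a 16K buffer (a `MemoryStream` of 100,000 zero bytes stands in for `File.OpenRead` on a real file, so the example runs as-is):

```csharp
using System;
using System.IO;

class ChunkedRead
{
    static void Main()
    {
        // A MemoryStream stands in here for File.OpenRead("some file").
        using Stream stream = new MemoryStream(new byte[100_000]);

        // 16 KB buffer: large enough to amortize per-call overhead,
        // small enough to stay cache-friendly.
        var buffer = new byte[16 * 1024];
        long total = 0;
        int read;
        while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        {
            total += read; // process buffer[0..read) here
        }
        Console.WriteLine(total); // prints 100000
    }
}
```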

On the other hand, if you are processing the data in a streaming fashion, reading a chunk and processing it before reading the next, smaller buffers might be more useful. Even better, if the streamed data has structure, match the amount of data read to the type of data you are reading. For example, if you are reading binary data that contains a 4-character code, a float, and a string, read the 4-character code into a 4-byte array, and the float into another. Then read the string's length prefix, and allocate a buffer sized to read the whole string in one call.
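Such structured reads might look like the following sketch. The record layout (4-character code, 4-byte float, 4-byte length prefix, UTF-8 string bytes), little-endianness, and the in-memory sample data are all assumptions for illustration; `Stream.ReadExactly` requires .NET 7+ (on older runtimes, loop over `Read` instead):

```csharp
using System;
using System.IO;
using System.Text;

class RecordRead
{
    static void Main()
    {
        // Build a sample record in memory: a 4-character code, a float,
        // then a 4-byte length prefix followed by the string bytes.
        var ms = new MemoryStream();
        ms.Write(Encoding.ASCII.GetBytes("RIFF"));
        ms.Write(BitConverter.GetBytes(1.5f));
        byte[] str = Encoding.UTF8.GetBytes("hello");
        ms.Write(BitConverter.GetBytes(str.Length));
        ms.Write(str);
        ms.Position = 0;

        // Read each field with a buffer sized exactly to that field.
        var code = new byte[4];
        ms.ReadExactly(code);                 // 4-character code

        var floatBytes = new byte[4];
        ms.ReadExactly(floatBytes);           // 4-byte float
        float value = BitConverter.ToSingle(floatBytes);

        var lenBytes = new byte[4];
        ms.ReadExactly(lenBytes);             // string length prefix
        var strBytes = new byte[BitConverter.ToInt32(lenBytes)];
        ms.ReadExactly(strBytes);             // whole string in one read

        Console.WriteLine(Encoding.ASCII.GetString(code)); // prints RIFF
        Console.WriteLine(Encoding.UTF8.GetString(strBytes)); // prints hello
    }
}
```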

If you are doing streaming data processing, look into the BinaryReader and BinaryWriter classes. They let you work with binary data very easily, without having to worry much about the raw bytes. They also let you decouple your buffer size from the actual data you are working with: you could set a 16K buffer on the underlying stream and still read individual data values through the BinaryReader with ease.
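Here is a sketch of that decoupling: a BinaryWriter writes the same kind of record, and a BinaryReader reads it back value by value. A `MemoryStream` stands in for a file; on a real file you would wrap the stream, e.g. `new BufferedStream(File.OpenRead(path), 16 * 1024)`, to get the 16K buffer mentioned above. Note that `BinaryWriter.Write(string)` emits its own 7-bit-encoded length prefix, which `BinaryReader.ReadString` consumes:

```csharp
using System;
using System.IO;
using System.Text;

class BinaryReaderDemo
{
    static void Main()
    {
        // Write a sample record with BinaryWriter.
        var ms = new MemoryStream();
        using (var writer = new BinaryWriter(ms, Encoding.UTF8, leaveOpen: true))
        {
            writer.Write(Encoding.ASCII.GetBytes("RIFF")); // 4-character code
            writer.Write(2.5f);                            // 4-byte float
            writer.Write("hello");                         // length-prefixed string
        }
        ms.Position = 0;

        // Read it back field by field; the reader never needs to know
        // how the underlying stream is buffered.
        using var reader = new BinaryReader(ms, Encoding.UTF8);
        string code = Encoding.ASCII.GetString(reader.ReadBytes(4));
        float value = reader.ReadSingle();
        string text = reader.ReadString();

        Console.WriteLine(code); // prints RIFF
        Console.WriteLine(text); // prints hello
    }
}
```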
