Slowness found when selecting and encoding Base64 images from the database

As a rule of thumb, don’t save files in the database.

What does the MySQL manual have to say about it?
http://dev.mysql.com/doc/refman/5.7/en/miscellaneous-optimization-tips.html

With Web servers, store images and other binary assets as files, with
the path name stored in the database rather than the file itself. Most
Web servers are better at caching files than database contents, so
using files is generally faster. (Although you must handle backups and
storage issues yourself in this case.)
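A minimal sketch of that pattern in PHP, assuming a hypothetical uploads directory and an images table with a path column:

```php
<?php
// Minimal sketch of the recommended pattern: the file goes to disk,
// only its path goes into MySQL. The "uploads" directory and the
// "images" table are assumptions for illustration.
$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId = 42; // e.g. taken from the session

$ext  = pathinfo($_FILES['image']['name'], PATHINFO_EXTENSION);
$name = bin2hex(random_bytes(16)) . '.' . $ext; // never trust user-supplied names
$path = __DIR__ . '/uploads/' . $name;

if (move_uploaded_file($_FILES['image']['tmp_name'], $path)) {
    $stmt = $pdo->prepare('INSERT INTO images (user_id, path) VALUES (?, ?)');
    $stmt->execute([$userId, $path]);
}
```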

Don’t save Base64 encoded files in a database at all

It works fine, but it takes much more time than I expected. Also, the images are 33% bigger in size and look bloated overall.

As you discovered, there is unwanted overhead in encoding/decoding, plus extra space used up, which means extra data transferred back and forth as well.

As @mike-m has mentioned, Base64 encoding is not a compression method. Why Base64 encoding is used is also answered by a link that @mike-m posted: What is base 64 encoding used for?

In short, there is nothing to gain and much to lose by Base64 encoding images before storing them on the file system, be it S3 or otherwise.
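If you want to see the penalty for yourself, a quick check like the following (the file name is illustrative) shows the roughly 33% inflation:

```php
<?php
// Measure the Base64 size penalty on an arbitrary image file.
$raw     = file_get_contents('photo.jpg'); // path is illustrative
$encoded = base64_encode($raw);

printf("raw: %d bytes, base64: %d bytes (+%.1f%%)\n",
    strlen($raw), strlen($encoded),
    100 * (strlen($encoded) / strlen($raw) - 1));
// Base64 maps every 3 bytes to 4 ASCII characters, hence ~33% overhead.
```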

What about gzip or other forms of compression, without involving Base64? Again, the answer is that there is nothing to gain and much to lose. For example, I just gzipped a 1,941,980-byte JPEG image and saved 4,000 bytes; that’s a 0.2% saving.

The reason is that images are already in compressed formats; they cannot be compressed much further.
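You can reproduce the gzip experiment with a few lines of PHP; the file name is illustrative, and the saving should be negligible for any JPEG:

```php
<?php
// Gzip a JPEG and compare sizes. Expect almost no saving,
// since JPEG data is already compressed.
$raw  = file_get_contents('photo.jpg'); // path is illustrative
$gzip = gzencode($raw, 9);              // level 9 = maximum compression

printf("raw: %d bytes, gzipped: %d bytes (%.2f%% saved)\n",
    strlen($raw), strlen($gzip),
    100 * (1 - strlen($gzip) / strlen($raw)));
```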

When you store images without compression, they can be delivered directly to browsers and other clients, and they can be cached. If they are compressed (or Base64 encoded), they need to be decompressed or decoded by your app first.

Modern browsers are able to display Base64 images embedded in the HTML, but then they cannot be cached, and the data is about 33% larger than it needs to be.
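For comparison, here are the two ways of putting an image into a page; the file paths are illustrative:

```php
<?php
// Inline data URI: not separately cacheable, ~33% larger on the wire.
$raw = file_get_contents('photo.jpg'); // path is illustrative
echo '<img src="data:image/jpeg;base64,' . base64_encode($raw) . '">';

// Plain file reference: cacheable by the browser, no encoding overhead.
echo '<img src="/uploads/photo.jpg">';
```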

Is this an exception to the norm?

Users can post their data and images, and all of it is secure.

I presume that you mean a user can download images that belong to him or are shared with him. This can easily be achieved by saving the files outside the web space in the file system and saving only the path in the database. Then the file is sent to the client (after doing the required checks) with fpassthru.
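A minimal sketch of that, assuming a hypothetical files table and a session-based login; adapt the checks to your own setup:

```php
<?php
// Serve a private file stored outside the web root, but only to its owner.
// The "files" table and the session check are assumptions for illustration.
session_start();
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

$stmt = $pdo->prepare('SELECT path FROM files WHERE id = ? AND user_id = ?');
$stmt->execute([$_GET['id'], $_SESSION['user_id']]);
$path = $stmt->fetchColumn();

if ($path === false || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: ' . mime_content_type($path));
header('Content-Length: ' . filesize($path));

$fp = fopen($path, 'rb');
fpassthru($fp); // stream the file to the client without loading it all into memory
fclose($fp);
```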

What about when I grow to 100,000 users?

How are the image files taken care of? Regarding performance, when a large number of users are involved, it seems to me that I need 100,000 folders for 100,000 users, plus their subfolders. When a large number of users browse the same root folder, how does the file system process each unique folder?

Use a CDN, or use a file system that’s specially suited for this, like Btrfs.
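You also don’t need to pile 100,000 entries into one directory. A common approach, sketched below with an assumed two-level layout, is to shard paths on a hash prefix so every directory stays small:

```php
<?php
// Spread files over 256 x 256 hash-prefix buckets so no single
// directory ever holds a huge number of entries.
function shardedPath(string $baseDir, string $filename): string
{
    $hash = md5($filename);
    $dir  = $baseDir . '/' . substr($hash, 0, 2) . '/' . substr($hash, 2, 2);
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true); // create both levels at once
    }
    return $dir . '/' . $filename;
}

// The returned path looks like .../uploads/ab/cd/a1b2c3d4.jpg;
// store exactly this path in the database.
echo shardedPath(__DIR__ . '/uploads', 'a1b2c3d4.jpg');
```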

Databases have good search facilities, good thread-safe connections, and good session management. Does this scenario change when large operations are involved?

Yes, indeed. Use it to the fullest by saving all the information about the file and its path in the database. Then save the file itself in the file system. You get the best of both worlds.
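A sketch of that split, with hypothetical table and column names: the indexed metadata search runs in the database while the bytes stay on disk:

```php
<?php
// Database: searchable metadata. File system: the actual bytes.
// Table and column names are assumptions for illustration.
$pdo    = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$userId = 42; // e.g. taken from the session

// Fast, indexed searching happens in the database...
$stmt = $pdo->prepare(
    'SELECT id, path, mime_type, size FROM files
     WHERE user_id = ? AND uploaded_at > ?
     ORDER BY uploaded_at DESC'
);
$stmt->execute([$userId, '2016-01-01']);

// ...and the bytes are streamed straight from disk when needed.
foreach ($stmt as $row) {
    echo $row['path'], ' (', $row['size'], " bytes)\n";
}
```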
