Effects Of Cached Data Over Time
The following demonstrates the importance of cached data. For this I'm using Postgres on Linux Mint, running a basic workload I put together against a simple order database. The application randomly reads one of the 16 million customers, picks up to 10 random products from the 80 million products, and then creates an order entry and the matching order_detail entries.
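A minimal sketch of one such transaction is below. The table and column names (customer, orders, order_detail, and their id columns) are my assumptions for illustration, not necessarily the original schema, and the real application would execute these as PreparedStatements over JDBC rather than printing them.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Toy sketch of one workload transaction; schema names are assumptions.
public class OrderWorkload {
    static final int CUSTOMERS = 16_000_000;
    static final int PRODUCTS  = 80_000_000;

    // Pick a random customer id in [1, CUSTOMERS].
    static int randomCustomer(Random rnd) {
        return 1 + rnd.nextInt(CUSTOMERS);
    }

    // Pick 1..10 random product ids in [1, PRODUCTS].
    static List<Integer> randomProducts(Random rnd) {
        int n = 1 + rnd.nextInt(10);
        List<Integer> ids = new ArrayList<>();
        for (int i = 0; i < n; i++) ids.add(1 + rnd.nextInt(PRODUCTS));
        return ids;
    }

    // Build the statements one transaction would run; the real app
    // would bind these as PreparedStatement parameters over JDBC.
    static List<String> buildTransaction(Random rnd) {
        List<String> stmts = new ArrayList<>();
        int customer = randomCustomer(rnd);
        stmts.add("SELECT * FROM customer WHERE customer_id = " + customer);
        stmts.add("INSERT INTO orders (customer_id) VALUES (" + customer + ")");
        for (int product : randomProducts(rnd)) {
            stmts.add("SELECT * FROM product WHERE product_id = " + product);
            stmts.add("INSERT INTO order_detail (product_id) VALUES (" + product + ")");
        }
        return stmts;
    }

    public static void main(String[] args) {
        for (String s : buildTransaction(new Random(42))) System.out.println(s);
    }
}
```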
The system was mostly idle while the cache grew. You can see the point where I started using the desktop to surf the net: the results become much noisier. Look at the beginning of the graph, though. I'm running 10 concurrent threads in Java against the database, starting from an empty filesystem cache and an empty Postgres buffer cache.
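The 10-thread driver can be sketched roughly as follows. Here runTransaction() is a hypothetical stand-in for the real JDBC work (read a customer, read products, insert the order rows); only the thread-pool shape is the point.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicLong;

// Minimal sketch of a fixed-size concurrent load driver; names are
// my own illustration, not the original code.
public class LoadDriver {
    static final AtomicLong completed = new AtomicLong();

    static void runTransaction() {
        // Real version: read customer, read products, insert order rows.
        completed.incrementAndGet();
    }

    // Run the given number of worker threads, each executing
    // txPerThread transactions, and return the completed count.
    static long run(int threads, int txPerThread) throws InterruptedException {
        completed.set(0);
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int t = 0; t < threads; t++) {
            pool.submit(() -> {
                for (int i = 0; i < txPerThread; i++) runTransaction();
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return completed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(10, 100)); // 10 threads x 100 transactions
    }
}
```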
Out of the gate we get a whopping 332 transactions per minute. Throughput grows steadily over the next hour to 3,000 transactions per minute and then climbs rapidly to 7,000 transactions per minute. The reason for this is purely that the cache is filling and we are getting more and more read hits.
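Why the curve has this shape can be seen with a back-of-envelope model: if transaction time is dominated by page reads, a cached read costs a fraction of a millisecond while a miss costs a random disk read. All latencies and the reads-per-transaction count below are assumed round numbers for illustration, not measurements from this test, so the outputs won't match the 332 or 7,000 figures exactly.

```java
// Toy throughput model: my own illustration of cache warm-up,
// not the author's measurements. All constants are assumed values.
public class CacheWarmupModel {
    static final double HIT_MS  = 0.1;   // assumed cached-page read cost
    static final double MISS_MS = 8.0;   // assumed random disk read cost
    static final int READS_PER_TX = 30;  // assumed page reads per transaction
    static final int THREADS = 10;

    // Transactions per minute for a given cache hit ratio in [0, 1].
    static double tpm(double hitRatio) {
        double perReadMs = hitRatio * HIT_MS + (1.0 - hitRatio) * MISS_MS;
        double txMs = READS_PER_TX * perReadMs;
        return THREADS * 60_000.0 / txMs;
    }

    public static void main(String[] args) {
        for (double h : new double[] {0.0, 0.5, 0.9, 0.99}) {
            System.out.printf("hit ratio %.2f -> %.0f tx/min%n", h, tpm(h));
        }
    }
}
```

Even in this crude model, throughput grows slowly at first and then explodes as the hit ratio approaches 1, which is the same qualitative shape as the graph.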
Though the Postgres buffer cache is emptied when the engine is restarted, the filesystem cache is not; it persists until you reboot Linux or explicitly drop it (as root: echo 3 > /proc/sys/vm/drop_caches).