Image courtesy: truelab
Now, if you want to know the benefits of these two technologies, there are tons of articles and videos available online. You can also check the truelab article “Symfony and AngularJS: the perfect match”.
So our company decided to migrate its desktop application to a full-fledged AngularJS application, powered by Symfony2 on the backend for the REST APIs. Everything worked really well and the application was a big hit — until the number of records we were showing on screen crossed 1000 and the application stopped loading. This post is all about the optimizations we did to make the app shine again. Following is the list of problems and their solutions:
Ajax calls stacking up:
Asynchronous Ajax calls are the beauty under the hood that makes your app look and perform like a desktop app. As the name suggests, they are asynchronous — yet when we investigated, we found that they were all stacking up. Let me explain: on one screen we were making three calls to fetch three huge lists in the background. Ideally these calls should fetch their data independently, so the user sees each piece of information as soon as it arrives. However, the calls were queuing up and the data was received one after another. With small lists this problem never surfaced, but once a list grew to 1000 items the delay reached minutes.
Symfony locks the session for the duration of a request to prevent session corruption by concurrent requests. This is a nice feature, but if a request takes a long time, all subsequent requests just sit there waiting to get hold of the session. So we added the following code in our controllers to free the session early.
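A minimal sketch of that fix (the action and repository names here are hypothetical; `$session->save()` is the standard Symfony call that persists the session data and releases the lock):

```php
<?php
// In a Symfony2 controller action: read whatever you need from the
// session first, then release the lock so parallel Ajax calls from
// the same user are no longer serialized behind this request.
public function listAction(Request $request)
{
    $session = $request->getSession();
    $userId  = $session->get('user_id'); // read before releasing

    // save() writes the session to storage and releases the lock.
    // Avoid writing to the session after this point in the request.
    $session->save();

    // ...long-running work, e.g. fetching a big list (hypothetical service)...
    $items = $this->get('app.item_repository')->findAllAsArray();

    return new JsonResponse($items);
}
```

With this in place, the three background calls no longer wait on each other, since each one gives up the session lock before doing its heavy lifting.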
Huge content returned from server:
As the number of items increased, the size of the response reached 2 MB. This was huge, it was consuming a lot of bandwidth, and we realised that we had missed a basic performance step.
Yes, we missed it. Content encoding (gzip) is a powerful feature and most modern browsers support it. After realising this, we made sure the Apache module “mod_deflate” was enabled and added the following to the .htaccess file.
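The snippet was roughly along these lines (a typical mod_deflate configuration; the exact MIME-type list is up to you, but `application/json` matters most for REST responses):

```apache
<IfModule mod_deflate.c>
    # Compress text-based responses, including the JSON our REST APIs return
    AddOutputFilterByType DEFLATE text/html text/plain text/css
    AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```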
This reduced the size of the response to a few KB (compressed).
Along with the above improvements, there are a few more things you can do:
- Use the Symfony profiler and look at the SQL queries being made during your call. Our entities were interlinked, so while generating the JSON response Doctrine was lazy loading all the connected entities — 3 extra queries per row. Imagine the number of queries being made for 1000 records (3000 database queries). So we decided to use DQL to execute custom join queries, selecting only the columns we actually needed.
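As an illustration of that approach (entity and field names here are made up; `createQuery()` and `getArrayResult()` are the standard Doctrine APIs), a DQL query like this joins the related entity once and returns plain arrays instead of triggering per-row lazy loads:

```php
<?php
// Hypothetical example: one explicit JOIN, scalar columns only.
// getArrayResult() skips entity hydration entirely, which is both
// faster and safe to serialize straight to JSON.
$rows = $em->createQuery(
    'SELECT i.id, i.name, c.label AS category
     FROM AppBundle:Item i
     JOIN i.category c'
)->getArrayResult();
```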
- Consider denormalising your tables to avoid joins where you can. We replicated the content of one table (which had just two columns) into the main table. This removed one join and gave us a huge performance benefit when fetching long lists.
- Make sure proper indexes are created on the columns in your tables, and always EXPLAIN ANALYZE your queries to check whether the indexes are actually used.
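For example (table and column names are hypothetical; the EXPLAIN ANALYZE syntax shown is PostgreSQL's):

```sql
-- Index the column used in WHERE / JOIN clauses
CREATE INDEX idx_item_category_id ON item (category_id);

-- Then verify that the planner actually uses it:
-- the plan should show an Index Scan rather than a Seq Scan
EXPLAIN ANALYZE SELECT * FROM item WHERE category_id = 42;
```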
After all these improvements our server calls became a lot quicker and they were now delivering a huge amount of data. We should have been happy, but we were not — this created a new problem. The browser took too much time to render the list and almost froze, and we were refreshing these lists every 20 seconds. So we had a screen full of data, but you could not click on anything.
Beware of ng-repeat:
ng-repeat is a very useful directive and makes your life easy, but it is not at all “big list” friendly. We researched and actually peeped into the ng-repeat source code to see if we could do something. The interesting thing we found is that ng-repeat creates an index for every row (to have a unique reference) and tries to recreate it every time the list is refreshed.
So we returned a unique id for every item in our list from the server and changed the ng-repeat expression to use that id as the index by adding “track by”. This reduced the rendering time to a few milliseconds.
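The change is a one-liner in the template (list and property names here are illustrative; `track by` is the standard AngularJS ngRepeat syntax):

```html
<!-- Before: rows are keyed by object identity, so every refresh
     throws away and rebuilds the whole DOM list -->
<li ng-repeat="item in items">{{ item.name }}</li>

<!-- After: "track by item.id" lets Angular match new data to
     existing DOM nodes and only re-render rows that changed -->
<li ng-repeat="item in items track by item.id">{{ item.name }}</li>
```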
There are a lot of other things you can do to improve your web app, like caching static data or serving the JS files from a Content Delivery Network, but we had implemented those already.
So after all these changes, the web app now easily handles 2000 items (imagine that).