Making the AWS CodeDeploy process for Magento 2 faster
The days when every change was deployed by copying files over FTP are happily behind us; now it's time for CI/CD systems. In my case it's Atlassian Bamboo. My current project is a set of e-stores for one customer, built on the Magento 2 platform. The infrastructure runs on Amazon: EC2 instances, RDS Aurora, ElastiCache (Redis), CodeDeploy, and Lambda.
Let's dive into the process. Bamboo pulls the current branches from the Bitbucket repositories (configuration, custom modules) and copies them into a clean Magento 2 tree, then applies patches (both Magento's and custom ones). The next step is dependency injection compilation and static content compilation. The final step builds and compresses the artifact, which is then pushed to an S3 bucket. Clicking the "Deploy" button pushes it from the S3 bucket to the CodeDeploy agents.
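The build plan could be sketched roughly as the following shell script. The directory names, patch locations, and bucket name are illustrative placeholders, not the actual project layout; only the `bin/magento` compilation commands are real Magento 2 CLI calls.

```shell
#!/bin/sh
# Rough sketch of the Bamboo build plan. Paths and the S3 bucket
# name are hypothetical; adjust to the real project structure.
set -e

BUILD_DIR=./magento2-clean            # pristine Magento 2 codebase (assumption)
cp -R ./config/.  "$BUILD_DIR"/       # overlay configuration repo
cp -R ./modules/. "$BUILD_DIR"/app/code/   # overlay custom modules repo

cd "$BUILD_DIR"
for p in ../patches/*.patch; do       # apply Magento and custom patches
    patch -p1 < "$p"
done

bin/magento setup:di:compile                # dependency injection compilation
bin/magento setup:static-content:deploy -f  # static content compilation

# build and compress the artifact, then push it to S3
tar -czf ../artifact.tar.gz .
aws s3 cp ../artifact.tar.gz s3://example-deploy-bucket/artifact.tar.gz
```

From there, a CodeDeploy deployment is triggered against that S3 object, which is what the "Deploy" button in Bamboo does.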
Using AWS CodeDeploy is very handy: when the number of site visitors grows rapidly, the system can add instances on demand. But deploying a new version of the e-store takes quite long - more than 3 minutes to complete a deployment.
I started examining the CodeDeploy agent, which is written in Ruby, and found some interesting things. For example, it repackages all the files into its own archive named bundle.tar (which actually has .zip file headers), and the RubyZip library it uses is limited to 65536 files (though that is not so important in this context). The main problem is the "Processing Bamboo artifact files in CodeDeploy" step: Magento consists of a huge number of files, and processing that many files is always slow.

I figured I could compress just the Magento files into a single .tar inside the artifact and decompress it only during the deploy process - faster than making changes to CodeDeploy itself. And it works: deployment now takes about 30 seconds per instance.
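The trick can be demonstrated in isolation with plain `tar` (file and directory names below are illustrative, not the project's real paths): the build side packs the whole Magento tree into one inner tarball, so the CodeDeploy agent only has to process a handful of files, and a deploy-time hook unpacks it on the instance.

```shell
#!/bin/sh
# Minimal sketch of the "inner tarball" trick. Instead of shipping
# tens of thousands of Magento files through the CodeDeploy agent,
# the artifact contains one magento.tar that a lifecycle hook
# extracts on the instance. Names here are illustrative.
set -e

# --- build side: pack the Magento tree into a single file ---
mkdir -p demo/magento
printf 'x' > demo/magento/index.php     # stand-in for the Magento codebase
tar -cf demo/magento.tar -C demo magento

# --- instance side (deploy hook): unpack in place ---
mkdir -p demo/deploy
tar -xf demo/magento.tar -C demo/deploy
```

In the real pipeline the unpack step would live in a hook script referenced from the artifact's appspec.yml (for example in the AfterInstall lifecycle event), so the extraction happens on each instance during the deployment rather than in the agent's file-processing step.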