DEV: ColdFusion/Lucee, Amazon Web Services, BitBucket, and Deployment Scripting

In the process of setting up a fresh new web application project this week, I decided to give Amazon Web Services another try. I had dabbled in it before, but it's a pretty hefty service, and at the time I just didn't have the extra resources and energy to dig in too far.

If you aren't familiar, AWS is (in part) a "web hosting" service. I say that with quotes because it's actually WAY more than I can get into here. But for the purposes of this article, AWS in this context is simply a cloud-based server virtualization platform.

My goal was simple: take the bits of the new application I had already completed locally (really just the UI wireframe of the app, at this point) and get them out onto a server where the client could view them, while implementing source control via BitBucket so that I could make changes and push them to a central repository. And, since I was dreaming, I wanted to make it so that tagging a repo push a certain way would trigger the deployment scripts and push the app out to the server where the client could see the updates.

Now, I already run my own hosting service, with VPSs from a very well known ColdFusion-knowledgeable datacenter. But I've been considering all of the benefits that AWS has to offer, and the potential to spend a lot less every month for the pleasure. AWS has been showing up in a lot of job listings lately; experience with it seems to be expected more than ever these days. I recently came off a project where AWS was in use, but because I was not a designated server admin, I was not exposed to the service directly. Still, the means by which they were putting code into source control and doing deployments was intriguing, and I have wanted to do it myself on my projects for a while. I have one other client where I set it up, but using my own servers.

So with this project, I started by setting up my AWS account, and discovered that if I used one of their smallest EC2 Linux-based server instances, it was FREE for an entire YEAR: up to 30GB of storage and unlimited bandwidth. A small amount of memory and CPU, but more than sufficient for what amounts to a demonstration of the front end of a web app with no data or real processing going on. (For the record, the app is a ColdFusion/Lucee app, with CFWheels as the underlying framework.)

On my first attempt, I spent a day learning all about it and experimenting with an instance I created, based on RHEL. Within 5 minutes I had a server up and running and responding to web requests. But my attempts to configure Apache and Lucee, and to get an SSL certificate running, were a failure. I was fine with it; this was, after all, my very first attempt. The overall result was, "I definitely want to learn more!" Mainly because the admin interface that AWS offers is pretty stellar.

So that evening, after pretty much screwing up most of the config files, I terminated the machine and it disappeared from my account.

Now, I had originally used an Amazon AMI to set up the RHEL machine instance. The AMI was offered directly by Amazon, so I felt I could trust it. And, it was free. But there are zillions of AMIs, both from Amazon and from third parties, both free and for pay. So I set out looking for an AMI that was already configured with Lucee on it, to jump start my progress for round two.

Sure enough, I found one, and not only was it the latest version of Lucee, but it was also running on the same server OS that my current servers are using: CentOS 7. Score! I launched an instance from that AMI and fired it up. It took some time to poke around and see how they had configured Apache and Lucee, and they didn't leave much as far as instructions go. But in about an hour, I was able to set up a new website on the machine.

When I had it working as far as I needed, I took a "snapshot" of it, using the AWS admin interface. I did this so that in the worst-case scenario, I could roll back to a known-good starting point. This would turn out to be a good move, because I once again tried to set up an SSL certificate, this time using "Let's Encrypt", which provides a set of scripts to sorta automate the creation and installation of a cert. But it failed, AND it broke my install, so that the website that was working no longer did. I rolled back to the snapshot in a matter of about 3 minutes and picked up where I left off!

I opted to use CloudFlare as a frontend proxy, and they offer SSL certs at no cost, so I ended up going that route for now... because I wanted to move on to source control and webhooks.

I had already pushed my project up from my local machine to a BitBucket repo, so all I needed to do was pull the current master branch from the repo into the web directory of my new AWS server. After playing with some SSH keys, it worked just fine.
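For anyone wanting to replicate that one-time setup, it boils down to a deploy key plus an initial clone. This is a sketch under my own assumptions, not the author's exact commands; the key filename, repo address, and web directory are placeholders:

```shell
# One-time setup on the server: a read-only deploy key plus an initial clone.
# The key path, repo URL, and target directory below are placeholders.
mkdir -p "$HOME/.ssh"

# Generate a deploy key on the AWS server; paste the .pub half into the
# BitBucket repository's access keys (read-only is enough for pulls).
ssh-keygen -q -t ed25519 -N "" -f "$HOME/.ssh/bitbucket_deploy"

# Then pull the current master branch into the web directory, e.g.:
#   git clone --branch master git@bitbucket.org:youruser/yourapp.git /var/www/yourapp
```

A dedicated deploy key (rather than your personal SSH key) means the server can pull the repo but can never push to it, which is exactly what you want on a production box.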

The next step was to take some existing deployment scripts that I had copied into my web app, and modify them so that when I pushed code up to the repo from my dev machine, it would automatically trigger a call to a URL on the AWS server, which would run the Git script that pulls the latest branch and updates the code in production.
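The server-side step that URL kicks off can be very small. Here's a hedged sketch of what such a script might look like, not the author's actual scripts; the function name `deploy_master`, the directory, and the email address are my own placeholders:

```shell
# Hypothetical sketch of the server-side deployment step the webhook URL
# triggers. deploy_master, the path argument, and the email address are
# placeholders for your own setup.
deploy_master() {
    app_dir=$1
    cd "$app_dir" || return 1
    git fetch -q origin master            # grab the latest master from the repo
    git reset -q --hard origin/master     # make the web directory match it exactly
    echo "Deployed $(git rev-parse --short HEAD)"
    # e.g. pipe that line to: mail -s "app deployed" you@example.com
}
```

Calling something like `deploy_master /var/www/yourapp` from the webhook handler (via cfexecute, say) is one way to wire it up. Using `reset --hard` instead of `git pull` means stray edits made directly on the server can never cause a merge conflict mid-deploy.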

After modifying my scripts, I made a code change, pushed it up to the repo, and moments later I got the email I had configured my live AWS website to send, indicating that the deployment script had executed.

Next, I decided that since I was already in this area of the code, I would see if I could fine-tune the deployment, so that NOT EVERY PUSH to the repo would trigger a deployment to production. This is important because by default, BitBucket runs the webhooks no matter WHAT branch you are pushing to. So if you have a master branch that always represents your production environment, you might have a development branch... and based off that, you might have feature branches, release branches, bug fix branches, and more... worked on by multiple people. Pushing to THOSE branches also runs the webhook, and those pushes certainly shouldn't cause the production environment to be updated.

So, git "tagging" to the rescue. My goal became "Since I know the webhook is going to call my deployment scripts ALL the time, I need to use the additional payload that bitbucket sends WITH those calls, and look for a specific tag before I actually call the script that pulls from the master branch to production."

I settled on a convention: First, you develop your code, do your commits, and push up to the origin master branch as you see fit. Nothing happens on the production server. Then, when all your commits are in (and if you're on a team, their commits are in too), and they are all tested and working, you decide that you now have a "release", so you increment your application's patch or minor version, update the release notes that go with it, and commit. Then, you tag that commit with "release_vX.X.X" and push the tag to master. BitBucket runs the webhook as it always does with each push, and this time, the tag is included in the payload. The deployment scripts on the server check the payload, and since there's a tag with a name that starts with "release_", THAT triggers the actual deployment of the code from BitBucket into the production server.

This could be further modified, so that more scripts are run to update a database schema, etc.

This is a great way to work, EVEN IF YOU ARE THE SOLE DEVELOPER.

As far as the different ways you can work with GIT and BitBucket (or github or whatever), that's outside the scope of this post. For the record, I've chosen the "OneFlow" methodology for what I'm doing. More information here:

http://endoflineblog.com/oneflow-a-git-branching-model-and-workflow

But if you are opting to stick with your own scheme or the ubiquitous "GitFlow" method, nothing changes... you still want to limit actual deployments to production to when YOU are ready to have them happen, monitor their results, and test.

This works for big or small teams, and doesn't require any other third party apps like Jenkins, so it's a fast startup with a smaller learning curve.

I am providing a sample webhook script below; you can tear it apart and modify it to your heart's content. Ask any questions you like, I'll help when I'm around!


<!--- add to your bitbucket repository's PUSH webhooks as: http://www.example.com/bitbucket-webhook.cfm?action=deployProduction --->

<cfset strJson = "">
<cfset strName = "">
<cfset html = "">
<cfset boolDeployToProduction = false>

<cftry>

    <cfset strJson = toString( getHTTPRequestData().content )>
    <cfset strJson = ReplaceNoCase( strJson, "payload=", "", "ALL" )>
    <cfset strJson = URLDecode( strJson )>
    <cfset objJson = DeserializeJSON( strJson )>

    <cftry>
        <!--- the tag or branch name from the push; may be absent for some event shapes --->
        <cfset strName = objJson.push.changes[1].new.name>
        <cfcatch><!--- leave strName empty if the payload has no name --->&nbsp;</cfcatch>
    </cftry>

    <cfsavecontent variable="html">     
        <cfdump var="#strName#" expand="true">
        <cfdump var="#objJson#" expand="false">
    </cfsavecontent>    

    <cfcatch>

        <cfsavecontent variable="html">         
            <cfdump var="#cfcatch#">
        </cfsavecontent>        

    </cfcatch>

</cftry>


<cffile action = "write"
        file = "#GetDirectoryFromPath(GetCurrentTemplatePath())#webhook_result.htm"
        output = "#html#">

<cfparam name="url.action" default="null">

<cfif ListFirst(strName, "_") IS "release" AND url.action IS "deployProduction">
    <cfset boolDeployToProduction = true>
</cfif>


<cfif boolDeployToProduction>
    <cfoutput><p>Calling production deployment script for release: #ListLast(strName, "_")#</p></cfoutput>
    <!--- INSERT YOUR CODE HERE THAT RUNS GIT SCRIPTS THAT WILL PULL FROM YOUR MASTER BRANCH INTO YOUR PRODUCTION SERVER DIRECTORY --->
</cfif>

<cfoutput>  
    <p>Requested actions (#url.action#) Complete</p>
</cfoutput>
