Marcos Placona Blog

Programming, technology and the taming of the web.

Category: General Techie Stuff

All of the general techie talk that doesn’t fit in any of the specific categories

How to secure Apache with Let’s Encrypt and CloudFlare on CentOS

Reading time: 4 – 6 minutes

I took it upon myself to convert a couple of my domains to use Let’s Encrypt in order to offer a secure connection to them. If you haven’t heard about Let’s Encrypt by now, you have probably been living under a rock. If that’s the case though, have a read of this page and you’ll get up to speed with it.

Their getting started page describes the entire process of installation, but that didn’t really resonate with me. Upon some googling I found a great Digital Ocean article which made a lot more sense to me. That is an absolutely fine tutorial if you’re not using CloudFlare. If you came to this article from a Google search though, chances are you’re also using CloudFlare and are having issues like some of the following:

  • Failed authorization procedure
  • The following ‘urn:acme:error:unauthorized’ errors were reported by the server
  • urn:acme:error:unauthorized :: The client lacks sufficient authorization ::

Hopefully this article will show you how to get that nice green padlock showing on your website. Props to the article on CloudFlare’s support page that took me halfway through the process.

Install the dependencies

I usually SSH to my server to get these things done, but this step may vary if you access your server differently.

On your terminal, start by installing the EPEL (Extra Packages for Enterprise Linux) repository:
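
Something along these lines should do it (this assumes a recent CentOS release where epel-release is available straight from yum):

    sudo yum install epel-release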

Then install Git. We will use it to get the latest version of the Let’s Encrypt client.
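
Again, via yum:

    sudo yum install git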

Download and install Let’s Encrypt Client

Start off by cloning the repository and then saving it to /opt/letsencrypt. Feel free to save it elsewhere but /opt is a good location for third party packages.
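
The clone looks something like this (the URL is the Let’s Encrypt client’s repository on GitHub at the time of writing):

    sudo git clone https://github.com/letsencrypt/letsencrypt /opt/letsencrypt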

Generate a new SSL certificate

This is where we diverge from the Digital Ocean article, as we will generate our SSL certificate using the webroot option.
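
The full command looked roughly like this (the webroot path, email address and domains below are placeholders, so swap them for your own):

    cd /opt/letsencrypt
    ./letsencrypt-auto certonly --webroot \
      --webroot-path /var/www/mysite \
      --renew-by-default \
      --email you@example.com \
      --text --agree-tos \
      -d example.com -d www.example.com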

We’ve used the following flags for this setup.

  • --webroot-path is the directory on your server where your site is located. This is not your webserver’s root directory but your website’s
  • --renew-by-default selects renewal by default when domains are a superset of a previously obtained certificate
  • --email is the email used for registration and recovery contact
  • --text displays text output
  • --agree-tos agrees to Let’s Encrypt’s Subscriber Agreement
  • -d specifies hostnames to add to the SAN. You can specify as many domains and subdomains as you need here, as shown above

After you run that you should get a message saying your certificate chain has been saved.


Apparently I also need to read about upgrading Python on CentOS without breaking everything

Setting up the SSL certificate with Apache

With your certificate created, it’s time to tell Apache to use it. On your terminal run:
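
A sketch of what that looks like (the domains are placeholders, and the exact invocation may differ slightly depending on your version of the client):

    cd /opt/letsencrypt
    ./letsencrypt-auto --apache -d example.com -d www.example.com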

And you should get a screen that looks like this:


Apache still doesn’t know about this new certificate, but we’re about to change that by selecting option 1 and, on the subsequent screen, choosing whether we want to make HTTPS required or optional. I chose Secure here as I want all of my requests to be redirected to HTTPS.

You should then end up with a confirmation screen that tells you to check that your certificates are correctly installed. This procedure will have modified your httpd.conf file to add redirects, so all non-HTTPS requests are now redirected to HTTPS.


Go ahead and hit those URLs and you should see that they both get a grade A pass.


Updating CloudFlare

We need to tell CloudFlare that we now have an SSL certificate and want the communication to our website to use it. On CloudFlare’s dashboard for your chosen website choose Crypto and under SSL choose Full (Strict). You will probably want to use Flexible here if during the previous step you chose HTTPS to be optional.


At this point you should be done and your website should be showing a nice green padlock on the URL bar.


If you’re using WordPress, you will also want to update it so the URL is always HTTPS. You can do that by going into the WordPress Admin and navigating to Settings > General.


And that will make sure every image and every URL on your WordPress site is served via HTTPS.

I’m joining Twilio

Reading time: 2 – 2 minutes

So the time has come for me to move on and accept a new challenge. As of October 20th, I’ll be joining Twilio as a Developer Evangelist.

Twilio has been no stranger to me for quite a while now, and when I saw an open position in their devangelism team, I wasted no time and applied for it.

The whole process took a bit of time, but both Twilio and I were quite keen to make sure we would be a good fit for each other.

I had a number of telephone interviews, and then flew over to Twilio HQ 3.0 for the final round of interviews. The process was quite thrilling, and I got so excited about it that this was the only company I actually applied to. Halfway through the interview process, after having met and spoken to some of the cleverest people I have ever come across, I knew I wanted nothing more than to work with them.

I am super excited at the prospect of not only writing a lot of code and working with some amazingly clever people, but also helping other developers write some kick-a$$ code at conferences, meetups, hackathons, Stack Overflow… or the pub.

Also, I will loudly and proudly wear my Twilio jacket to make sure people know they can approach me to have a chat about any Twilio integration, development in general, or life if they fancy it.

Here’s an example of what I will be doing.

So stay tuned, and get in touch!

UK Top 40 albums & singles JSON

Reading time: 1 – 2 minutes


So I had this idea for a little application and wanted to get the UK’s Top 40 singles to use in it. I started by writing something that would scrape Radio 1’s Top 40 chart and return a list of songs, since I couldn’t find any feeds that would give me that.

I then thought this could be of use to somebody else, so I turned it into a little service (built using Ruby and Sinatra) that returns a JSON object with all the singles and some useful information: the number of weeks each has been in the chart, how much it has moved, and in which direction (up or down).

At its root, it returns the chart date along with a retrieval date indicating when the data was last fetched. I am caching the feed to play nice with Radio 1, so I’m only making one HTTP request a day to their website.

Check it out at

Also feel free to fork it, and collaborate by adding your country’s top 40

My NodeJS app development experience

Reading time: 3 – 5 minutes

So, I decided to take a punt at writing a NodeJS application. Not a big application, or anything that would get me slashdotted, but an application that would help me understand the language and give me a taste of what it is so many people are talking about. I have worked with JavaScript for quite a while now, and NodeJS seems like one of those “just right” languages that you pick up in an afternoon and come to love after writing a few lines of code. I read a few tutorials on the subject, and then decided to dive straight into building a small application that uses GitHub’s API. There were three things I wanted to take out of it:

Installing everything was painless, but I have to admit getting NodeJS to build on a Raspberry Pi wasn’t the most straightforward thing I’ve ever done. Not because it wouldn’t work, but because building it from source took forever. Just to clarify, I didn’t need to run it on a Raspberry Pi; I just wanted to prove to myself that I could. This mini project had 5 elements to it:

Three of which you don’t even need to install (as they are npm modules). Just by adding them as dependencies in the package.json file, they get installed into your project, so deploying is made easier (in theory), as you don’t actually need to install anything other than NodeJS.

Now you might be asking yourself why I say “in theory” when talking about deployment. The fact is that deploying a NodeJS application is a pain, since all you end up doing is making [your preferred webserver here] proxy all the calls to your local application. Why? you may be asking…

The fact that each NodeJS application you create runs its own server means it will clash with your already installed webserver, since you can only have one webserver listening per TCP port. So if you want to host all your existing applications as well as any new Node project you come up with, you will end up taking this route. It’s not terrible, but it does mean you now need to support yet another thing. There are tools out there to help with this, but let’s not digress.
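
Just to make that route concrete before moving on, here is a rough sketch of it on a CentOS-style box; the port, file names and domain are my own assumptions rather than anything from the project:

    # Run the Node app on its own high port (assumes a hypothetical app.js that reads PORT)
    PORT=3000 node app.js &

    # Then have Apache reverse-proxy public traffic to it via mod_proxy
    cat <<'EOF' | sudo tee /etc/httpd/conf.d/node-app.conf
    <VirtualHost *:80>
        ServerName myapp.example.com
        ProxyPass / http://127.0.0.1:3000/
        ProxyPassReverse / http://127.0.0.1:3000/
    </VirtualHost>
    EOF
    sudo service httpd restart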

Building the application was fairly painless, and using Jade was the big highlight for me. For a while now I had been looking into templating engines and really wanted to delve into them a little more. I had previously looked at jQuery Templates, Ext and Dust.js, but Jade is totally different, in a way that actually pleases the eyes (except when it doesn’t :-)).

Did I become a NodeJs expert? No way!

Will I keep looking and building sexy applications with it? Heck yeah!

Enough about me though.

Have a look at the application I wrote here, and make sure to send me pull requests if you would like to improve it.

Managing your dotfiles the right way

Reading time: 3 – 4 minutes

It’s no secret that in the UNIX world, dotfiles play a very important part when it comes to making your terminal look good. Be it on Linux or on a Mac, dotfiles are there so you can configure your favourite software to look just the way you like it.

I especially use dotfiles to customize the look of my terminal, and to manage the bundles I use with Vim. One thing that normally annoys me is the fact that whenever I rebuild my machine (or build a new one) I need to copy over my dotfiles, and obviously make sure they are kept up-to-date on all my devices when I change something.

I’ve heard about people adding their dotfiles to GitHub, and even noticed GitHub themselves encourage you to do the same. I decided to give it a go, and will describe here what you need to do in order to have your dotfiles stored there, and most importantly, how to quickly load them up on any other boxes you may have.

Start by creating a folder called “dotfiles” in your home directory, and move all your dotfiles into it.
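
Something along these lines, assuming (as I do below) that you want to track your vim and bash files; the exact list is up to you:

    mkdir ~/dotfiles
    mv ~/.vimrc ~/dotfiles/vimrc
    mv ~/.vim ~/dotfiles/vim
    mv ~/.bashrc ~/dotfiles/bashrc
    mv ~/.bash_profile ~/dotfiles/bash_profile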

In the example above, I’m only covering my vim and bash dotfiles. You can cover as many as you like by simply moving your files into the dotfiles directory.

Now it’s time to create your install script, also under the ~/dotfiles directory. You should use this script every time you want to install your dotfiles on a given machine. So let’s open vim (or your favourite text editor) and create the following file:
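
Mine looks more or less like this; I’m calling it install.sh here, and the file list simply matches the files moved above:

    #!/bin/bash
    # install.sh - creates symlinks from the home directory to the files kept in ~/dotfiles

    dir=~/dotfiles                         # dotfiles directory
    files="vimrc vim bashrc bash_profile"  # files/folders to symlink in home

    cd "$dir"
    for file in $files; do
        echo "Creating symlink to $file in home directory."
        ln -sf "$dir/$file" ~/."$file"
    done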

The file is pretty simple, as it contains a list of the files you want to manage, and a loop that puts them in the right place by creating symlinks in your home directory.

Now, it’s really important that you name this file correctly, as you want to be able to execute it. After creating it, we need to give it execute permissions so we can run it:
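
Assuming you went with install.sh as the name:

    chmod +x ~/dotfiles/install.sh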

If you run this file now, you should see the following happening on your screen:

Good. We’re ready to push this to GitHub, so we never have to go through this process again.

Go ahead and create a new repository on GitHub called dotfiles.

Now, back to your terminal and run the following commands:
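
The commands were roughly these (swap the remote URL for your own GitHub username):

    cd ~/dotfiles
    git init
    git add .
    git commit -m "Adding my dotfiles"
    git remote add origin git@github.com:yourusername/dotfiles.git
    git push -u origin master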

From top to bottom we’ve done the following:

  1. navigated to our newly created directory
  2. turned it into a local git repository
  3. added all the files it contained into our local Git repository
  4. committed all the added files into the new local repository
  5. hooked up our local git repository with the GitHub repository
  6. pushed our files to the remote (GitHub) repository

If everything went OK, you should now be able to browse your remote repository and see all your dotfiles (as well as your installer) listed there.

Now, the beauty of it is that whenever you want your dotfiles on any other box, you can simply do the following:
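
Again, with the username as a placeholder:

    git clone git@github.com:yourusername/dotfiles.git ~/dotfiles
    cd ~/dotfiles
    ./install.sh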

All your settings will be copied and applied.

As previously mentioned, my dotfiles are on GitHub, so feel free to fork them, modify them, and maybe even collaborate.
