Tuesday, June 16, 2015

Installing PhantomJS as a Windows Service (BONUS: Using PhantomJS with ColdFusion)

If you aren't familiar with PhantomJS, you probably wouldn't be looking at this blog post. Regardless, as the project site puts it: "PhantomJS is a headless WebKit scriptable with a JavaScript API. It has fast and native support for various web standards: DOM handling, CSS selector, JSON, Canvas, and SVG." It is available at http://phantomjs.org/.

PhantomJS is a very powerful and versatile tool. Once you get comfortable with PhantomJS you may want to install your PhantomJS application as a Windows service. There are a few good reasons for this, but one is running a persistent PhantomJS web server.

If you are interfacing with PhantomJS from another application, it is simple enough to create a new instance of PhantomJS each time you need one (e.g. via <cfexecute> in ColdFusion). The problem is that this results in multiple PhantomJS processes, one for each instance, which leads to unnecessary system overhead. Alternatively, you may want to keep a persistent instance of PhantomJS running as a Windows service to handle multiple requests from your application.

For this to be practical, the persistent instance of PhantomJS has to be able to accept requests and provide responses to your application. In my case, I created a PhantomJS web server application to achieve this. It listens to HTTP requests on an obscure port only available internally. The PhantomJS website has some helpful examples for setting such a web server up. From there, my ColdFusion application simply needs to interface with my PhantomJS server via HTTP.
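
To make that concrete, here is a stripped-down sketch of such a server, modeled on the webserver module examples from the PhantomJS site. The file name phantom.your-script.js and the echo handler are placeholders; a real script would do whatever work your application needs (render a page, scrape content, etc.):

    // phantom.your-script.js - minimal persistent PhantomJS web server (sketch)
    var system = require('system');
    var server = require('webserver').create();

    // The port is passed as the first command-line argument (<port-number>)
    var port = system.args[1];

    var listening = server.listen(port, function (request, response) {
        // Placeholder handler: echo the requested URL back to the caller
        response.statusCode = 200;
        response.headers = { 'Content-Type': 'text/plain' };
        response.write('PhantomJS is up. You requested: ' + request.url);
        response.close();
    });

    if (!listening) {
        console.log('Could not listen on port ' + port);
        phantom.exit(1);
    }
    console.log('PhantomJS web server listening on port ' + port);

From ColdFusion, interfacing then comes down to a single <cfhttp> request to http://127.0.0.1:<port-number>/ rather than a <cfexecute> call per request.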

None of this is helpful unless the PhantomJS web server is up and running. By registering the PhantomJS application as a Windows service, we can ensure it is up and running at all times. We can also stop it, start it, and restart it as needed.

Before proceeding, note that this process requires changes to your registry. I hold no responsibility for changes you make; all changes are your own responsibility. I have successfully used this process on both a Windows 7 64-bit install and a Windows Server 2008 R2 64-bit install.

Before turning your PhantomJS application into a Windows service, ensure you can run it from the command line without failure. For example: C:\apps\your-application\bin\phantomjs.exe phantom.your-script.js <port-number>, where <port-number> represents the port number your phantom.your-script.js is set up to accept as an argument.

Download the necessary tools (srvany.exe and instsrv.exe, which ship with Microsoft's Windows Server 2003 Resource Kit Tools) from: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9d467a69-57ff-4ae7-96ee-b18c4790cffd&displaylang=en
  • I suggest installing to C:\apps\rktools\
  • We will need these exe files:
    • C:\apps\rktools\srvany.exe
    • C:\apps\rktools\instsrv.exe

Run at CMD: C:\apps\rktools\INSTSRV.EXE "Phantom JS" C:\apps\rktools\SRVANY.EXE
  • "PhantomJS" is an arbitrary service name. Call it whatever you like. You can use spaces here, but maintain the quotes.

Open regedit.exe
  • Navigate to HKEY_LOCAL_MACHINE > SYSTEM > CurrentControlSet > Services > "Phantom JS" (your service name)
  • Right-click on the service name > New > Key
    • Name the key "Parameters"
  • Right-click on the Parameters key > New > String Value
    • Name the String value "Application"
    • Right-click "Application" > Modify
      • Set the value to: "C:\apps\your-application\bin\phantomjs.exe" phantom.your-script.js <port-number>
        • Make sure the path to the exe is quoted and the arguments are not. This should be essentially the same string you would pass at the CMD line to run your PhantomJS application, but with quotes around the exe path. Again, <port-number> represents the port number your phantom.your-script.js is set up to accept as an argument.
  • Right-click on the Parameters key > New > String Value
    • Name the String value "AppDirectory"
    • Right-click "AppDirectory" > Modify
      • Set the value to: C:\apps\your-application\bin\
        • This is the path to your application folder containing the PhantomJS exe and your PhantomJS script.

Your entries should look something like this:
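In .reg export form (a sketch using the example paths and the "Phantom JS" service name from the steps above; note that .reg syntax requires backslashes and double quotes inside values to be escaped):

    Windows Registry Editor Version 5.00

    [HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Phantom JS\Parameters]
    "Application"="\"C:\\apps\\your-application\\bin\\phantomjs.exe\" phantom.your-script.js <port-number>"
    "AppDirectory"="C:\\apps\\your-application\\bin\\"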
Open services.msc and you should see your new Windows service present. It is set to start automatically by default, but go ahead and start it for the first time to ensure it runs without failure.
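
You can also stop and start it from CMD with the standard service commands (using whatever service name you chose):

    NET STOP "Phantom JS"
    NET START "Phantom JS"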

Why "Googling Yourself" is not an excuse to ignore keyword rank tracking for SEO

I was recently asked my opinion on some SEO advice that had been shared with a client. The advice was essentially: ignore keyword rank tracking, because all Google results are personalized and you can't accurately track rankings by "Googling Yourself"; instead, look to organic traffic in Google Analytics as the sole indicator of SEO success.

First off, "Googling Yourself" seems like a bit of a misnomer for the act of checking Google ranks by Google searching a keyword phrase yourself. I think of Googling my name when I read that but it is evident what they mean when read in context. Ironically, one of the biggest issues I have with this has to do with context itself... read on to see what I mean.

There are a few things right about this:

  1. Keyword rank tracking can be considered unreliable, since search results are personalized.
  2. "Googling Yourself" can provide misleading information on keyword rankings since your results are personalized.
  3. Organic traffic in Analytics is a great indicator of overall SEO success.


Beyond those points, this advice falls well short of effective SEO, and there are, in fact, ways to reliably check keyword rankings.

While "Googling Yourself" is not a reliable way to check rankings and Google results are indeed personalized, it doesn't mean keyword rankings don't matter for SEO. This is why we built our own in-house keyword rank checking tool that uses Google's Search API to determine ranks. The Search API doesn't factor any personalization in, but it also isn't an exact mirror of the Google.com search engine. Even so, we believe it to provide an authoritative baseline and the ranks returned are the most reliable ranks we can get.

Keyword rankings and activity are important.

Knowing how users found your website, specifically what keyword terms were used, is very beneficial. This lets you know what is working, what isn't working, and sheds light on user perception and intent. In the past, Google Analytics reported on the keywords people used to reach your website. This has since been all but stopped by Google (which is an entirely separate topic). However, you can get some referral keyword insight through Google Webmaster Tools. It isn't ideal, but it does offer some insight.

That said, knowing what keyword searches led to visits is a passive exercise. Keyword research and targeting is an active exercise. Why discount this just because it is difficult to assess rank? Why accept that overall organic traffic patterns are enough to indicate SEO success?

Being pro-active about SEO is better than watching organic traffic numbers passively.

It is true that the best overall indicator of how well you are doing on SEO is organic traffic. That is the end goal in almost every case: more traffic from the search engines, period. The more organic traffic, the better you are doing, but how does this help you be pro-active about SEO? It simply doesn't. It doesn't even help you be reactive, because you have no insight into why your organic traffic is what it is. In this scenario, you simply implement some best practices (hopefully), hope for Google's favor, and anxiously wait for your organic traffic to increase. Google is smart enough that this is actually still pretty effective, but it is missing a huge piece of the equation... keyword rank tracking.

Without keyword research and tracking, you can't know which keywords are competitive or low-hanging fruit, effective or ineffective. With keyword research and tracking, you can come up with a plan to target specific phrases. This is where you can get an edge on rankings and pro-actively increase your organic traffic. Google wants to provide the best results to its users, but it needs the help of websites to make that possible. If your website cooperates by offering up clean and clear indications of context, you are helping Google identify what your website is all about. You are also laser-focusing your content toward that context, which helps build authority in Google's eyes. You can partly accomplish this laser-focus through a natural but focused application of keyword phrases in all the right places (headings, lists, titles, descriptions, etc.). Do not mistake this for keyword stuffing, but approach it as naturally editing your copy to provide consistency and focus.
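
As a purely hypothetical illustration (the phrase and markup are invented for the example), that kind of focused-but-natural placement might look like this in a page's HTML:

    <!-- Hypothetical page targeting the invented phrase "custom birdhouse kits" -->
    <title>Custom Birdhouse Kits | Example Co.</title>
    <meta name="description" content="Build a backyard home for bluebirds or wrens with our custom birdhouse kits, pre-cut and ready to assemble.">
    <h1>Custom Birdhouse Kits</h1>
    <h2>Why start with a custom birdhouse kit?</h2>
    <ul>
      <li>Kits sized for wrens, bluebirds, and chickadees</li>
      <li>Pre-cut cedar panels, no power tools required</li>
    </ul>

The phrase shows up in the title, description, and headings where it reads naturally; it is not crammed into every sentence.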

In summary:

Search rankings should not be discounted or ignored. Checking ranks on specific search terms (via a tool like ours) isn't going to provide an accurate view of your overall organic search performance, only of the subset of specific terms you determine worthy of tracking. However, those specific keyword rankings do affect the personalized search results everyone sees. If you rank high on highly relevant keyword searches, your chances of ranking high in related personalized results are much higher. Rankings also offer invaluable insight into how well Google associates you with what you believe the context of your website to be. If you are going after a specific niche, targeting specific keyword phrases and monitoring their performance is a huge part of reaching that niche. It all helps Google understand context (who you are) and authority (what you know or have to offer).

Client example:

  • Passive (keyword targeting strategy NOT in place): 1035 organic visitors (12/2014)
  • Pro-active (keyword targeting strategy in place): 1723 organic visitors (2/2015)

The December sample shows how the website was doing on organic traffic without keyword research and targeting; the February sample shows how it was doing after keyword research and targeting had been in place for only a few weeks.

There is too much potential benefit to ignore keyword rankings simply because Google personalizes results.