Saturday, April 13, 2013

Javascript: Safely reading a nested property


In some templating frameworks it can be really annoying to read a nested property of a JS object, as it often means chaining a heap of null checks together....

For instance, if I need to safely access a nested property 'to' in 'email.addresses.to', I have to write something like:

if (email && (addresses = email.addresses)) {
  //print addresses.to
}

This is verbose and annoying. I needed a function that would return the nested value, or simply return an empty string if any property in the chain was null or undefined.

i.e.

safeRead(email, 'addresses', 'to');

I also wanted property chains to be as long or as short as I'd like.... i.e.:

safeRead(my, 'very', 'deeply', 'nested', 'property');

The finished product:
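A minimal sketch of the idea (the argument handling and the empty-string fallback here are assumptions, not the original listing):

// Walks each property name in turn, returning '' as soon as any
// link in the chain is null or undefined
function safeRead(obj /*, prop1, prop2, ... */) {
  var current = obj;
  for (var i = 1; i < arguments.length; i++) {
    if (current === null || current === undefined) return '';
    current = current[arguments[i]];
  }
  return (current === null || current === undefined) ? '' : current;
}

safeRead(email, 'addresses', 'to'); // => email.addresses.to, or ''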


Thursday, April 11, 2013

IE Ajax requests returning 401 Unauthorized in Rails / Sinatra

Here's a quick little nugget of info for any devs experiencing ajax issues in IE....

Firstly, earlier (<= IE8) versions of IE cache all ajax responses, which can be a pain to resolve without compromising (breaking through) server side cache.... I wrote an article here about that...

To add another drop to the ocean of pain that is IE, I found that on Windows 7 (and Windows 7 only), in IE7, IE8 and IE9, all AJAX requests were consistently returning 401 Unauthorized statuses. After much mining through code and system settings, a workmate and I discovered that in Windows 7, all ajax requests send an uppercase ACCEPT_LANGUAGE header, whereas regular synchronous requests send a lowercase one.....

This may seem inconsequential, but for those developing a rack based app using rack-protection, this is enough to trip the session-hijacking check, which compares this header with previous requests (https://github.com/rkh/rack-protection/blob/master/lib/rack/protection/session_hijacking.rb#L23) ...

As the case differs, the equality check fails, resulting in rack-protection blocking the call and returning 401 Unauthorized.

Not a fun bug.

The solution is to either downcase the header client side for all ajax requests (i.e. via $.ajaxSetup), or to introduce some custom middleware that downcases the offending header before rack-protection checks it.
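For the middleware route, a minimal sketch (the class name is illustrative):

# Rack exposes the header as env['HTTP_ACCEPT_LANGUAGE']; normalising its
# case before rack-protection runs means the session-hijacking check
# compares like with like.
class DowncaseAcceptLanguage
  def initialize(app)
    @app = app
  end

  def call(env)
    header = env['HTTP_ACCEPT_LANGUAGE']
    env['HTTP_ACCEPT_LANGUAGE'] = header.downcase if header
    @app.call(env)
  end
end

# In config.ru, the ordering matters:
#   use DowncaseAcceptLanguage
#   use Rack::Protection
#   run MyApp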

Tuesday, March 26, 2013

Backbone.JS and SEO: Google Ajax Crawling Scheme


Most search engines hate client side MVC, but luckily there's a few tools around to get your client side routes indexed.

As most web bots (e.g. Googlebot) don't interpret javascript on the fly, they fail to parse javascript rendered content for indexing. To overcome this, Google (and now Bing) support the 'Google Ajax Crawling Scheme' (https://developers.google.com/webmasters/ajax-crawling/docs/getting-started) - which basically states that IF you want js rendered DOM content to be indexed (i.e. rendered ajax call results), you must be able to:
  1. Trigger a page state (javascript rendering) via the url using hashbangs #! (e.g. http://www.mysite.com/#!my-state), and
  2. Serve a rendered dom snapshot of your site AFTER javascript modification, on request.
If you use a client side MVC framework like Backbone.js, or simply have a javascript heavy page whose various states you wish to get indexed, you will need to provide this dom snapshotting service server side. Typically this is done using a headless browser (e.g. Qt WebKit, PhantomJS, Zombie.js, HtmlUnit).

For those using ruby server side, there's a gem which already handles this, google_ajax_crawler, available on RubyGems.

gem install google_ajax_crawler


It's used as Rack middleware: it intercepts a request made by a web bot adhering to the scheme, scrapes your site server side, then delivers the rendered dom back to the requesting bot as a snapshot.

A simple rack app example demonstrating how to configure and use the gem:
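Something along these lines (a sketch only - the configuration options are assumptions; the gem's README on GitHub is the source of truth):

# config.ru
require 'sinatra'
require 'google_ajax_crawler'

use GoogleAjaxCrawler::Crawler do |config|
  # how long the headless browser should wait for client side rendering
  # to finish before snapshotting the dom (option name assumed)
  config.page_load_timeout = 5
end

class App < Sinatra::Base
  get '/' do
    "<html><body><a href='#!about'>A client side route</a></body></html>"
  end
end

run App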

Wednesday, December 12, 2012

Sinatra Asset Snack: Coffeescript and SASS compilation for Sinatra

Up until recently, most of my RIAs have been built using Backbone.JS and Sinatra, with the Sinatra Assetpack gem handling asset compilation and pipelining. Unfortunately, I recently hit some performance issues with Sinatra Assetpack.

Generally speaking it's great at managing coffeescript and SASS compilation and minification on the fly. However, I found that as my codebase grew, each time I fired up a development server it took way too long to clear its cache, recompile and load a page. This was a real drag when working on UX, as the recompilation time made development slow and clunky. Even after warming its asset cache, serving assets via assetpack in development and test environments was really preventing quick page loads and becoming annoying.

In response, I wrote a simple gem to slim down the asset serving codebase and handle runtime compilation of coffeescript and SASS in a faster fashion. It's released on RubyGems:

gem install sinatra-asset-snack

At the moment it handles only SASS and Coffeescript compilation, and lets you bundle scripts into common files (i.e. application.js). For example:
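Along these lines (a sketch - the registration and route-mapping method names are assumptions; see the gem's README for the real DSL):

require 'sinatra/base'
require 'sinatra/asset_snack'

class App < Sinatra::Base
  register Sinatra::AssetSnack

  # bundle several coffee files into one compiled, common application.js
  asset_map '/javascripts/application.js', ['./app/coffee/**/*.coffee']
  asset_map '/stylesheets/application.css', ['./app/styles/**/*.scss']
end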

Minification isn't handled yet, mainly because most sites (should) use gzip compression, which makes minification a largely secondary concern.

Should anyone want to write any additional compilers for other syntaxes feel free! The code can be found at https://github.com/benkitzelman/sinatra-asset-snack

Monday, November 05, 2012

S3FS: Mounting an S3 Bucket in Ubuntu


Recently I had an app on EC2 that needed to manage files in an S3 bucket. Normally I would use a gem to handle the uploading etc., but I figured I would give s3fs a crack.

Essentially s3fs allows you to mount an S3 bucket as an external file system, allowing you to manage files transparently as part of the OS.

It was relatively simple to get it working on an EC2 instance running Ubuntu 10.04.3 LTS.

OVERVIEW:

 - install the dependencies
 - install s3fs
 - create the password file
 - add the s3 bucket to fstab
 - mount and symlink to deployment dir


Installing Dependencies:
$ sudo apt-get install build-essential libfuse-dev fuse-utils libcurl4-openssl-dev libxml2-dev mime-support


Install Fuse 2.8.4:
$ wget http://sourceforge.net/projects/fuse/files/fuse-2.X/2.8.4/fuse-2.8.4.tar.gz/download -O fuse-2.8.4.tar.gz
$ tar xvzf fuse-2.8.4.tar.gz
$ cd fuse-2.8.4
$ ./configure --prefix=/usr
$ make
$ sudo make install


Install S3FS:
$ wget http://s3fs.googlecode.com/files/s3fs-1.61.tar.gz
$ tar xvzf s3fs-1.61.tar.gz
$ cd s3fs-1.61/
$ ./configure --prefix=/usr
$ make
$ sudo make install


Create /etc/passwd-s3fs:
$ sudo vim /etc/passwd-s3fs

Populate it with
[your_aws_access_id]:[your_aws_secret]

Change permissions:
$ sudo chmod 640 /etc/passwd-s3fs

Add Mount Point:
$ sudo mkdir /mnt/bucket

Add to FSTAB:
s3fs#[bucket-name] /mnt/bucket fuse _netdev,default_acl=public-read,use_cache=/tmp,use_rrs=1,allow_other 0 0

Symlink to Mount Point:
$ ln -s /mnt/bucket /[deployed_app_path]/public

Once Off Mount Command (FYI):
$ AWSACCESSKEYID=[your_aws_access_id] AWSSECRETACCESSKEY=[your_aws_secret] sudo /usr/bin/s3fs [bucket-name] /mnt/bucket -odefault_acl=public-read

Friday, July 20, 2012

3 Character to 2 Character Country Codes

For anyone else who needs to map 3 character to 2 character country codes in ruby, I have provided the following class, based on the ISO 3166-1 country code list found at http://en.wikipedia.org/wiki/ISO_3166-1#Current_codes
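The shape of the class (the entries below are a truncated illustration; the full hash is built from the ISO 3166-1 list at the link above):

class CountryCodes
  # alpha-3 => alpha-2, per ISO 3166-1 (truncated here for brevity)
  ALPHA3_TO_ALPHA2 = {
    'AUS' => 'AU', 'CAN' => 'CA', 'DEU' => 'DE', 'FRA' => 'FR',
    'GBR' => 'GB', 'JPN' => 'JP', 'NZL' => 'NZ', 'USA' => 'US'
    # ... remaining ISO 3166-1 entries ...
  }

  def self.two_char_code(three_char_code)
    ALPHA3_TO_ALPHA2[three_char_code.to_s.upcase]
  end
end

CountryCodes.two_char_code('AUS') # => "AU"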

Saturday, July 14, 2012

Google-api-client Authorizing with an API Key in Ruby

The documentation for the Google RESTful APIs is generally pretty good, however when playing with the google-api-client ruby gem (developed by Google for accessing their APIs), I ran into a few issues, particularly when authenticating using an API key (rather than OAuth).

After installing the google-api-client gem, getting a Google API Key (https://code.google.com/apis/console/), and setting up a custom search account (with its prefs widened to all web pages - http://www.google.com/cse/)....

The following allowed me to trawl google search results (copy paste into irb, then inspect response when finished):

  require 'openssl'
  # dev-only hack: disable SSL peer verification to avoid local cert errors in irb
  OpenSSL::SSL::VERIFY_PEER = OpenSSL::SSL::VERIFY_NONE

  require 'google/api_client'
  client = Google::APIClient.new(:key => 'your-api-key', :authorization => nil)
  search = client.discovered_api('customsearch')

  response = client.execute(
    :api_method => search.cse.list,
    :parameters => {
      'q' => 'the hoff',
      'key' => 'your-api-key',
      'cx' => 'your-custom-search-id'
    }
  )

THE MOST IMPORTANT BIT was the :authorization param when constructing the client.... this ensures the api key is used when calling, in preference to OAuth. Without it you will get a 401 Unauthorized response status every time.

Thursday, July 12, 2012

ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)

Somehow my root account lost its 'ALL PRIVILEGES' option, and was instead listing all individual privileges when running

show grants;

as root, instead of:
+----------------------------------------------------------------------------------------------------------------------------------------+
| Grants for root@localhost                                                                                                              |
+----------------------------------------------------------------------------------------------------------------------------------------+
| GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost' IDENTIFIED BY PASSWORD '*76854B38A7923CC05E7857229F508E66E89D69AD' WITH GRANT OPTION |
+----------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)

This was preventing me from assigning all privileges on *.* to other users....
The solution for me (in OSX 10.6.8) was to run in terminal:

mysql_upgrade

which rebuilt my grants table. Then, after logging back into mysql as root, I retried the grant cmd:
GRANT ALL PRIVILEGES ON *.* TO 'myuser'@'localhost';

Monday, June 11, 2012

Sinatra, BackboneJS and CoffeeScript Bootstrap

Just a quick post.... For those who want a quick way to kickstart a ruby / backbonejs app, and maybe do it in Coffeescript, I added a real basic bootstrap / project starter to my github account (https://github.com/benkitzelman/sinatra-backbone-bootstrap).

It also integrates Jasmine, allowing JS tests to be written in Coffeescript as well, which is nice :)

It uses the following tech stack:

GEMS

  • Sinatra
  • Sinatra-assetpack
  • Coffee-Script
  • SASS
  • Jasmine
  • Thin

JS LIBS

  • Modernizr 2.5.3
  • BackboneJS 0.9.2
  • UnderscoreJS 1.3.3
  • JQuery 1.7.2

CSS FRAMEWORK

  • Skeleton 1.1

Tuesday, January 18, 2011

Detect the Adobe Reader Plugin

Recently I had to detect the Adobe Reader Plugin in Javascript. I have included the code I used to flag whether the Adobe Reader Plugin is installed, as well as to get the version of the installed plugin. This code will detect:

  • Adobe Reader Plugin for Firefox
  • Adobe Reader Plugin for IE (<5 and 5+)
  • Adobe Reader Plugin for Chrome
  • WebKit PDF Reader for Safari
  • The PDF Reader for Chrome (Chrome's default alternative to the Adobe Reader Plugin)
  • Adobe Reader Plugin for most other browsers

To call it in javascript:

var info = getAcrobatInfo();
alert(info.browser+ " " + info.acrobat + " " + info.acrobatVersion);
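The function itself, condensed to a sketch (the plugin and ActiveX control names below are the commonly documented ones; treat the details as assumptions rather than the post's original listing):

var getAcrobatInfo = function() {
  var getBrowserName = function() {
    var agent = navigator.userAgent.toLowerCase();
    if (agent.indexOf('chrome') > -1) return 'chrome';
    if (agent.indexOf('safari') > -1) return 'safari';
    if (agent.indexOf('msie') > -1) return 'ie';
    if (agent.indexOf('firefox') > -1) return 'firefox';
    return agent;
  };

  // IE only: probe for an ActiveX control, swallowing failures
  var getActiveXObject = function(name) {
    try { return new ActiveXObject(name); } catch (e) {}
  };

  // everyone else: find a plugin by name in navigator.plugins
  var getNavigatorPlugin = function(name) {
    for (var key in navigator.plugins) {
      var plugin = navigator.plugins[key];
      if (plugin.name == name) return plugin;
    }
  };

  var getPDFPlugin = function() {
    return this.plugin = this.plugin || (function() {
      if (getBrowserName() == 'ie') {
        // AcroPDF.PDF is Reader 7+, PDF.PdfCtrl is 6 and earlier
        return getActiveXObject('AcroPDF.PDF') || getActiveXObject('PDF.PdfCtrl');
      }
      return getNavigatorPlugin('Adobe Acrobat') ||
             getNavigatorPlugin('Chrome PDF Viewer') ||
             getNavigatorPlugin('WebKit built-in PDF');
    })();
  };

  var getAcrobatVersion = function() {
    try {
      var plugin = getPDFPlugin();
      if (getBrowserName() == 'ie') {
        // GetVersions() returns e.g. "1=10.1.2,..." - take the number after '='
        return parseFloat(plugin.GetVersions().split(',')[0].split('=')[1]);
      }
      return plugin.version ? parseFloat(plugin.version) : plugin.description;
    } catch (e) {
      return null;
    }
  };

  return {
    browser: getBrowserName(),
    acrobat: getPDFPlugin() ? 'installed' : false,
    acrobatVersion: getAcrobatVersion()
  };
};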


Wednesday, December 01, 2010

SQLite and Doctrine: in memory databases

Recently, for a client's project, I incorporated a SQLite in memory database into my persistence tests, as it's a good (and fast) alternative to managing separate unit test and system databases. Typically, in my test setup / tear down methods, I drop and recreate the unit test database to ensure clean testing conditions.... however I found that Doctrine doesn't support dropping SQLite in memory databases.

I don't believe the standard sql DROP statement works on an in memory db. Because of this, when calling Doctrine::dropDatabases(), Doctrine's SQLite driver produces a 'Database could not be found' exception in its dropDatabase method.

As the SQLite in memory databases exist for the life of the connection only, a good solution is to simply reset the connection.

Burrowing through the Doctrine framework, within the SQLite Driver (Doctrine/Connection/SQLite.php) I made the following modification:


public function dropDatabase()
{
    if ( ! $dsn = $this->getOption('dsn')) {
        throw new Doctrine_Connection_Exception('You must create your Doctrine_Connection by using a valid Doctrine style dsn in order to use the create/drop database functionality');
    }

    $info = $this->getManager()->parseDsn($dsn);

    //
    // BENS EDIT: If the db is in memory only - simply recreate a new connection
    //
    if (strcasecmp($info["dsn"], "sqlite::memory:") == 0)
    {
        $c = $this->getManager()->getCurrentConnection();

        $this->getManager()->closeConnection($c);
        $this->getManager()->connection("sqlite::memory:", "unit_test");

        return;
    }
    //
    // END BENS EDIT
    //

    $this->export->dropDatabase($info['database']);
}

There are probably better ways to do this - so feel free to leave comments - however the above worked for me...

Thursday, November 25, 2010

CodeIgniter and PHP Howto - Embedding images in Email

DOWNLOAD THE CODEIGNITER EMAIL EXTENSION HERE

I recently had an issue in a client's development which required images to be embedded in html email code (<img src='cid:embeddedImage' />), rather than referenced via a public url (i.e. <img src='http://www.mysite.com/myImage.png' />).





QUICK OVERVIEW - THE PUNCH LINE

I have written a CodeIgniter Library Extension to the basic Email class that facilitates the embedding of images in emails. You can download the code HERE.

To implement it, follow the instructions for Extending Native Libraries on the CodeIgniter website http://codeigniter.com/user_guide/general/creating_libraries.html

Finally, in the body of your email, use the following macro - making sure the class_id attribute matches the one used in the img src attribute (i.e. <img src='cid:my_image' />). An example message body would be as follows:


<html>
<body>
<img src="cid:my_image" />
</body>
</html>

// Macro in Windows
{embedded_image file=C:\\my_image.png class_id=my_image}{/embedded_image}

//Alternative Linux Macro
{embedded_image file=/var/my_image.png class_id=my_image}{/embedded_image}


And that's it - the library encodes the image file as a base64 string and embeds it in your email.

PROS AND CONS OF EMBEDDING IMAGES IN EMAIL:

PROS

- Images are immediately displayed when opening a message, rather than the client prompting the user to allow remote content... great for newsletters
- Emails are independent, once downloaded, they don't require a live connection to view in all their glory

CONS

- A little more coding, and some knowledge of the email message format, is required if your framework does not support embedding images in email
- There is some conjecture about spam filtering for bulk messages with embedded images: some believe spam filters are more pessimistic about emails with embedded images when sending to more than 100 addresses

HOW TO EMBED AN IMAGE

I'm not going to go too far into coding native php to email with embedded images (unless I get requests), so I'll give a brief overview and some references.

Like most things on the web, email message content is defined in a series of envelopes, and these envelopes are defined by content type. You can read about all of them here: http://www.freesoft.org/CIE/RFC/1521/15.htm

One of the most useful images exemplifying the content structure of a html email can be found at http://www.phpeveryday.com/articles/PHP-Email-Using-Embedded-Images-in-HTML-Email-P113.html
A bit of a warning though - the article itself has buggy code and examples.

Grossly speaking, if your email supports html with embedded images, plus a plain text alternative (for non html email clients), it should render the following content in the email body:



Content-Type: multipart/alternative; boundary="UNIQUE_ID_1"

--UNIQUE_ID_1
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 8bit

.... plain text alternative content for your html email....

--UNIQUE_ID_1
Content-Type: multipart/related;
 boundary="UNIQUE_ID_2"

--UNIQUE_ID_2
Content-Type: text/html; charset=utf-8
Content-Transfer-Encoding: quoted-printable

.... html content referencing <img src="cid:class_id_referenced_in_img_src" /> ....

--UNIQUE_ID_2
Content-Type: image/jpeg;
 name='embedded_image.jpg'
Content-Transfer-Encoding: base64
Content-ID: <class_id_referenced_in_img_src>
Content-Disposition: inline;
 filename='embedded_image.jpg'

(Base64 encoded binary for the image)

--UNIQUE_ID_2-- 

--UNIQUE_ID_1--


NOTE on the above:

A leading -- opens a content section for the named boundary, and a trailing -- closes it. When the content type is 'multipart', you can define more than one content section under the same boundary identifier, as you are specifying either alternative content or related content. multipart/alternative means one of the specified content sections is used; multipart/related means one content section references the others. We use the multipart/alternative content type to specify the html and plain text message alternatives, and multipart/related to relate the html content with the embedded image content.

Monday, November 22, 2010

noscript tag and google index

The noscript tag is indexed by google! I recently released work for a client which used the noscript tag to display the typical 'this site needs javascript' warning when javascript was disabled in the visitor's browser.... unfortunately I did this in the head of my page AND used an h3 element to title the warning segment....

As a result, most pages on the site were indexed with 'Javascript not enabled' as the title, and the site's meta description as the content. Not happy.

Best practice - if you are going to use the noscript tag for simple javascript warnings, put it at the bottom of the rendered page.... many sites do this (including StackOverflow). It might also help not to use any heading elements (i.e. h1, h2, h3), and instead rely on styling another element as a heading.
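For example (illustrative markup only):

<body>
  <!-- real page content first, so crawlers index the actual copy -->
  ...
  <noscript>
    <!-- a styled div rather than an h1-h3 heading -->
    <div class="js-warning">This site requires javascript to work properly.</div>
  </noscript>
</body>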

Sunday, November 07, 2010

Free DNS Servers: Free DNS Online

For the last few years of web development, I have had to find a couple of reliable free online DNS services to manage my own and my clients' domains. I thought I'd post up a few and rate my experience with them.

Free DNS Online - My Experiences


Zone Edit : A great free DNS online service, 100% private and 100% free. Accounts used to be able to serve up to 5 domains, however recently this has been restricted to 2. The user interface / administration feels a bit clunky and old school web, but it's pretty straightforward and easy to use.

FreeDNS : Another good free DNS online service, however your entries can be viewed by other users. Other users can configure subdomains off your domain, though you can pipe this through an authorisation process (where you, the administrator, are emailed to confirm any config changes). Payment is required if you want to make your listing private.

DNSExit : An uncapped free DNS online service. You can have as many accounts as you require... you can even set up dynamic DNS, which is nice. Very slow though - and I have some questions regarding reliability. Recently the service was brought down by a DDoS attack for a couple of days.... have a backup secondary DNS on an alternate name server if possible.

DynDns : A free dynamic DNS online service. It's been around for years - I can't recommend it enough for publicly serving from your dev box. Great for limited serving (i.e. demoing sites during development). It has a range of other paid services - however, a bit of googling will uncover free alternatives.


Tuesday, November 02, 2010

Detect Popup Blocker: Popup Blocker Detection for Chrome + all browsers

Popup blocker detection for all major browsers including Chrome....





Chrome popup blocker detection is a little different to other browsers', in that Chrome returns a valid window object from window.open even with popups disabled. There are a heap of posts around all basically stating that to determine whether this window object has been blocked by Chrome, we need to test whether the innerHeight of the popup has been set to 0.
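A sketch of that check (the delay is arbitrary - Chrome needs a moment before innerHeight is meaningful):

function openCheckedPopup(url, onBlocked) {
  var popup = window.open(url, '_blank', 'width=600,height=400');

  // most browsers return null when the popup was blocked
  if (!popup) return onBlocked();

  // Chrome hands back a window object regardless, so test it shortly after:
  // a blocked popup reports an innerHeight of 0 (or closes itself immediately)
  setTimeout(function() {
    if (popup.innerHeight === 0 || popup.closed) onBlocked();
  }, 250);
}

openCheckedPopup('http://example.com', function() { alert('Popups are blocked'); });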

Friday, October 15, 2010

Google Analytics - multi domain named site tracking



Another quick blog... Recently I added analytics to a couple of sites with more than one domain name. It's easy to see how much of your traffic comes from each hostname (Visitors -> Network Properties -> Hostnames), but I wanted to overlay how much traffic was arriving from each domain (to see which one was being advertised most effectively).....

I was able to achieve this by creating a Custom Segment.

When first logging into the Google Analytics Dashboard for the desired site / account, just above the Date Range, there is the Advanced Segments drop down list. Clicking on that drop down I was able to create a new Advanced Segment.

In the Advanced Segment editor, I could expand the Content submenu from the Dimensions menu on the left, and drag the 'HOSTNAME' field into my segment.

I then set the Condition field to 'CONTAINS', and entered the base hostname of the additional domain I wanted to analyse separately (i.e. myseconddomain.com).

After naming and saving the segment, I can apply it via the Advanced Segments drop down in the dashboard, and voilà....

Friday, October 08, 2010

CodeIgniter - Supporting Multiple Domains in config["base_url"]

A very quick blog.....

Within the site config file (application_folder/config/config.php), a base_url property is set. This is read by the base_url() method to generate server side redirections......

If you are creating a site which has more than one domain name (i.e. www.domain_one.com and www.domain_two.com), it's probably a good idea to dynamically create this value in the config file. This way, the domain name is preserved when redirecting between pages.


//...config ...//
$config['base_url'] = 'http';
if (isset($_SERVER["HTTPS"]) && $_SERVER["HTTPS"] == "on") $config['base_url'] .= "s";

$config['base_url'] .= "://";

if ($_SERVER["SERVER_PORT"] != "80") $config['base_url'] .= $_SERVER["SERVER_NAME"].":".$_SERVER["SERVER_PORT"];
else $config['base_url'] .= $_SERVER["SERVER_NAME"];

$config["base_url"]."/";
//... config ...//

Wednesday, October 06, 2010

IE Cache and AJAX: Cache Busting Ajax requests

Yet another 'special case' caveat for Internet Explorer, the red headed step child of the browser family..... (sorry to any red headed step children who might be reading this - chalk it up to the savage injustices of life). I discovered that IE cache and Ajax requests are not the best of friends compared to how other browsers handle Ajax requests.



Recently I found that IE cached ajax requests in a CodeIgniter + ExtJS site. I was using url rewriting, so all GET params were encoded as URI segments...

eg. (http://host/controller/action/param1/param2)

The Problem:
Usually, I would use ExtJS's inherent cache busting tools (Ext.Ajax.disableCaching, which normally defaults to true).... but due to the url rewriting, the ExtJS method caused issues. Query string values (?blah=value) are disallowed in my app due to the url rewriting, so Ext's native cache disabling does not work, as it simply appends a uid to the querystring (?_dc=123453443343). This caused 'disallowed character' exceptions.

Furthermore - I couldn't simply add a random value to the end of a request, as it could be misinterpreted as an actual parameter by actions whose parameters have default values

eg. http://host/controller/action/param1/param2/no_cache01223213312

no_cache01223213312 could be misinterpreted as param3 in the following action:
public function action($param1, $param2, $param3 = "default value")
{
//..//
}


The Solution:
The Big Stick:
Whether or not you use an MVC framework or URL rewriting, the first thing you should consider is making sure the 'Pragma' header is set to no-cache on all Ajax actions..... so in php, write the header somewhere before content is returned to the browser

header("Pragma: no-cache");

This can really suck as it blows away all your lovely server side cache, introducing a potential performance bottleneck to your app, all because Dwayne Dibley is still browsing your site using IE.

The ExtJS (Javascript) way:
The Ext solution was to intercept all AJAX requests at the page level, and add a random POSTED variable to the parameter listing.


Ext.Ajax.disableCaching = false;
Ext.Ajax.addListener("beforerequest", function(conn, options) {
  // ensure a params hash exists, then add a unique POST param to each request
  if (!options.params) options.params = {};
  options.params.cacheBuster = Ext.id();
}, this);



This forces a server side request, as each request is unique (thanks to the random post variable). It also allows me to freely specify GET params in the rewritten url, as I am adding a POST variable to uniquify the request.

For generic javascript.... when calling the target url, simply append a generated dummy query string parameter (like a timestamp):

"http://myhost.com/myAjaxPage.php?_nocache=321313213445462"

Again, the same caveat applies as with the pragma header - you would probably want to make this cache busting parameter conditional on browser type.

Saturday, October 02, 2010

Extra Bucks Online

So in the past I have blogged a bit about finding an eBusiness which is quick to set up and get running. I'm pleased to announce that my latest attempt at this is entering its final proofing.....

ExtraBucks is now online and in its final proofing stages. The official launch date will come soon.... Essentially it is an odd jobs and piece work bulletin board, designed to link people who want to earn a bit extra outside business hours with people who need the odd job done. All 100% free to use. Check it out - have a play and feel free to leave some comments!

Monday, August 09, 2010

Calling the php interpreter within web script

Recently, part of a development required me to execute shell script as part of a web request. The design was to kick start a long running php process (via shell script) which would run beyond the length of a standard web request. To do this, I used forking shell script and popen... (if you're using shared hosting, shell execution methods are generally disabled - check php.ini's disable_functions config value to verify).

Within my dev environment (Windows) I was able to execute the following with no issues:

popen("start /b php my_script.php my_args", "w");
Note: 'start /b' forks a windows process - making it run in the background



When executing its equivalent in a Linux environment
popen("php my_script.php my_args &", "w");

In Linux, the server was thrown into a state of confusion, perpetually starting then aborting the requested php script in the popen command. Obviously, explicitly calling the php interpreter from within an executing php process is a particularly nasty thing to do.

I'm assuming this sort of thing doesn't happen in Windows, as shell script executes in its own command window (which, when using proc_open with proc_get_status, makes getting a correct PID value problematic). Anyone with further insight - feel free to leave comments.

To get around this issue, I ensured that in the linux environment the shebang path to the interpreter appeared on the first line of the executed script, and that the script was chmod'ed to allow direct cli execution, i.e.:

my_script.php:
#!/usr/local/bin/php
<?php
//
// my code
//
?>

CHMOD command used:
chmod 755 my_script.php

Then finally the linux popen command:
popen("./my_script.php my_args &", "w");

The default working directory of popen, and all other shell script execution methods is always web root (so my_script.php sat in the web root dir).

The same issue occurs with all other shell script execution methods:
system, exec, proc_open, passthru, backticks (`), etc.