L2Program: The rantings of a lunatic Scientist

Posts marked as Web Dev


New Concept Site

L2Program Web Dev

It would hardly have been worth making a RESTful API for the site if I wasn’t going to build multiple interfaces, now would it…

Over the last couple of days I have put together an alternative front end for L2Program, for the purpose of showing additional site designs in my portfolio. Because my RESTful API project turned out so well, hooking the API up to the new front end was quick and easy: all it took was a new PHP and JavaScript layer which wrapped the API data in the appropriate HTML.

This was also an opportunity to improve the API by adding JSONP support. JSONP, or “JSON with Padding”, is a technique used to get around the cross-domain restriction (the same-origin policy) that applies to AJAX requests. Effectively, AJAX can only occur within the current domain/subdomain, so from my subdomain http://new.l2program.co.uk/ I was not even able to make an AJAX request to my API on the main http://l2program.co.uk domain.

If you think of JSON as a block of data, then JSONP is simply that same block of data passed as the sole argument of a function call.

{"colour": "red", "width": 100} is JSON

my_callback({"colour": "red", "width": 100}) is JSONP

One thing you may notice about JSONP is that, rather than being a string-based object notation, JSONP is JavaScript! Which is brilliant because, as anyone who has included jQuery from ajax.googleapis.com or any other CDN will know, JavaScript files can be included asynchronously from any domain, internal or external!

So in my API I simply added a clause to all the unauthenticated GET methods saying: if a GET parameter ‘callback’ was sent along too, then compute the result as normal but pad it in parentheses and prepend the callback value.
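The API itself is PHP, but the padding clause is simple enough to sketch. Here it is in JavaScript with purely illustrative names (renderResponse is not the real function):

```javascript
// Illustrative sketch of the server-side JSONP clause.
// The real API is PHP; every name here is an assumption.
function renderResponse(data, callback) {
  var json = JSON.stringify(data);
  if (callback) {
    // JSONP: pad the JSON in parentheses, prepended with the callback name.
    return callback + '(' + json + ')';
  }
  // Plain JSON when no ?callback= parameter was sent.
  return json;
}
```

The padded response is valid JavaScript, so the browser can load it with a script tag and the named callback fires with the data.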

Now in my JQuery I can make a Cross Domain GET request to my API on the main L2Program domain using code like this.
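The original snippet was not preserved, so here is a sketch of the kind of call described; the endpoint path is an assumption. With a callback=? placeholder, jQuery generates a callback name and loads the URL via a script tag, sidestepping the same-origin policy:

```javascript
// A small helper showing the URL jQuery effectively requests
// (the endpoint path is an assumption, not the real API route).
function jsonpUrl(base, callbackName) {
  // Append the callback parameter the server will use as padding.
  return base + (base.indexOf('?') === -1 ? '?' : '&') + 'callback=' + callbackName;
}

// In the page, the equivalent jQuery call would look like:
// $.getJSON(jsonpUrl('http://l2program.co.uk/api/posts', '?'), function (posts) {
//   // jQuery replaces the '?' with a generated function name and
//   // injects a <script> tag pointing at the padded response.
//   renderPosts(posts);
// });
```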

Colour Bar recreation in HTML

Javascript Web Dev

Last night, whilst waiting for the NASA Curiosity landing, one of my friends was showing me his progress on his new CV / portfolio site; at the top of the page he had a banner of repeating coloured rectangles. It looked awesome, and he was even kind enough to link me the image.

It got me thinking, though: why use an image? It’s a simple enough effect, so why not do it in HTML? And so at 5am I set to work on CSSDeck to see what I could do.

Before I show you the code, you can see running demos of the project on CSSDeck.

Here is what I came up with: very simple HTML markup that allows you to either hard code a couple of elements or run the little jQuery function to generate the markup for you for n elements.
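The original listing was lost, so this is a sketch of that little generator function; the palette, class name, and container id are all illustrative, not the originals:

```javascript
// Palette and class names are assumptions standing in for the originals.
var colours = ['#e74c3c', '#f39c12', '#f1c40f', '#2ecc71', '#3498db', '#9b59b6'];

function colourBarMarkup(n) {
  // Build n <div class="bar"> elements, cycling through the palette.
  var html = '';
  for (var i = 0; i < n; i++) {
    html += '<div class="bar" style="background:' + colours[i % colours.length] + '"></div>';
  }
  return html;
}

// In the page, something like:
// $('#colour-bar').html(colourBarMarkup(30));
```

With the bars floated and given a percentage width in CSS, the row stretches across the page.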

It works very nicely and fills the whole width of the screen during resizes, but it is a little bulky for my liking. And essentially all we’re doing is drawing some simple colours… Surely this is a job for HTML Canvas? Right?

And so now, after another session of light development, here is a version using canvas. I couldn’t work out how to draw from the context of a 100%-width canvas (I’m new to this technology), so I have split the function in two: a worker which initially calls the function and sets up the resize event, and the function itself which does the drawing. The CSS has also been simplified and the colour HEX values moved into a JavaScript array.
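Again the original listing is gone, so here is a sketch of that two-part structure; the element id, bar count, and palette are assumptions:

```javascript
// Palette is illustrative, standing in for the original HEX array.
var colours = ['#e74c3c', '#f39c12', '#f1c40f', '#2ecc71', '#3498db', '#9b59b6'];

// Pure helper: pixel width of each bar on a canvas `total` pixels wide.
function barWidth(total, bars) {
  return Math.ceil(total / bars);
}

function drawBars(canvas, bars) {
  // Match the canvas backing store to its on-screen CSS width,
  // then paint each bar side by side.
  canvas.width = canvas.clientWidth;
  var ctx = canvas.getContext('2d');
  var w = barWidth(canvas.width, bars);
  for (var i = 0; i < bars; i++) {
    ctx.fillStyle = colours[i % colours.length];
    ctx.fillRect(i * w, 0, w, canvas.height);
  }
}

// The "worker": draw once now, and redraw on every window resize.
function init() {
  var canvas = document.getElementById('colour-bar');
  drawBars(canvas, 30);
  window.onresize = function () { drawBars(canvas, 30); };
}
```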

New URL schema using URL Rewrite

L2Program Web Dev

Just finished implementing URL rewriting for the site. Effectively this means that normal URLs such as

l2program.co.uk/?id=42

Can be disguised as

l2program.co.uk/post/42/Friends_dont_let_friends_use_Internet_Explorer

Why is this good? I mean, one is considerably longer than the other. But who on earth really types in a whole URL manually to access a specific page of a website? I personally can only ever remember doing so while typing long and ugly URLs off a lab sheet in high school Chemistry class.

Edit: It should also be mentioned that because the post title is optional in the url…

l2program.co.uk/post/42

l2program.co.uk/post/42/

l2program.co.uk/post/42/any_random_string_of_chars_digits_and_underscores

… will all work. As long as the /post/ prefix is followed by a digit (and a slash if anything else follows) you’re good to go! The title string is only there so that it is visible in an anchor tag: a user hovering over the link, or a search engine crawling it, will see that string and be able to tell more clearly where the link leads.

While the new “fake” URLs may be longer, they benefit from being friendlier. If someone hovers over a link and sees the address preview, they will have a much better idea of where their browser is about to be directed. Along with this ‘human’ friendliness, the new URLs also benefit from being friendlier to search engines: by appending a stripped and simplified version of the article title to the end of the URL, we have packed it full of keywords relevant to the page’s content.

The simplest way of implementing URL Rewrite is to modify or create a .htaccess file in the root folder of your site. Here is the .htaccess file I have written.
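The original file is not reproduced here, so the following is a reconstruction of the two rules from the description in this post; treat the exact patterns as an approximation rather than the original listing.

```apache
# Reconstruction of the described rules -- exact patterns are approximate.
RewriteEngine On

# /post/42, /post/42/, /post/42/Some_Title  ->  /index.php?id=42
RewriteRule ^post/([0-9]+)(?:/[a-zA-Z0-9_]*)?$ /index.php?id=$1 [L]

# /post/tag/5/Some_Title  ->  /index.php?filter=tag&value=5
RewriteRule ^post/([a-z]+)/([0-9]+)(?:/[a-zA-Z0-9_]*)?$ /index.php?filter=$1&value=$2 [L]
```

Rule order matters: a numeric second segment is treated as a post id, and only non-numeric segments fall through to the filter rule.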

There are two rewrite rules in the .htaccess file. Let’s have a look at what they are doing by breaking the first one apart:

  • RewriteRule // This keyword tells Apache we want to define a rule
  • ^post/([0-9]+)(?:/[a-zA-Z0-9_]*)?$ // If the URL passed in matches this regex then Apache will swap it with the next parameter…
  • /index.php?id=$1 // The replacement string. $1 is whatever the regex’s first capture group matched; in this case it is the post id

Long story short, that rule matches URLs of the form /post/42/Friends_dont_let_friends_use_Internet_Explorer and turns them into index.php?id=42. Similarly, due to the second rewrite rule, if a URL matches /post/tag/5/Ray_Tracing it will be converted to index.php?filter=tag&value=5. Notice how the titles in the URL, whether of a post or a tag, do not matter when the URL is converted. This is because they are purely there for show, for the benefit of the user or search engine that will be looking at the URL. All that matters is the post id, or the filter and value.

All this happens behind the scenes though. Your users will never know the truth, and you get to sit there smugly while your code works with incredibly simple urls while displaying beautiful friendly ones.

To help give a complete picture, here is the PHP function I use to turn an article’s title string into a string that can be used in a URL. It runs two very simple regexes on the input string and then returns the result. The first regex takes every character or group of characters that is not a letter, number, space, or underscore and removes it by replacing it with an empty string. The second regex finishes the job by replacing any space or group of spaces with a single underscore.

Thus Friends don’t let friends use Internet Explorer… becomes Friends_dont_let_friends_use_Internet_Explorer
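The transformation above can be sketched as follows; the original is PHP, but the same two regexes are shown here in JavaScript since the original listing is not preserved (the function name is illustrative):

```javascript
// Same two-regex idea as the PHP described above; names are illustrative.
function urlTitle(title) {
  return title
    .replace(/[^a-zA-Z0-9 _]+/g, '') // 1. drop anything that isn't a letter, digit, space, or underscore
    .replace(/ +/g, '_');            // 2. collapse each run of spaces into a single underscore
}
```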

Reddit Linker: A Chrome Extension for added sanity

Javascript Web Dev

I finally got sick enough of how Reddit opens title links in the same tab to do something about it. I always took it as a given in web design that internal links open in the same tab unless specified otherwise, and that external links load in a new tab, so as to protect any progress the user may have made within your site/page from being lost… which, with Reddit, it is.

With vanilla Reddit, clicking a link opens it in the current tab, and going back to Reddit via the ‘back’ button does not return you to the same place on the page you left.

With the Reddit Enhancement Suite plugin, which features continuous scrolling, the effect of this misalignment becomes much more frustrating. It is easy to have scrolled 30 virtual pages through Reddit, only to click a link without thinking and come back to find you are on page 0 again, and that auto-loading of images has been turned off.

Shift+X, by the way, if anyone doesn’t know; it’s the same as clicking the Show/Hide Images button at the top of the page.

Anyway, I finally got sick of it and in 2 minutes I whipped up this handy little plugin for Google Chrome.

On page load it simply registers an event for whenever the page is altered. When this happens it sets a ‘timeout’ function to run in 100 milliseconds’ time.

Why in 100ms time? Because the event fires whenever the page is altered, we don’t want our handler to run for every little change when a lot of stuff happens at once. By waiting 100ms we are effectively saying…

If the page has changed, possibly multiple times, and now has become stable for 100ms then fix the page…

Here is the code for the plugin so you can see how simple it really is. The only additional thing that I left out of the code here is that, in the packaged Chrome extension, I prepended the jQuery .js file to the beginning of linker.js and then compiled the whole thing with Google’s Closure Compiler, which is a very neat and powerful JavaScript optimizer and minifier.
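The listing itself did not survive, so here is a sketch of the settle-for-100ms trick at its heart; the selector, event name, and helper names are assumptions about the 2012-era original:

```javascript
// Debounce helper: returns a wrapper that runs `fn` only after `ms`
// of silence, matching the "page has settled for 100ms" idea above.
function debounce(fn, ms) {
  var timer = null;
  return function () {
    clearTimeout(timer);            // A new change arrived: cancel the pending run
    timer = setTimeout(fn, ms);     // and schedule a fresh one ms from now.
  };
}

// In the extension's content script (selector and event are assumptions):
// var fixLinks = function () { $('a.title').attr('target', '_blank'); };
// document.addEventListener('DOMSubtreeModified', debounce(fixLinks, 100));
// fixLinks();
```

However many mutation events fire in a burst, fixLinks runs once, after the page goes quiet.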