Kenmore Community Relaunch 21 Apr 2014 7:10 AM

Completion of the Kenmore Community Blog was quite the feat. It was a strenuous project that pushed my skill set and patience to an entirely new level. For the first time in my career, I had to fill the roles of Front-End Developer, Back-End Developer, and at times, Interactive Designer. It is difficult to put a start and end date on the project, but it lasted around a month. Building the new version of the Kenmore Community Blog required work with various APIs, a content overhaul, a completely new WordPress theme, multi-site updates, and the creation of our own APIs. When it was all said and done, this project was a huge success and even sparked some more work from the client. Most importantly, it was pivotal in pushing myself as an Interactive Artist.

[Image: Kenmore Community blog inspiration screen]

Project Requirements

As this project kicked off, the Kenmore Community already existed. At that point in time, the site carried an old, slapdash theme that no longer held a consistent design with its mother site, kenmore.com. Our team did build the older site (I was the main developer on it some years ago), but as web platforms naturally evolve over time, some children and sibling sites are left behind. This was definitely the case for the Kenmore Community site. The Kenmore Community was an orphan site: one with an old design and a frankensteined code base. It was time for some new surgery to begin.

Deep breath. While we could have re-used the old WordPress (WP) theme that was powering the old site, bootstrapping it to work with the current version of WP seemed like an impossible task all on its own. With a half dozen plugins and comments left by developers who hadn’t worked on our team in over 4 years, this site relaunch was beginning to look like a never-ending task. With that in mind, I started development with a new design and an almost from-scratch theme. It was time to give this site some much needed TLC. Some team members wondered if this was the right approach, so I had to fight for the fresh start. When it was all said and done, I don’t see how we could have finished the project with what was already there.

Before I get into code and pixels, I should point out some other requirements that were bundled in with the relaunch of the Kenmore Community Blog. One of the big “asks” for this project was the idea of adding the blog’s content to the mother e-commerce site, kenmore.com. In doing so, this new content would have to be relevant to the content on the current product pages of kenmore.com. So, the blog content would be featured on various product category pages on kenmore.com in the form of a widget. Simple, right? While this may seem like an easy task, I can assure you it was not. Some old school developers would suggest a simple iFramed solution, which could have worked, but this was not the right direction to take. In order to conceptualize our current solution, we need to first take a deeper dive into the problem.

Connecting the Dots – Building an Extremely Complicated Widget

Simply using an iFrame poses a lot of issues, and to understand why, we need to first examine how these sites are hosted. The blog is hosted on our SP (a top-of-the-line shared site environment) and kenmore.com is hosted internally by the client. Their tech stack is a full-on enterprise solution and ours is a lightweight LAMP stack. With that said, it was very important to keep the blog’s impact on our servers as low as possible. Optimizing the Front-End and Back-End execution of the widgetized, cross-site content pollination could not be done with an iFrame. With an iFrame, users would be hitting our servers directly, regardless of whether they ever interacted with it. This would create a situation of absurdly high traffic on our servers with very little return or engagement. Since these widgets were to be loaded at the bottom of pages, it was safe to assume that a majority of the users visiting kenmore.com would never even see the widget despite having to load it.

Luckily, the web has moved away from iFrames thanks to the rise of RESTful APIs, JSON, and amazing Front-End frameworks. Browsers (Front-End execution) now have the ability to share content cross-domain via JavaScript and JSON (JavaScript Object Notation). We have also seen the rise of Content Distribution Networks (CDNs), which help offload external site resources like stylesheets, JavaScript files, images, videos, fonts, SVG, etc. As luck would have it, our client was already using a CDN that we had full access to. At this point, I had full access to the blog (we host this), Front-End access to kenmore.com (only limited CMS access), and partial access to the client’s CDN. Now that I had all the pieces at my disposal, it was time to put them together.

Pardon the Interruption

Stop the presses! I forgot to mention a very important part of the project. In order to get relevant blog content on the mother site, the content needed to be smarter. We decided to re-evaluate all the content, and in doing so, we built some custom taxonomical hierarchies that matched the content structure of the kenmore.com e-commerce experience. In other words, a blog post about cleaning your oven would be tagged as an oven product post. We did our best to create common relationships so that we could start building large buckets of relevant content. While this seemed like an easy task, it was a long process to read and properly tag and categorize thousands of posts. Unfortunately, much of this needed to happen prior to content pollination, but let’s get back to the code story.

The Solution

The end tech solution for this project would be a complicated one. First, I needed a way to create JSON feeds of content from the blog based on our new taxonomies. WordPress doesn’t provide this functionality out of the box, so I created a new API that would allow me to use POST or GET variables in order to return JSON data of our post content. This was written in PHP and exists as a simple WordPress page. I had built something like this in the past, so I recycled some old code to serve the purposes of building a fast “get posts” API. Everything worked great, and it was a pretty simple transition for this code to be used on the new site.

Second, I needed a way for admins to curate the content in these new feeds. By default, these feeds populate automatically based on our new taxonomies, but I went that extra mile of designing and building a custom admin panel that lets the client select posts to add to each of the JSON feeds. This was a little complicated because I had to create logic that merged the manually selected posts with the automatic ones, made sure that no duplication occurred, and guaranteed that the end result always contained exactly 5 posts per feed. This required me to brush up on PHP object arrays and all the functions that can be used with them. It was actually pretty fun.
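
The real merge logic lived in PHP, but the idea is language-agnostic. Here is a minimal sketch in JavaScript, with hypothetical field names, of curated posts taking priority, duplicates dropped, and the feed capped at exactly 5:

    // Curated posts come first; automatic posts fill the remaining slots.
    // Duplicates are detected by post ID. All names here are illustrative.
    function buildFeed(curated, automatic, size) {
      var seen = {};
      var feed = [];
      curated.concat(automatic).forEach(function (post) {
        if (feed.length < size && !seen[post.id]) {
          seen[post.id] = true;
          feed.push(post);
        }
      });
      return feed; // at most `size` unique posts
    }

    // buildFeed(adminPicks, taxonomyPosts, 5) -> 5 unique posts, assuming
    // the automatic list alone can always supply at least 5.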

Third, I created a CRON job on our server that would execute and save each of these feeds to our client’s CDN. The CRON runs every hour, but the admin panel also has a refresh button so all the feeds can be regenerated at any time. At this point in the process, the feeds are now static files hosted elsewhere. The content that is served to the Front-End user no longer originates from our hosted site, inspiration.kenmore.com. The feeds are also in JSONP format so they can be used cross-domain (JSONP sidesteps the same-origin policy without requiring Cross-Origin Resource Sharing, or CORS, headers). This was necessary because the CDN didn’t have the same domain as the end-site the widgets would be placed on. With this setup, our previously mentioned traffic issue was solved.
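
For the unfamiliar, here is roughly what a JSONP feed looks like and how it gets consumed; the callback name, feed URL, and field names below are hypothetical, not the actual Kenmore format:

    // A generated feed file on the CDN is just JavaScript: the JSON data
    // wrapped in a function call, e.g.
    //   kmFeedCallback({"posts":[{"id":12,"title":"...","url":"..."}]});

    // Any page on any domain can consume it by defining the callback and
    // injecting a plain <script> tag; no same-origin restriction applies.
    window.kmFeedCallback = function (data) {
      data.posts.forEach(function (post) {
        console.log(post.title, post.url);
      });
    };

    var script = document.createElement('script');
    script.src = 'https://cdn.example.com/feeds/ovens.jsonp'; // hypothetical URL
    document.body.appendChild(script);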

Fourth, it was time to build the widget and put it on the site. As a Front-End developer, I am always conscious of page weight and the opportunities for optimization. Large enterprise sites usually come with large amounts of JavaScript and other external resources that significantly weigh down the page. In order to create a simple codebase to maintain, keep file sizes low, and reduce the total number of external requests, I created a widget that was entirely contained in a single JavaScript file. No matter what page (or what site) you wanted the widget on, you only needed to include one script. This single file contains all the styles and functionality for the Front-End user as well, so it is completely independent of the site it is placed on. As you can imagine, the widget was later requested for other sites managed by the client, and it was already built to support that. Big win.
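
A stripped-down sketch of that single-file pattern (the class names and feed URL are made up for illustration): the script injects its own styles, builds its own container, and renders the JSONP feed, so the host page needs nothing else:

    (function () {
      // Inject the widget's styles so no external stylesheet is required.
      var style = document.createElement('style');
      style.textContent =
        '.km-widget { font-family: sans-serif; }' +
        '.km-widget a { display: block; margin: 4px 0; }';
      document.head.appendChild(style);

      // Build the widget container on the host page.
      var root = document.createElement('div');
      root.className = 'km-widget';
      document.body.appendChild(root);

      // JSONP callback renders the curated posts into the container.
      window.kmWidgetRender = function (data) {
        data.posts.forEach(function (post) {
          var link = document.createElement('a');
          link.href = post.url;
          link.textContent = post.title;
          root.appendChild(link);
        });
      };

      var feed = document.createElement('script');
      feed.src = 'https://cdn.example.com/feeds/ovens.jsonp'; // hypothetical
      document.body.appendChild(feed);
    })();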

[Image: Kenmore Community blog single post view]

Additional Requirements

In addition to all of this work, I also needed to interface with the YouTube API in order to bring the client’s video content into the blog as posts. This required the creation of a separate API that pinged YouTube’s API every hour via an automated CRON. To help improve the auto tagging and categorizing of these posts, I also developed a simple set of logic that tries to tag and categorize this automated content based on matched keywords and keyword associations. For example, if the description or title of a video contained the word “recipe”, the API would be smart enough to auto tag the content under “Cooking”, which is one of the top-tier categories of the entire site. The end result is a constant flow of new videos landing on the blog, so the client doesn’t need to manually create a post every time they upload a new YouTube video. The auto-tagging can be improved by simply adding additional keyword associations.
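
The gist of that keyword-association logic, sketched in JavaScript (the keyword map and category names here are invented; the real associations were tuned to Kenmore’s taxonomy):

    // Map keywords to site categories; scanning a video's title and
    // description against this map yields its auto-assigned tags.
    var keywordMap = {
      recipe: 'Cooking',
      bake: 'Cooking',
      laundry: 'Laundry',
      detergent: 'Laundry'
    };

    function autoTag(video) {
      var text = (video.title + ' ' + video.description).toLowerCase();
      var tags = [];
      Object.keys(keywordMap).forEach(function (keyword) {
        var category = keywordMap[keyword];
        if (text.indexOf(keyword) !== -1 && tags.indexOf(category) === -1) {
          tags.push(category);
        }
      });
      return tags;
    }

    // autoTag({ title: 'Holiday Recipe Ideas', description: '...' })
    //   -> ['Cooking']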

Another requirement was the introduction of a new global nav. kenmore.com is now the umbrella brand site for the community blog (inspiration.kenmore.com) and Cookmore (cookmore.com). With all of these properties springing up and more on the way, it was time to build a hierarchy between them. The global navigation helps signify to Kenmore’s online users what the brand owns and how all the sites fit together. In addition, this nav will eventually be fully integrated with their Single Sign-On (SSO) solution. At the moment, an early version of this functionality is present on kenmore.com. This nav will continue to evolve, as we have already planned for future phases and launches.

A Comment on the Design

On large projects like these, with senior members, some assets are sometimes purposefully left out of our workflows. This is done when the developer can also design, or vice versa. For the blog, a lot of the interactions and designs of the footer, email sign-up modal, and the single post page were left to my imagination. In addition, I developed a completely from-scratch responsive framework that uses rem units. The site layout transitions to phone, tablet, small desktop, and HD TV widescreens. Images are lazy loaded to improve initial load times, and they were also retrofitted for the new design. The old design didn’t use a common featured image ratio, and the new one called for 5:3. Overall, the design is flat, and the content is allowed to take center stage. I think it is a beautiful, modern site, and it received a lot of praise from the client.

The Wrap Up

Needless to say, this was an amazing project to undertake. I think it helped prove the depth of my abilities as a Front-End and Back-End developer. Taking control of some of the design was a big opportunity for me as well. In most cases, designs are fleshed out and bow-wrapped in PSD format ahead of time. I much prefer more ambiguity with the design when working so I can have a little more free rein over the overall experience. Sometimes, designing responsive sites in the browser can significantly reduce total project hours and speed up the end result. PSDs have a habit of presenting an unrealistic final outcome of an interactive experience. I’ll admit, I still have lots of room for improvement with Back-End Development, Interactive Design, and even Copywriting, but it is still nice to see visual progress in all of these areas.

After this site was completed, the client was ecstatic. It triggered another copy of this site to be made for another one of their properties. Nothing feels more rewarding than having plain ol’ good work speak for itself. Great work that inspires others to want the same level of quality in a product is a huge feather in the cap for me and ARS. I am going to try my best to improve upon this work and keep pushing the bar higher and higher.

Until the next project, keep on coding.

Visit: inspiration.kenmore.com


Kenmore Wall Oven Fit Finder 2 Apr 2014 8:09 AM

No filter: the Kenmore Wall Oven Fit Finder was a strenuous project. It required a third-party API (not developed by ARS or the client) that was used to create product matches based on model numbers alone. This piece of the experience created a lot of confusion across all teams on the project. The API functionality promised by this team was non-existent when the project started. Unfortunately, this realization occurred only after development began. With little documentation, we immediately hit roadblocks. It was apparent that the API was not ready for the Front-End execution by the ARS team.

[Image: Kenmore Wall Oven Fit Finder landing screen]

User Flow

This experience was made to provide users the ability to either match their current wall oven with a Kenmore model or quickly browse all Kenmore models by feature type. As users land on the page, they are given two simple choices: “I’m replacing my current wall oven” or “I’m shopping for my first wall oven”.

Option 1

Upon clicking the first option, users are asked to plug in the model number of their current wall oven. This action hits the 3rd-party API (mentioned above). Again, this API provides the logic to make the comparison and produce matched results. These matched results return only model numbers and no additional product data. Because of this, the results are then cross-referenced with a large JSON feed of product data so the products can be accurately reproduced on the screen, complete with image, title, shop links, etc.
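
In other words, the matcher returns bare model numbers and we join them against the product feed before rendering. A small sketch of that step in JavaScript (the model numbers and fields are invented):

    // Pretend this object was built from the large JSON product feed,
    // keyed by model number.
    var productFeed = {
      'KM-4501': { title: 'Kenmore 30" Electric Wall Oven', image: '/img/4501.jpg' },
      'KM-4732': { title: 'Kenmore 27" Double Wall Oven', image: '/img/4732.jpg' }
    };

    // The 3rd-party API only gives us model numbers; hydrate them into
    // full product records, silently skipping unknown models.
    function hydrateMatches(modelNumbers) {
      return modelNumbers
        .filter(function (model) { return productFeed.hasOwnProperty(model); })
        .map(function (model) { return productFeed[model]; });
    }

    // hydrateMatches(['KM-4501', 'UNKNOWN']) -> one full product record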

Option 2

Choosing the second option, “I’m shopping for my first wall oven”, users are given a set of filters that sort through all of Kenmore’s wall ovens. This UI interaction is powered by MixItUp, made by Patrick Kunka of KunkaLabs and Barrel of NY. MixItUp handles all the filtering logic, sorting logic, and product card animations. The end result is a slick, simple “help me choose” experience based on a few key product features.

[Image: Kenmore Wall Oven Fit Finder filter view]

Additional UI Enhancements

As a user scrolls down the page, past the fit tool, a flurry of product feature information appears on the screen. These content areas animate onto the screen with a change in their CSS opacity, visibility, and translateX() properties. This is triggered when the content appears in the user’s viewport, and the start times for these content sections are staggered so that they appear one after the other. This was one of the first times I have tried animating static content blocks in this manner. The goal of this exercise was to guide the users down the page. Generally, users don’t scroll past the fold, and little flourishes like this can help draw attention down the page. It is hard to say if this was successful, but nonetheless, it was a nice little enhancement that I will continue to use.
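
A bare-bones version of that staggered reveal (the .feature class and timing values are hypothetical): sections start hidden in CSS, and a scroll handler adds a class with an incremental transition delay as each one enters the viewport:

    // Assumed CSS:
    //   .feature { opacity: 0; visibility: hidden;
    //              transform: translateX(-40px); transition: all 0.5s ease; }
    //   .feature.revealed { opacity: 1; visibility: visible;
    //                       transform: translateX(0); }
    function revealOnScroll() {
      var sections = document.querySelectorAll('.feature:not(.revealed)');
      var delay = 0;
      Array.prototype.forEach.call(sections, function (section) {
        if (section.getBoundingClientRect().top < window.innerHeight) {
          section.style.transitionDelay = delay + 's';
          section.classList.add('revealed');
          delay += 0.15; // stagger each newly visible section
        }
      });
    }

    window.addEventListener('scroll', revealOnScroll);
    revealOnScroll(); // catch sections already in view on load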

GeebArt.com takes advantage of these kinds of animations during page load. Refresh the page a couple of times and notice how the various elements on the page appear. You will notice that almost everything animates to a certain degree to give the feeling of a more human page load.


Recipe for the Perfect Site 25 Feb 2013 5:49 PM

Over the past few weeks, I found myself with a little more time to research new web technologies.  Most of this research involved JavaScript libraries, blog posts on design conventions, and some technical blueprints of existing sites.  Recently, I even wrote a post about the Rise of the Front-End Developer, which touts the rise of JavaScript-based MVC frameworks and responsive designs.  While it was a pretty good post, I know now that I needed to learn more about the future of site Back-Ends as well.  I had to continue my research.

During this time, I kept wondering, “What’s the next big ‘thing’ on the web?”  Luckily enough, the answer kind of fell into my lap.  My development team at ARS Interactive was also starting to roll out a new piece of project management software called Trello.  I should be clear, we did not create Trello.  Trello is an extremely lightweight, single page application with amazing system architecture. From the design to the interaction, Trello looks and feels freaky fast.  It almost seems too perfect.  Of course, this great site did spur a lot of interest on our development team, and with that came some back-and-forth discussions.  We all wanted to know, “How is this thing built?”  After some searching, I came across Trello’s tech stack, which would lead to this post, “Recipe for the Perfect Site”.


In explaining the components of the perfect site, I will break things into 3 main sections: the Back-End, the Front-End, and the design.  In each, I will include some overarching philosophies that should be considered when building any site.  This is a long one, so hold on to your hats.

The Back-End | Recipe for the Perfect Site

  1. Node.js

    Node.js is somewhat similar to a PHP Apache server. It provides server functions, extensions, or libraries that manage HTTP requests, filesystem operations, database communications, and so on. Essentially, you can write almost any extension you want your server to perform. However, Node.js isn’t meant for super intense server usage (not yet).  On the plus side, it is asynchronous and single-threaded. So, you might be asking, “What does this mean?”  Because Node.js works at the socket level (check out Socket.IO), all interactions by multiple users can be seen in real time. This means if I were playing an “HTML5” game that was written in Node, I could play the game in real time with other players, all in HTML and JavaScript.  Let’s pretend we were both looking at the same Node site.  If I interacted with a DOM element on the page, you would see that interaction, too. In a sense, socket-level programming makes it possible for all users to finally “see” the same web document. Imagine every site working like a Google Doc or Google Wave. Now that’s powerful.
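
    To make that concrete, here is a tiny sketch of the shared-document idea using Node.js and Socket.IO (the event name and port are made up): every click in one browser is relayed to all the others in real time.

        // Server (Node.js): relay each user's interaction to everyone else.
        var app = require('http').createServer();
        var io = require('socket.io')(app);

        io.on('connection', function (socket) {
          socket.on('element:clicked', function (data) {
            socket.broadcast.emit('element:clicked', data); // everyone else sees it
          });
        });

        app.listen(3000);

        // Client (browser), with the Socket.IO client script loaded:
        //   var socket = io('http://localhost:3000');
        //   document.addEventListener('click', function (e) {
        //     socket.emit('element:clicked', { id: e.target.id });
        //   });
        //   socket.on('element:clicked', function (data) {
        //     /* mirror the element the other user clicked */
        //   });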


    Ruby and Python can perform some of the same tasks as Node, but most of their libraries are not asynchronous. The bigger problem: the two are not written in JavaScript, a globally used, cross-browser supported Front-End language.  Wouldn’t it be amazing to only have to learn one language, JavaScript (or versions of it), in order to write both the Front and Back-Ends of a site? In addition, the price of Ruby and Python driven apps is going to continue to increase. Currently, they are just too specialized. Node is backed by the “right” language and will end up taking over in the long run. I’m very excited to see where Node will go in the next couple years.


    Check out this great post to learn more about Node.

  2. HAProxy and Load Balancing

    Load balancing refers to the ability of a server to offload incoming HTTP(S) requests to other servers when it becomes overloaded. When running Node, this becomes vital for heavy traffic sites to run efficiently. Because Node allows for simultaneous, asynchronous data exchange at the socket level, sites built on Node tend to have a much higher number of requests in flight at a given time. Trello uses HAProxy, which has a horrendous site, but I promise it is worth a quick look.

    Why use HAProxy? This post on tech.shareaholic boasts HAProxy’s stability and features.  It seems like a great match for any Node site.

  3. MongoDB

    Let me first clear up a common misconception about MongoDB… In a stack like this, it usually does not replace the working database outright; rather, it acts as a repository of JSON-style documents (not normalized rows) that are extracted from a working, pre-existing database. As they call it on Mongo’s site, MongoDB utilizes “Document-Oriented Storage”. That’s great! …but what does that mean?

    Well, let’s picture something much, much larger than this blog. I’m talking about super-sized sites that have thousands of tables, each with thousands to millions of rows.  These behemoth tables can be expensive (for a server) to maintain. Generally, pages that access data from multiple tables can use lots of arbitrary and expensive JOINs. For example, data in table A and table B is required to pull the correct data from tables C and D. As you can imagine, the larger the database and the more complex the JOIN (or SQL), the more inefficient a database request can be.  Place that in tandem with a site that gets over 100,000 visits a week.  Now we can start running into problems.


    Most of the time, these types of database calls become redundant, or already big “knowns”. By that I mean, they generally return the same data for a long period of time. So, what if these types of calls were denormalized? In other words, what if we could save the results of these expensive DB calls as JSON files in a file system? This would entirely remove the expensive JOINs (normalization) from the equation. This route would be a much, much cheaper alternative. At a high level, this is the philosophy that drives MongoDB.
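
    A sketch of that denormalization idea in Node.js (the collection and field names are hypothetical, and this follows the older 2.x-era driver API): run the expensive JOIN once, then store the pre-built result as a single document that later reads fetch in one lookup.

        var MongoClient = require('mongodb').MongoClient;

        MongoClient.connect('mongodb://localhost:27017/mysite', function (err, db) {
          if (err) throw err;

          // Imagine this object is the result of an expensive multi-table JOIN.
          var popularPosts = {
            _id: 'popular-posts',
            generatedAt: new Date(),
            posts: [
              { title: 'Post A', author: 'Geeb', commentCount: 42 },
              { title: 'Post B', author: 'Geeb', commentCount: 17 }
            ]
          };

          // Save the denormalized document; every later read is one cheap lookup.
          db.collection('cachedViews').save(popularPosts, function (err) {
            if (err) throw err;
            db.close();
          });
        });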

    There are some other benefits to using MongoDB. Reading and writing entries to a database can cause a bottleneck. Usually, databases cannot act on those tasks simultaneously. This allows Document-Oriented Storage to fill in where typical databases cannot. If frequent data requests are saved as a document, millions of people can download the same file, and we don’t have to worry about all the common rules of a database. This is extremely liberating for a heavy site. We can now have complete access to our data at all times.  Reads and writes never need to compete; everyone is happy.

    Since most new SPA sites like Trello make use of JavaScript templating engines (or MVC frameworks), they lend themselves to MongoDB Back-Ends. By that I mean, most SPAs make use of JSON documents to display content. Now we have a strong Back-End that can support the perfect Front-End. All the pieces are falling into place… With a SPA site based on Node that allows all of its users to continuously send packets to and from our server, it only makes sense to use MongoDB with a load balancer and a JS MVC framework. This will keep our DB free and prevent our servers from getting too overwhelmed.  The end result is a super fast site.

  4. Using a Content Distribution Network (CDN)

    Let’s take the MongoDB solution one step further. Not only will we load all of our typical site resources (CSS, JS, images) onto a CDN, we can now also load our MongoDB JSON documents there, too. CDNs are like a load balancer on steroids, but they only work with specific webpage resources. Essentially, files that are loaded to a CDN are accessible from hundreds of servers around the world. When a user attempts to download a file, they are directed to the server closest to them. This allows a site to have a more consistent load time for a globalized audience and also helps keep unneeded stress off of our main servers (which are load balanced).

The Front-End | Recipe for the Perfect Site

  1. Backbone (or Angular)

    Page loads are a thing of the past. Hundreds of single page application (SPA) sites are popping up all over the web. Essentially, they take advantage of HTML5 push states (modern browsers only), which allow you to dynamically change a URL without a real page refresh. A good example of this is SoundCloud, which uses Backbone. Angular is another great MVC JS framework, but Backbone seems to be the most popular on the web right now.  Both have great documentation.  Why would someone want to do this?

    Let’s think about user flow for a minute. When a user goes to a site, they are prompted to download all the resources that are tied to a given page. This includes images, videos, fonts, type, JavaScript, CSS, and so on. Generally, a lot of these resources come in as single external files that contain everything needed to run any page on the site. Consider that when a user moves from page to page, sometimes they have to re-download some of these files. What is even more troubling, mobile devices can’t cache anything that is larger than 25kb. What if we removed the need to reload pages? What if we just switched out content in the body of a site with AJAX and allowed the user to stay in the same resource state? In a nutshell, this is one of the amazing features of MVC JavaScript frameworks. Because they act like a native app, users never have to download redundant content. One load to end them all.
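
    A bare-bones sketch of that content-swap idea, assuming jQuery is on the page (the selectors, ?format=json convention, and response shape are all hypothetical):

        // Intercept internal links: swap only the body content via AJAX and
        // update the URL with pushState instead of triggering a full reload.
        $(document).on('click', 'a.spa-link', function (e) {
          e.preventDefault();
          var url = this.href;
          $.getJSON(url + '?format=json', function (page) {
            $('#content').html(page.bodyHtml);   // new content, same resources
            document.title = page.title;
            history.pushState({ url: url }, page.title, url);
          });
        });

        // Support the back/forward buttons by re-rendering the stored state.
        window.onpopstate = function (e) {
          if (e.state) {
            $.getJSON(e.state.url + '?format=json', function (page) {
              $('#content').html(page.bodyHtml);
            });
          }
        };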


    These types of sites (SPAs) also take advantage of client-side templating engines like underscore.js or mustache.js that work in tandem with Backbone.js. This system takes JSON or JSONP feeds (which would be served by MongoDB), pours them into a repeatable template, and renders the result on the page. Since JSON can be tightly minified, and newer client-side JavaScript engines are getting extremely fast, pages built as SPAs load almost instantly. Again, no more page refreshes.  Users can interact for instant content generation. With Backbone.js, any site can mimic a native smartphone or desktop app.  These frameworks are changing the way we experience the web.
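
    Client-side templating is simpler than it sounds. A quick underscore.js sketch, assuming the library is loaded on the page (the template markup and JSON shape are invented):

        // Compile the template once, then render any JSON feed through it.
        var postTemplate = _.template(
          '<article><h2><%= title %></h2><p><%= excerpt %></p></article>'
        );

        // Pretend this JSON arrived from a MongoDB-backed feed.
        var posts = [
          { title: 'Recipe for the Perfect Site', excerpt: 'Node, Mongo, Backbone…' },
          { title: 'Rise of the Front-End Developer', excerpt: 'JavaScript everywhere.' }
        ];

        document.getElementById('content').innerHTML = posts
          .map(function (post) { return postTemplate(post); })
          .join('');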

    With any SPA site, SEO becomes an issue. Remember that search bots don’t run JavaScript, so you will have to build an alternative method for serving bots. Check out this article on how to get around your SPA SEO woes with Phantom.js.

  2. Optimize the Front-End

    This bullet point is probably the most important one in my Front-End section. Too often I view large sites, test their speed, and laugh at the bloated amount of JavaScript, the abundant use of large images, the lack of properly saved images, unminified libraries, and the list goes on. When Front-End devs cut corners, it usually shows in overall page weight. I am a huge advocate for getting Front-End load times optimized. It is an absolute must on any modern site. As a developer, you must always think in terms of optimization. How can I show more rich media for less?


    While I could write an entire page dedicated to optimizations, I will create a short list of things you can do to help cut down load times. I recommend researching them yourself if you don’t already know what they are. Keep in mind that the worst culprit in bloated load times always tends to be images. Please review your rich media carefully.

    1. Save images properly. Save them for “web and devices” and possibly use additional compression options like Smush.it or TimThumb.  Don’t re-save .jpgs too many times; compression leads to a loss in quality.
    2. Use a custom icon font (check out IcoMoon) with the goal of removing and replacing as many images from the site as possible. Keep in mind that font icons can only be one solid color.
    3. After exhausting a custom font, put all other images into an image sprite when possible.
    4. Use the jQuery LazyLoad plugin to defer image loads that are not visible to the user (see the sketch after this list).
    5. Consider AJAX loading videos and iFrames. iFrames and videos can really drag a site down, so use them sparingly.  Load them on events instead of initial page load.
    6. Use gZip compression on all page assets (more of a server-side optimization that improves Front-End performance).
    7. Choose only one form of analytics to load onto the site.  Try not to cram a site with tons of analytics JavaScript.
    8. Use a JavaScript MVC framework (when IE doesn’t matter). SPAs tend to remove most of the “re-requesting” that goes on during a user’s visit to a website.
    9. Use a Content Distribution Network (CDN).  Let your server breathe a little bit.  Offload server weight.
    10. For mobile, consider keeping some web resources under 25kb so that they can be cached.
    11. Minify all CSS and JS.
    12. Load CSS in the header and all JS in the footer.
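
    As promised in item 4, here is a minimal lazy-loading sketch in plain jQuery, in the spirit of the LazyLoad plugin (the data-src convention is the common pattern, not the plugin’s exact API):

        // Images ship with a tiny placeholder in src and the real file in
        // data-src; swap them in as they scroll into view.
        function lazyLoad() {
          $('img[data-src]').each(function () {
            var img = $(this);
            if (img.offset().top < $(window).scrollTop() + $(window).height()) {
              img.attr('src', img.data('src')).removeAttr('data-src');
            }
          });
        }

        $(window).on('scroll resize', lazyLoad);
        $(lazyLoad); // run once on DOM ready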

    When considering whether or not to optimize your site, always take your mobile users into consideration. Smartphones and tablets are much slower than desktops. The typical 3G connection is usually under 1.5 Mb/s, where desktops can exceed 100 Mb/s. As you can already tell, wireless data technology is not quite in the same ring as desktops. With mobile on the rise, it is so important that we optimize every aspect of our sites. From Front-to-Back, websites need to be as optimized as possible. The takeover of mobile is going to come so quickly, you may consider not developing for < IE 10 on your new projects. Bottom line: your mobile user base will soon dwarf your desktop one, so optimize, optimize, optimize.

    About a year ago, I did make a WordPress Optimization guide. It is more of a Front-End optimization handbook, but for those of you wanting a little more, check it out.

  3. Responsive Layouts

    This practice/philosophy has been solidified across the professional development and design worlds. Every site today needs to be responsive. This means you work in percentage widths, setting only hard pixel max-widths to cap DOM dimensions at certain device resolutions with CSS3 media queries. Sites become fluid, and images can be stretched or shrunk in perfect proportion.  For images and other rich media, try not to set a height.  Allow this media to expand proportionally with a variable width.  As long as you stick to fluid-like design, a site can adapt to hundreds of screen sizes and devices.

    Currently, geebArt (as of early 2013) is not responsive. I have a new version of my site that is launching in the coming weeks. I did, however, uphold this philosophy on cookmore. Feel free to dig around in the code and play with the browser window. You don’t get a true mobile experience until you actually load the site with a mobile device, but the faux desktop-mobile is about 90% there.

    For those of you who have not played with any responsive frameworks, I recommend defaulting most elements to “float:left;width:100%;box-sizing:border-box;” and then applying max-widths with media queries. Box-sizing is a terrific style to start using on most of your elements. It allows you to include padding and border pixels within a specified width. Once you start playing with percentage widths, the advantage of using box-sizing becomes very apparent. Happy coding.

  4. CoffeeScript

    I won’t go into many details about CoffeeScript. Essentially, it is a JavaScript-like language that compiles to JavaScript. It allows you to write less (in a creative way), and the output is converted to valid, clean, and minified JS code. Simply put, with CoffeeScript you can write less and optimize more.


  5. SASS (Syntactically Awesome Stylesheets)

    SASS is a language that allows you to write stylesheets that contain variables, nested styles, and basic logic. It essentially extends the abilities of CSS3 and makes the process a little bit more like writing JavaScript. SASS is used to power amazing CSS frameworks like Compass. Using SASS can cut down a lot of wasted lines/characters in your stylesheets.

  6. Sublime Text Editor

    Sublime isn’t a web technology, but it is one of the best text editors I’ve ever seen. If you want something more gorgeous to look at while coding, download Sublime. It allows you to write code extremely fast and is filled with millions of shortcuts (including zen coding). It also has a plugin library so you can extend Sublime how you see fit. It comes highly recommended by most developers.

The New Design Theory and Design Process

  1. Photoshop is Dead

    Designers need to give up on Photoshop. If you are stuck in a static, bitmap editor, the web can be a frustrating platform to design for. Think about it: we are now in a pluralist device age… with its multiple browsers, re-sizable windows, and inconsistent data speeds. It is extremely frustrating as a Front-End developer to hear my superiors ask for “pixel perfection” while pointing at a beautiful mock-up in Photoshop. The truth is, you are only wasting my time just to find out that it will never look as good in the 100 browsers I’m building it for. So, what can we do?

    As a Front-End developer, it seems silly for me to have to go into Photoshop to pull images, colors, and gradients from a PSD. The designer has already walked through that process. After they are finished, the PSD is handed over to me, and then I have to repeat the same steps they took. On top of that, finishing a site usually requires a sit-down between me and the designer to “pixel perfect” my site. Usually this takes hours and sometimes days. Why can’t the designer build these simple mockups in the browser?  Even if they cheat a little using basic HTML templates and CSS frameworks. Why do I have to repeat manual work the designer is already doing? Why can’t designers learn the craft and extend their skill set?

    Designers desperately need to turn into creative coders. Most designers shriek at the thought of coding, but remember, writing HTML and CSS isn’t really coding, it’s scripting at best. Besides, if you are designing in a browser, you really only need to know CSS. My biggest problem with some designers is that they truly don’t understand how to design for the web. Many of them constantly request pixel perfection, yet they have no understanding of optimization, SEO, and worst of all, responsive design. You can’t design for a static 960px-wide site. It just doesn’t make sense in today’s market. If designers would start embracing code and the browser, sites would get so much better. Imagine how much time we could save in both development and design.

    Granted, we still may need to have a “comp”, but I think agencies are moving away from the traditional comp. It seems more likely that we move the design comp to the browser while working on a functionality prototype at the same time. Brad Frost’s blog post on this issue sums it up perfectly. The comments at the bottom are wonderful, too. He also wrote a nice article on A List Apart called “For a Future-Friendly Web“. Let’s stop wasting time, and let’s get everyone back into the browser.

  2. Flat Designs

    As you can imagine, the responsive web is changing the way we design websites. French designer Sacha Greif has an amazing blog post on this very topic. It seems Windows 8 finally did something right: its new Metro UI has the design world talking. We are bringing websites back to the minimalist Swiss style, putting more emphasis on usability and simplicity. And why flat designs? Because they work beautifully with responsive designs. Microsoft’s new Metro UI works on all of their phones, tablets, desktops, and even Xbox. With the design taking more of a utilitarian role, it is much easier to build consistency across all of their platforms, apps, and sites. In addition, users are finally getting comfortable with our rich interfaces. Users don’t necessarily need the same degree of realism anymore. We are becoming more liberated as designers; a button doesn’t need to have a gradient any more.  Let’s start a better process for designing sites, and start creating a better, more responsive web.

The Perfect Site

That was a lot to take in.  Let’s quickly recap what we’re going to use for our perfect site…

  1. Node.js
  2. HAProxy Load Balancer
  3. MongoDB
  4. Content Distribution Network (CDN)
  5. Backbone.js
  6. Optimized Front-End
  7. Responsive Layouts
  8. SASS
  9. CoffeeScript
  10. Flat Designs

With such a wide array of devices at our disposal, it only makes sense that we move to a faster web.  This can be a tough problem to solve and is heavily contingent on the size of a site and a user’s download speed.  Most of the bullets in the list above are solutions to this very problem.  Faster speeds, slicker interfaces, and “readable” content all make for a better user experience.  At the end of the day, we need to attract users as much as we want to keep them on the site.  Unfortunately, most agencies are not putting the emphasis where it needs to be.  It is imperative that we look to the near future.  Mobile is going to take over, and if your agency or site isn’t “optimized”, you’ll surely fail.  Take a moment to reflect on where we are headed, and where we are now.  Let’s build a more beautiful, responsive web together.

Feel free to leave a comment.  Do you have a better tech stack?  Is Metro UI just another fad?  Are there better-built sites than Trello?  What do you think?


Site Speed, Analytics, and SEO – Explaining the Differences 22 Jan 2013 6:42 PM

Over the past year, I have spent a great deal of time researching site speed and SEO.  In the tech blogosphere, the two can seem to be intertwined.  Some experts like to point out that Google now factors site speed into its search algorithms, but Google has made it clear that this factor affects less than 1% of sites on the web.  Not to mention, a ton of research exists proving that a faster site usually keeps users on the page longer and keeps them coming back more frequently.  But does this count as SEO?

While the intent is good, Google insists that speed will continue to be a big factor in rankings in the near future.  It is possible that Google is only announcing this new addition to its algorithms because of its speed-testing browser plugin, Page Speed, and its competition with Yahoo’s YSlow.  Or is speed really a huge factor in SEO?  Site speed, analytics, and SEO are all important aspects of the optimization pie, and all three should be treated separately.

Keep in mind that these topics exist almost exclusively in the realm of the Front-End.


So, what is really going on here?  Does speed, or page load time, truly affect SEO?  Are we even using this terminology correctly?  Can we use these terms interchangeably, as we have read so many times on the web?  Let’s try to get to the bottom of the terminology, when to use it, and what it all actually means.

The Difference Between Analytics, SEO, and Page Speed

The Question

I often hear people use the terms analytics, SEO, and page speed interchangeably to convey almost the same idea.  To explain this, I will give an example of a common question that leads to constant misinterpretation: “Can you make sure the site is optimized?”  This question can refer to optimizations for completely different aspects of a site.  I could fire back with a number of questions…  Should we improve Back-End server requests so they are faster?  Should Front-End resources be minified, compressed, and combined for improved page speed?  Should pages contain proper meta data, descriptions, keywords, keyword densities, image alt tags, and relevant content for SEO?  Does the head also contain OpenGraph meta data for social media and external linking?  Has the site been tested for security loopholes?  Is the site optimized for proper conversion, event, and general user tracking?

I could probably drum up a few more, but you get the idea.  This simple question can result in a flood of more questions.  What should we really be concerned with?

The Reality

The reality is, any site needs to cover all of those “bases” mentioned in the response-questions above.  Too often, account leads or managers tend to bucket everything under the SEO or analytics titles, when in reality, optimization is more of a general term that can refer to a whole gamut of problems or concerns. Page speed actually has little to do with SEO.  The definition of SEO (or search engine optimization) is, “The process of affecting the visibility of a website or a web page in a search engine’s ‘natural’ or un-paid (‘organic’) search results”.  Searching that quoted Wikipedia article for the term “speed” returns zero results.  ZERO.  As you can see, SEO is strictly about a site’s ability or position to rise in Google’s organic search results, not how fast it can load all of its resources to display the page.


Again, while good SEO and page speed can help return positive numbers in Google Analytics, they are not analytics.  Google Analytics is simply a tool to monitor user trends and data that live within the ecosystem of a website.  SEO can increase traffic, but it might not increase the average time spent on pages within the site.  Likewise, page speed can increase user returns and the time spent on pages, but it has almost nothing to do with a site’s overall rankings in Google.  If you have looked around the Google Analytics menus, you will notice that page load times are not tracked.  Chances are, if you spend a great deal of time perfecting all 3, site speed, analytics, and SEO, you will be able to turn your website into a well-oiled machine.  Generally, the three do live in a symbiotic relationship, but remember, analytics can only be a tool that tells you how much your SEO or speed optimizations have improved user flow to and within your site.

The Takeaway

Keep in mind that site optimization can include all 3 categories discussed in this blog post.  A developer can optimize a site for speed, search engines, social media, database queries, code bloat, etc.  Don’t sound like an idiot and confuse them.  After all, they are 3 very different monsters.

Additional Research

Check out this quick-fire list of helpful links to keep your site polished and optimized.

  1. General Information on SEO
  2. Misconceptions on SEO
  3. Geeb’s WordPress Optimization Guide
  4. Book of Speed
  5. WordPress SEO by Yoast (good read for non-WordPress sites, too)
  6. Huge list of resources for Google Analytics

Feel free to post your favorite links on Page Speed, SEO or Google Analytics in the comments section below.  Thanks for stopping by!


Front-End Retrofitting and the Decline of Enterprise CMS Solutions 21 Jan 2013 4:27 PM

Over the past year, I have seen a huge trend emerging from my day-to-day projects at work.  More and more, I have to create small widgets, ad systems, or experiences that are injected into a site with pure JavaScript (usually with JSONP to accommodate cross-domain security issues) as opposed to building them as a part of the site’s Content Management System.  The big difference here: one is solely a Front-End execution while the other is 50-50, Back-to-Front.  As anyone in the field would know, mostly for SEO and data consistency, it is very important to have content on the page prior to load.  Obviously, loading an entire experience via JavaScript makes the page invisible to search engines.  The reasons for building a page or experience this way can vary, but it is happening more and more.  It’s not always the client’s fault, but most developers would tell you that the pure Front-End solution is not the right one.  So why are we constantly having to do it?

Why are we Ignoring the Back-End?

So, who is to blame for the abundance of Front-End widgets and experiences?  For one, most old corporations run on JSP (Java) based systems like IBM’s WebSphere or Fry’s platform.  They typically do not have recent versions of the CMS, because these upgrades can cost in the millions of dollars.  Any highly specialized CMS also creates an inflated budget dedicated to site maintenance and updates.  Back-End developers who only work in languages like JSP can cost an arm and a leg, too.  To most technology advisers, the cost of an upgrade and even general maintenance is not worth the reward.  Besides, a sluggish or near-impossible multi-site update could be dangerous, and even hurt sales.  I have a personal opinion on this, because I work on sites like these, but I will save it for later.  The truth is, old systems often provide a lack of access to the site’s templates, or they just do not have the necessary features to get the job done.


Show me the Access

This brings me to my next point: access.  Generally, these CMS admin panels are set up for basic content manipulation.  This would include ad swaps, image uploads, page creation (not in all cases), and content rework.  In most business relationships, huge corporations cannot justify granting complete access to agencies to work on their sites.  They believe a low-level CMS login is sufficient access for any project.  This usually stems from a lack of communication and knowledge about the system the client is actually managing.  Generally, business decisions are made without the approval of an IT Director or Developer, and this tends to produce projects that are not technically feasible as scoped.  Because of this, we (the agency) must assume that this work is limited to the Front-End only.  After all, we cannot access the Back-End without SSH or, at least, SFTP.  We (the agency) must also assume that the client understands their system, the differences between Front and Back-End development, and the consequences of each project completed solely in the Front.  The reality is, the client does not.  Because of this, agencies are almost never granted sufficient access to complete projects that should require both Front and Back-End development.  But is it all just about a lack of technical understanding?

Lack of Knowledge, or just being Protective?

Corporations are overly protective of their sites, but with good reason.  Handing over SSH credentials to an agency could be a risky move.  By doing so, you hand over the entire site’s template files (CMS).  The next level, handing over the database, would be even riskier.  If you were an e-commerce site, this would release all of your product data, sales, user e-mails, etc.  If a corporation weren’t working with a reputable agency, they would never consider doing this.  Besides, while corporations tend to hire agencies to manage ads and content, they often already have an IT / Back-End team in place.  From a business standpoint, it doesn’t really make sense to grant this high level of access to any agency under the sun.  This type of thinking tends to drive most of the business decisions between the corporation and the agency.

Naturally, agencies are desperately trying to achieve this level of access with all of their clients.  Some do not have the staff to work on such projects, but if they signed new business for this type of work, any agency would fill its roster in a heartbeat.  The bottom line: if an agency can own a site, Back-to-Front, it brings more than enough business.  Oftentimes, system developers are hired again and again for general maintenance and site updates, and this creates a steady flow of cash.  This is also good for the corporation because it costs much less (we are talking hundreds of thousands less) than choosing a big box solution like IBM.  As a side note about cost, Back-End work always costs more than Front; however, Front tends to come in greater volumes.

What does this all mean?

In a nutshell, this is where the problem currently sits for many small to medium sized agencies.  Corporations are stuck with old “enterprise” level CMSs, which require super specialized professionals to maintain and upgrade.  This drastically inflates cost, and places distrust (and rightly so) in the small agency that wants the complete business.  Because the business doesn’t completely understand its own system, it asks agencies to complete projects that are almost never technically sound.  The agency, which desperately wants this business with a big client, will ignore some bad practices with Front-End-only projects (as I have explained earlier) to keep a positive rapport with the client.  This allows the agency to prove how inflexible the client’s system is, in hopes of coaxing them into building something new.

The reality is, the corporation cannot justify the cost for this, and would rather continue with workarounds and small upgrades.  This usually results in old, frankenstein sites that have terrible SEO and are filled with JSONP, Front-End widgets that only exist because of a browser security loophole.  It is astonishing that entire sites are almost being replaced by these types of widgets.  Don’t get me wrong, JSONP allows for amazing flexibility across the web, and puts content on any site regardless of the CMS it runs on.  It just seems remarkable that they are becoming ubiquitous based on a simple lack of access.  This problem also helps promote the idea behind my last blog post, The Rise of the Front-End Developer.

How can we get past the stalemate?

The abundance of Front-End retrofitting proves the decline of enterprise CMS solutions and a new adoption of open source PHP platforms by old big box retailers.  Even from personal experience, I know some clients are getting annoyed with their old enterprise solutions.  They are starting to realize how fast and flexible a small agency can be.  The truth is, open source PHP platforms are king and are blasting past the enterprise.  There is almost no excuse for going with someone like IBM anymore.  They are an old dinosaur.  Simply put, I feel bad for old corporations who have gone the JSP route.  These sites are always a disaster and they cost millions.  The good news is that the tide does seem to be turning.  Despite seeing my fair share of letdowns last year, I strongly believe a shift is coming in 2013.  Long live the open source Front-End.

Tell me what you think.  Leave a comment below or message me on Twitter.  I’d love to hear from you!  Thanks for stopping in.


Rise of the Front-End Developer 9 Dec 2012 3:33 PM

Back in April, I posted about the Rise of Mobile and Mobile Commerce.  In that post, I stated that by 2015, mobile devices will exceed 50% of traffic for most websites.  While this has a huge impact on how we build sites today, it is safe to say that it will put Front-End Developers in even greater demand over the next couple years.  I’ve had a moment to reflect on that post and have continued to track other trends on the web.  The truth is, we are about to witness a huge paradigm shift on the web.  In my recent exploration and research, I am finding even more reasons to believe that we are about to see the Rise of the Front-End Developer.  We are seeing not just a “mobile” shift, but also a shift from Back-End oriented work to the Front-End.  And I don’t just mean workload; I also mean how sites are built, how they are coded, and how the Front-End interacts with the Back.  In other words, tasks that were previously done by a Back-End developer are now moving to the Front.  This is caused not only by the rise in mobile, but also by a change in technology and the progress that has been made with JavaScript-based MVC/MV* frameworks.  However, before I get into details, it is first important to understand the current obstacles we still face with web technologies, search engines, browsers, and server capabilities.

The State of the Front-End and Web’s Current Obstacles

First, let me state that we live in a technologically pluralist age.  When it comes to the web, users tend to have an arsenal of devices, all of which have different screen sizes, connection speeds, ages, browsers, and basic capabilities.  The mere fact that Android devices have the largest mobile market share with an open source OS tells us that this trend will continue.  In addition, one of the biggest technological black holes is still Internet Explorer.  Unfortunately, IE still holds ground with a sizeable amount of traffic, enough to hold back HTML5 and CSS3 standards from being truly ubiquitous.  We do see some progress being made in the browser wars.  Chrome did overtake IE as the most used browser this year.  This is a huge sigh of relief for all developers.  Even bigger news, jQuery 2.0 is also dropping support for IE6-8.  This will allow for faster, more agile web apps that are not bogged down with legacy code and support.  Truly, IE is dying and the web is finally able to evolve.

From a development standpoint, progressive enhancement and responsive, fluid designs are trending right now, and all are almost necessary for any site to survive.  However, these design trends are not a complete solution for an age of pluralism and devices.  We still have a rift between proprietary apps and web-based apps, and the languages they are written in.  By this I mean, some phones allow for apps that exist purely on the web, coded in common scripting languages (HTML, CSS, and JS), while others can only be built in .NET, C#, Objective-C, or Python.  Again, we seem to have an inflated number of coding languages that are used to build web-based apps and interfaces.  This desperately needs to change in order for the web to truly evolve.  I foresee that the app market that allows for pure web apps will eventually beat out the competition.  In case you are not following along, this means the Apple App Store will not win.  Developers shouldn’t have to learn a million languages to build simple user interfaces and experiences. Over time, we won’t have to.

This paints a horrendous picture of the web and, specifically, a herculean job for the Front-End developer.  The pluralist web has made our jobs extremely difficult, and while PHP seems to be dominating the Back-End world, we won’t see real progress in the Front-End until 2015 (the year mobile devices will overtake 50% of traffic on the web).  With IE dropping off the map and ubiquitous device support for HTML5 and CSS3, 2015 will be the start of the Front-End’s web.  Of course there is more to this story than meets the eye. Mobile (I include tablets in the term mobile) isn’t the only big milestone to consider.

Where is the Web Going?

A huge trend in web applications is happening on the web right now.  More and more, sites are being developed as “single page applications”, or SPAs, using JavaScript-based MVC/MV* frameworks (Backbone.js is a popular one).  This means users never “refresh” the page, and all content is generated via JavaScript and AJAX.  The browser’s natural URL behavior is thrown out, and the navigation now acts as a content router.  These types of web applications almost mirror the glory days of Flash development, but the difference is that we no longer need plugins or extensions to achieve these experiences.  And just like the days of Flash, these types of applications have huge impacts on SEO and Back-End architecture.

SEO Implications

As we already know, it is almost imperative to have knowledge of SEO prior to building a site.  Appearing on Google’s search results pages can bring a site up from the depths of the long tail of the web to super-stardom.  If you can dominate search, you don’t have to spend a dime on SEM to generate traffic.  Obviously, social is starting to change the game, but search still dominates site traffic by a long shot.  Let’s get back on topic… what problems do SPAs pose for SEO?  Since an SPA’s content is driven purely by JS/MVC templates, and search bots don’t render Front-End JS, SPA sites are pretty much invisible to search engine crawlers.  This seems like a huge sacrifice to make in order to build a slicker experience, and you would think this type of development would almost never be considered; however, huge sites and companies are following this development philosophy.  Check out this list and see for yourself.

So why are big names building SPA sites?  Luckily, developers have found ways to get around the SEO wall with SPAs.  This great article from Thomas Davis at backbonetutorials.com gives insight on how to build SEO-friendly SPAs.  Essentially, you can combine Node.js, PhantomJS, and Backbone.js to deliver pre-rendered content pages for search engines that mirror the pages seen by users after all scripts have finished rendering.  This does raise questions about SEO ethics, black-hat SEO, and tricking crawlers.  However, as Davis argues, old SEO practices should not dictate how websites are built.  We are finally at that moment in the life of the web.  Search needs to reinvent itself for the new Front-End web, not the other way around.  As I will state later in this post, JavaScript is king and is the real backbone of the web.
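
To give a flavor of that prerendering trick (this is a generic sketch, not the exact setup from the article): a headless PhantomJS script loads the SPA, waits for the client-side templates to render, and dumps the finished HTML, which a server can then cache and hand to search bots.

    // prerender.js -- run with: phantomjs prerender.js http://example.com/page
    // Prints the fully rendered DOM so it can be cached for crawlers.
    var page = require('webpage').create();
    var url = phantom.args[0];

    page.open(url, function (status) {
      if (status !== 'success') {
        console.log('Failed to load ' + url);
        phantom.exit(1);
      } else {
        // Naively give client-side scripts a moment to finish rendering;
        // a real setup would detect when the app signals it is done.
        window.setTimeout(function () {
          console.log(page.content); // the post-JavaScript HTML snapshot
          phantom.exit();
        }, 2000);
      }
    });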

What is Node.js and what does it mean?

Node.js has the potential to be a catalyst for the age of the Front-End Developer.  Essentially, it brings JavaScript to the Back-End as a server-side language.  To repeat: Node.js turns Back-End development into Front-End territory.  Obviously, Node.js is still in its infancy, but it is already being widely accepted, and in a few years you will start to see some amazing SPAs that bring user interaction to a new level.  Node has amazing potential, and I am excited to see it take off.  If you want a better understanding of Node.js and its potential, read this article.
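For the uninitiated, the canonical Node.js example really is this small: a complete HTTP server, written in the same language you would use in the browser.

// A minimal sketch: save as server.js and run with  node server.js
var http = require('http');

http.createServer(function (req, res) {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello from JavaScript on the server\n');
}).listen(8080);

console.log('Server running at http://localhost:8080/');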

What Does All This Mean?

This means the web's designers, Front-End Developers, are now in control of the Back-End (not complete control, but a shift of power is starting) with the same language they coded the Front-End with.  Back-End architecture will have to change.  Back-Ends will become lightweight CMSs that house a powerful API.  Their only task will be to serve JSON-encoded data to JavaScript templates that load it via AJAX.  PHP already does this very well and is the most used Back-End language in the world.  It is safe to say the web will still be powered by PHP, with Node.js as a contender.  Finished renderings (after scripts run) of each virtual "page" of an SPA will be cached and served specifically to search bots.  CMSs won't be template engines, but simply JSON content creators.  Templates will all live in JavaScript-based MVCs, purely on the Front-End.  This will bring us into a world where each project needs three Front-End developers for every one Back-End developer.
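From the browser's side, that whole architecture collapses into a few lines. A hedged sketch, where the /api/posts endpoint and its fields are made up for illustration:

// The Back-End's only job is to answer /api/posts with JSON;
// the Front-End fetches it with AJAX and builds the markup itself.
$.getJSON('/api/posts', function (posts) {
    var html = '';
    $.each(posts, function (i, post) {
        html += '<article><h2>' + post.title + '</h2><p>' + post.excerpt + '</p></article>';
    });
    $('#content').html(html); // the CMS never rendered a template
});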

Again, once we get past IE's limitations and the overlords of SEO, and allow JavaScript to be the backbone of the Front and Back-End of the web, we will see the Rise of the Front-End Developer.  I hate to sound repetitive, but 2015 will mark the start of this paradigm shift.  Agencies need to realize this to survive.  The world of the web will be going through an exciting change.  In a nutshell, the web is becoming more standardized and accessible, all the while still evolving.  I look forward to this moment because it will allow developers to be truly creative again.  We won't be held back by technical limitations; we will only need to understand one language.  The web is my canvas and JavaScript is my paint.  This is the future of my web.

 


My Night at the CIMA’s 30 Oct 2012 9:22 AM (12 years ago)

A few weeks ago I found out cookmore was nominated for a CIMA.  The site has only been public for a couple of weeks, so being nominated after such a short time live is a boost to the ego, to say the least.  CIMA, the Chicago Interactive Marketing Association, holds an awards ceremony once a year to honor those who have gone above and beyond in the realm of Chicago's interactive and creative agencies.  They only give away a handful of awards a year, and the competition is fierce.  As they state on their homepage, "CIMA is Chicago's only interactive-centric professional organization dedicated to the enhancement and acceleration of business opportunities, professional development and exponential networking for the interactive marketing professionals in Chicago."  Let's move on to the actual night of mayhem.

The Ceremony

The ceremony was held at the beautiful Renaissance Hotel in downtown Chicago.  I had never been inside before, but I do remember seeing it in my past travels in the city.  Agencies from all over Chicagoland mingled in the lobby of the ballroom, where we were served appetizers alongside three open bars.  After mustering up some liquid courage, we were called into the ballroom to start the ceremony.

As you may already know from my frequent postings on Facebook, cookmore.com was nominated for "Best B2C (Business to Consumer) Website," and yes, we took home the CIMA!  Wo0t!  Last year ARS was nominated for 3 CIMAs and took home all 3.  Unfortunately, I was not able to be a part of those winning projects; I had just started at ARS when the ceremonies were scheduled.

I am so fortunate to have had the opportunity to work on this project with my talented team members.  I was the sole Front-End Developer on the project, and I am proud to say that I owned this thing.  Next year we hope to bring in 2 or more.  I know we can do it.  I can't believe that at my first opportunity to win a CIMA, I won.  It feels amazing to know that hard work really does pay off.

Oh, and check out the site for yourself at cookmore.com.

Kuma’s Corner | After the Ceremony

When we had all had our minute of fame, it was time to move on to the next venue.  Riding the high of the big win, we wanted to keep the night going to celebrate our victory.  We headed to a metal dive bar called Kuma's Corner (they have a terrible site, so be forewarned).  Kuma's is known for its amazing burgers, which all have metal-oriented names.  One of the guys ate a Metallica Burger, and another the Plague Bringer.  You order the burger "as is," and there are no exceptions.  Burgers are not the only thing on the menu; my table also ordered a large plate of mac 'n' cheese that was out of this world.  I will say it was the best mac 'n' cheese I've ever experienced.  They also have a large selection of craft beers from around the nation.  Not that I have to say this, but don't go there if you are on a diet.  The burgers are massive.

End of the Night

Overall, it was a magical night.  It was a lot of fun, and it felt like a reward for all the hard work everyone put into the project.  I want to thank Ronda and Kashif for taking us out, and to the rest of the team that worked on cookmore: Mel, Bob, Beata, Dave, Matt S., and Matt J., you guys rock.  A well-deserved victory for all of you.


Cookmore.com 28 Oct 2012 7:02 PM (12 years ago)

Last month I dedicated my life to a new site I was building at work, cookmore.com.  This project was by far the most intense I have experienced so far at ARS.  We were given very little time to complete a fully featured site.  With only a month and a half to go from wireframes, to prototypes, to a fully interactive live site, cookmore.com was a major test of our team's ability to work under pressure.  I'm proud to say that everyone kicked butt and the project went smoothly.  We even completed more than the client originally requested.  Cheers to everyone who fought hard for this project to come to fruition.

I’d like to take a moment to thank Kenmore for taking a chance with us.  Trusting us with this project is a huge step in a long and lasting partnership.  I excited for the next big project from the Sears brands’ teams.

Features | Cookmore

Responsive Design

Cookmore is a completely responsive site that works across all browsers and devices.  Play with it on your smartphone, load it up in a traditional browser, or even on a tablet.  The site responds to portrait and landscape orientations for tablets, too.  Give it a try in a traditional browser by resizing the window in real time.  Recipe and cookbook blocks resize and reorder columns automatically for a slick user experience.  Fonts and images scale dynamically, and, interestingly, during the site's first-phase release it was designed to be 100% fluid.  At one point it was both fluid and responsive.

User Generated Content

Anyone can sign up for cookmore, create recipes, share them on the web, and even group them together in digital cookbooks.  Users can manage their profiles, reset passwords, upload photos, and much more.  Feel free to offer suggestions, too; we are always looking for new ideas.  Oh, and don't forget to click the heart on those recipes you love so much!  Loving a recipe helps push it to the homepage, so make sure you participate.

WordPress CMS

Cookmore runs on a custom WordPress theme and utilizes WP's powerful admin back-end.  Site managers can edit recipes, create ad zones, manipulate the homepage, change user roles, and more, all from the back-end of the site.  We also made sure we never edited the WP core, so the site can auto-update to the latest versions of WordPress until the end of time.  This also means we can take advantage of WordPress' enormous plugin repository.  We like to build everything in-house, but just in case, we can download a plugin and install it in less than a minute.  Simplicity in UI/UX is what makes WP such a great CMS.

Custom API

As if that wasn’t enough, cookmore also has a custom API.  You can query recipes and cookbooks, log users in, display search results and much more with live AJAX requests.   We can even call display this data on other sites.  Believe me when I say, this thing is powerful and makes development a lot of fun.  Props to my partner in crime, Matt Stypa for developing the back-end of cookmore.

The site makes the best use of our API on the recipe listing pages.  These pages use infinite scroll and live category filters to load and sort content.  Give it a try!
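For the curious, an infinite-scroll listing against an API like ours boils down to something like the sketch below. To be clear, the /api/recipes endpoint, its parameters, and the markup are stand-ins I made up for illustration, not cookmore's actual API.

var page = 1, loading = false;

$(window).on('scroll', function () {
    var nearBottom = $(window).scrollTop() + $(window).height() >= $(document).height() - 200;
    if (!nearBottom || loading) return;
    loading = true;
    // Ask the API for the next page of the currently filtered category.
    $.getJSON('/api/recipes', { page: ++page, category: $('#filter').val() }, function (recipes) {
        $.each(recipes, function (i, recipe) {
            $('#recipes').append('<div class="recipe">' + recipe.title + '</div>');
        });
        loading = false;
    });
});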

Takeaways | Cookmore

I learned a lot while working on cookmore.  I wish I had more time to dedicate to certain parts of the site, but a deadline is a deadline.  Responsive design was something I wanted to spend a lot more time on; I knew how to code it, but I had never implemented it on a new site from scratch.  I also improved my knowledge of PHP and of building custom WordPress themes.  It's amazing to think where I was (as a professional developer) only a year ago.  Working with Matt Stypa was a great experience, too.  He helped push me along whenever I was stuck on the more back-end-oriented work.  I hope I get to work with him on the next big project.  Cheers, my friend!

ARS was nominated for a CIMA award for cookmore, and we will find out tomorrow if we won.  Keep your fingers crossed.

Special Thanks | Cookmore

Big thanks to Dave Janes for being my hero.  He helped me from time to time on the front-end responsive design when I just didn't have the hours to dedicate.  Dave, you are an amazing creative director, and I very much appreciate your time and talents.

 

cookmore team | ARS

Our team holding our golden spoons in chef attire. I received the "Hard Core" spoon honoring the long nights I dedicated to the project.

 


Geeb Turns 25 14 Jun 2012 11:18 AM (13 years ago)

Hey all!  Geeb here, with a status update.  It has been at least a couple of weeks since I've touched my site.  My inactivity on geebart.com is the direct result of being busy and the fact that Diablo 3 (D3) flat-out rocks my world.  Sadly, I am going to talk about it a bit (D3, that is), so strap on your nerd helmet and enjoy.

Diablo 3 and the RMAH

Currently, I have one character, a level 52 Barbarian.  Needless to say, he is my money-scrounging tank.  While the gameplay is fun, D3 is constantly undergoing changes with a steady stream of updates/patches.  The good news is, the game keeps getting more refined.

Recently the RMAH (Real Money Auction House) was released.  Yup, that's right, you can buy and sell digital items with real, home-grown American dollars!  Of course, to extract your hard-earned coinage, you must pay a one-dollar Blizzard transaction fee, plus a 15% cut to PayPal (sell an item for $10, and you pocket roughly $7.65 after both cuts).  And yes, you must use PayPal to get $$$ out of the game.  Don't be confused here; there is another, standard version of the auction house that uses in-game currency.  Unfortunately, I haven't used the RMAH yet.  Users who purchased the game digitally (not off the store shelf) have to wait 3 days before they are allowed to participate in RMAH auctions.  I'm not sure why, but it's only 3 days.

In the meantime, I have been scrounging up all types of items and gold in the regular auction house.  I have a stash filled with rares sitting on deck for entry into the RMAH.  I am hoping I can dominate easy buys in the regular auction house and then sell those items in the RMAH.  I'll keep you posted on how that works out for me.

So, what’s the big deal with the RMAH?  Well, almost any MOMRPG has had the same problem…  no matter how you try to combat it, users will create black markets in order to sell digital items, currencies, and even characters/accounts.  Diablo 3 marks the first time in gaming history where the game developers have built a real money auction house to buy and sell digital goods.  It seems like a radical experiment, but the truth is, it was inevitable.   Any tech snob will tell you that if you can’t beat a hacker, join them.  In other words, embrace demand, don’t try to fight it.   This is especially true with internet communities.  We almost always get what we want :) .

I think it is a bold move for Blizzard, and I think all gamers should be extremely excited about the RMAH.  It marks a new era in the gaming world.

Birthday Status

So… on June 19th, I will be a 25-year-old white male.  As I have told my mom, I am now a quarter of the way dead (assuming I will live longer than everyone else in my family).  It is kind of strange to think about.  I've also held my first job out of school for almost a year now, only a few months to go.  I'm still living at home, *sigh*, but I'd like to try to move out by the end of the summer.  Student loans are the drag on my soul at the moment.  I'm not sure if it's the fact that I have so much longer to go until they're gone, or the fact that I can't remember where all of that money was spent.  Either way, it is extremely depressing.  Six years of college can destroy a wallet.

 

The good news is, I am continuing to improve my web skills.  At work, I've started to do a lot of work with JSON feeds.  A JSON feed is basically a data feed that can be captured with JavaScript and then reproduced on a basic HTML page with corresponding styles.  It sounds complicated, but it allows me to make cool experiences like this: Father's Day Bundles on Craftsman.
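The whole trick fits in a few lines of jQuery.  A hedged sketch, where the feed URL and its fields are invented for the example (not the actual Craftsman feed):

$.getJSON('/feeds/bundles.json', function (feed) {
    // Reproduce each feed item as plain HTML; CSS takes it from there.
    $.each(feed.items, function (i, item) {
        $('#bundles').append(
            '<li><a href="' + item.url + '">' + item.name + '</a> - $' + item.price + '</li>'
        );
    });
});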

Business Begins

Back in April, I mentioned that I would begin making WordPress plugins and themes in an attempt to make some $$$.  Between work and the release of D3, it is safe to say I haven't had much time to explore this venture.  I am still determined to accomplish this goal; it will just take slightly longer to get started.  Besides, leveling my D3 character is important, right?  Hah!

In all seriousness… a break from geebart was good for me.  Freelance work has been streaming in, and it has allowed me to keep my mind on other things (like D3).  After my birthday weekend is over, I would like to get started on some serious development.  From website ideas, to geebart.com, to side jobs, I always have something to work on.  When I move closer to work, I will reclaim at least 1.5 hours a day that I currently waste on my commute.  That's 7.5 hours over a five-day work week!  I cannot wait.

Anyways, if I am ever frazzled, buy me a beer or something…  I’ll owe you one 😉

Thanks for stopping by,

Geeb


How to Get an Element in the Absolute Center of the Page 24 Apr 2012 5:38 PM (13 years ago)

Every developer has come across this problem: how do you get an element in the absolute center of the page?  And I don't mean just centered horizontally; I mean vertically as well.  Using margin:auto; to center horizontally is old news.  So, what if the content can be any width or height?  What if this element also has padding that is added on top of its natural width?  As you can see, positioning this element in the dead center of the page can be difficult.  Can it be done with all the extra bells and whistles like padding?  I am proud to present my "How to Get an Element in the Absolute Center of the Page" tutorial.

how-to-get-an-element-in-the-absolute-center-of-the-page

The Wrapper and margin:auto;

First, let me set up a basic HTML document to use for testing.
<html>
<head>
<title>How to Get an Element in the Absolute Center of the Page | GeebArt</title>
<style></style>
</head>
<body>
<div id="absCenter">
<div id="wrap">
<div id="innerWrap"><span>Absolute Center</span></div>
</div>
</div>
</body>
</html>

I included a style block in this HTML only because the CSS we will be using is just a few lines long in total.  Before we get into positioning the element, let's first remove some default styles from the classic <body> tag.  (Most browsers apply default styles to elements, and this is always true for <body>.)
<style>
body{width:100%;float:left;height:100%;padding:0;margin:0;}
</style>

Next, let’s add styles to our outer-most content wrapper, “#absCenter”. The goal here is to place this content above everything else on the page. To do that, we need to give it a 100% width and height, float it left and give it a position:absolute;. In some cases (especially for IE) you may also have to give this element a left:0; and top:0; to make sure the absolutely positioned element stays stuck to the far left and top side of the page. Let’s add some styles…
<style>
body{width:100%;float:left;height:100%;padding:0;margin:0;}
#absCenter{float:left;width:100%;height:100%;position:absolute;top:0;left:0;}
</style>

Next, let's create the container that will be the outer wrapper of our content, #wrap.  At this point we are going to define a width for this element.  It could be fluid, like a percentage-based width, but I am going to use a fixed amount for this tutorial: 400px.  A margin:0 auto; handles the horizontal centering, and you will also have to add display:table; to this element.  Yes… the good ol' table from the 90s and early 00s.
<style>
body{width:100%;float:left;height:100%;padding:0;margin:0;}
#absCenter{float:left;width:100%;height:100%;position:absolute;top:0;left:0;}
#absCenter > #wrap{display:table;width:400px;height:100%;margin:0 auto;}
</style>

Why a table?  Well, table cells are effectively the only elements that can vertically align their contents with plain CSS.  Crazy, right?  It is TRUE, and you will see the magic after adding the last bit of CSS: an inner wrapper set to display:table-cell; with vertical-align:middle;.  Don't forget to add text-align:center; to make your text dead center.
<style>
body{width:100%;float:left;height:100%;padding:0;margin:0;}
#absCenter{float:left;width:100%;height:100%;position:absolute;top:0;left:0;}
#absCenter > #wrap{display:table;width:400px;height:100%;margin:0 auto;}
#absCenter > #wrap > #innerWrap{display:table-cell;width:100%;height:100%;background:#822;vertical-align:middle;}
#absCenter > #wrap > #innerWrap > span{background:#888;display:block;width:100%;text-align:center;padding:10px;/*-moz-box-sizing:border-box;-webkit-box-sizing:border-box;box-sizing:border-box*/}
</style>

You may notice that the "Absolute Center" text is hugging the borders of its content box.  Naturally, you will want to add padding to give it more space.  To prevent the box from going off-center in all browsers when using padding, you may need to uncomment the box-sizing styles.  Padding is normally added on top of the declared width, but with box-sizing set to border-box, the padding is included within the width.  Pretty neat.

Unfortunately, this little CSS gem does not work in IE7 and lower.  IE8 and 9 are supported!  If your site is starting to lose IE7 visits, you may want to consider this method.  There are always JavaScript/jQuery workarounds if this method does not suit your old-browser needs; see the sketch below.
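Here is a minimal sketch of such a jQuery fallback (my own illustration, not production code): measure the element, offset it by half its size from the window's center, and re-center on every resize.

function absCenter($el) {
    $el.css({
        position: 'absolute',
        top: Math.max(0, ($(window).height() - $el.outerHeight()) / 2),
        left: Math.max(0, ($(window).width() - $el.outerWidth()) / 2)
    });
}

$(function () {
    var $box = $('#wrap');
    absCenter($box);
    $(window).on('resize', function () { absCenter($box); });
});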

That does it for my "How to Get an Element in the Absolute Center of the Page" tutorial.  If you have any questions, feel free to leave a comment below.  Thanks for stopping by!

– Geeb
