Getting Better

Black Chair (and its collective parts) has (/have) been building websites and web applications for a long time. We’ve seen a lot of change in the landscape of web technology, and we’re bound to see a lot more. We’ve also seen a lot of change in our own business, and in the way that our clients choose (or are forced) to interact with the internet. Recently I’ve been thinking a lot about where that leaves us, i.e., how I would advise us if we hired ourselves as consultants. If anyone happens to read this, I apologize both for the unformed mess (I’m trying to force myself to write publicly more often, even if it means writing with less polish) and for the technical mumbo-jumbo at the end; this is as much for my own benefit as anyone else’s (if not more).

Anyway, here’s what I would say if I were consulting for my own company:

  • We need to build tools that help us in the same way we’ll often build tools to help our customers.
  • In an industry that is constantly in flux, we need to compartmentalize our tools and processes as much as we can. We need to set ourselves up in such a way that any part of our workflow or our work product can be upgraded at any time without affecting the rest of the system.
  • We need to deliver these advantages to our clients in the form of “continuous deployment”. Keeping our clients ahead of the curve needs to be in our DNA, and we need clients who are willing to trust us with that. When that happens, we’ll be able to continuously refine our portfolio with the knowledge that we gain (and as a reflection of the ground moving beneath us), and schedule regular upgrades of varying size with all of our clients.
  • We need to choose a set of tools and platforms which are robust, reusable, and relatively future-proof, and we need to stick to that set as much as possible. One of the challenges with web development is the fragmented ecosystem: the common tools for building a marketing site don’t align as nicely as they could with the tools used for building an e-commerce platform or a real-time communication platform. But if we can find the right levels of overlap, we can master a small, flexible set of tools instead of clumsily wielding several different ones. (This sounds counter-intuitive because “tool” conjures what I think is an inappropriate analogy, where “using the right tool for the job” requires a garage or basement full of tools. “Instrument” may be more appropriate here, and few musicians are able to master more than a few; this approach can only be scaled by working with masters of different instruments, as is the case with a band or an orchestra… or a company.)

Some of the requirements we’ll commonly need to meet include:

  • Multi-platform access, i.e., exposing functionality and resources to users across a wide range of operating systems and devices, both through a web browser and as native apps.
  • Real-time messaging. Events frequently need to be propagated throughout a distributed system.
  • Public resources must be easily crawled, e.g., by search engines.
  • Websites will require an intuitive content management interface.
  • In general, user interfaces must be beautiful as well as extremely functional. This will often require re-inventing the visual style of a product over time.

Some general thoughts about how to meet those requirements while also keeping the first set of points in mind (none of these ideas are novel; I’ve just cherry-picked existing ones which make sense for our requirements):

  • There should be strict separations between data, the rules for accessing that data, the logic by which it is manipulated, and the logic by which it is displayed. This will facilitate upgrading any part of the system as discussed earlier.
  • Most data should be stored without a pre-imposed structure, and therefore it should be consumed without making any assumptions about its structure. That way the structure can change over time. (Some data needs structure; it should be identified as early as possible.)
  • As a corollary to the above, most data should be accessible through ad-hoc search queries (a sketch follows this list). This is as true for searching the content of a website as it is for segmenting time-series or relational data.
  • Content delivered over the web needs to be loaded very quickly. We should be able to do far better than the status quo if we approach this thoughtfully.
  • Data should be cached locally as often as possible, to minimize network requests. This is as true for the application server (minimizing requests to the database server) as it is for the client (web/mobile/etc.).
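
To make the last two points concrete: a CouchDB-style map function (CouchDB being the database suggested below) can index loosely structured documents for ad-hoc queries without ever declaring a schema. This is just an illustrative sketch; the design document and field names are made up.

    // Illustrative CouchDB design document (names are made up). The map
    // function indexes any document that happens to carry a "type" field
    // and ignores the rest of its structure, so documents can evolve freely.
    {
      "_id": "_design/queries",
      "views": {
        "by_type": {
          "map": "function (doc) { if (doc.type) { emit(doc.type, null); } }"
        }
      }
    }

Querying GET /mydb/_design/queries/_view/by_type?key="post" then returns the IDs of matching documents (add include_docs=true to fetch the documents themselves), all without the application assuming anything about their shape.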

A suggested approach, which is an extension of our current direction:

  • Standardize around the node.js ecosystem. This is where much of the development around real-time systems seems to be focused, and it allows for maximum code re-use. NPM makes it easy to update dependencies automatically while controlling their versions (see the package.json sketch after this list).
    • Consider using a specialized node hosting service, or else commit to managing AWS very well ourselves.
    • There will be exceptions where Python makes more sense because its libraries are more mature (e.g., legacy integrations and machine learning).
  • Serve everything over SSL. (It’s not that expensive, it’s better for user privacy, and it’s easier than fighting with certificates anew whenever the need arises.)
  • Serve static assets over a CDN like Cloudflare.
  • Build API-first to separate data and access control from application and display logic. This will make it easier to build for different platforms (e.g., mobile apps) later.
    • Use CouchDB hosted on Cloudant to help with this (a short sketch follows this list). Not only does it provide an HTTP API out of the box (with authentication and full-text search included), but it also scales well for projects that require large databases.
    • When an SQL database makes sense, use Postgres because it is currently the most reliable open-source relational database.
    • For real-time components of the API, use socket.io (also sketched after this list).
  • Use Airbnb’s Rendr to render Backbone apps on the server, which speeds up initial page loads and makes pages easy for search engines to crawl.
    • Cache data server-side using Redis, and use Redis exclusively for projects with small data sets (see the cache-aside sketch after this list). This will enable incredibly fast load times.
    • Use PouchDB client-side whenever it’s appropriate, to minimize the amount of data that needs to be sent over the wire. This is especially helpful over mobile connections.
    • Use Lunr.js for client-side search (in conjunction with PouchDB, presumably) to further reduce server requests. A combined sketch appears at the end of this list.
  • Develop user interfaces using UI frameworks like Foundation or Pure CSS. This will help keep future cosmetic changes as simple as possible, and facilitate responsive (mobile-friendly) designs.
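
To flesh out a few of the points above, starting with NPM: a package.json along these lines (package names and version ranges are illustrative, not a prescription) pins every dependency to a known-good range, so any one piece can be upgraded deliberately without disturbing the rest:

    {
      "name": "example-app",
      "version": "0.1.0",
      "dependencies": {
        "express": "3.x",
        "socket.io": "~0.9.16",
        "redis": "~0.8.0",
        "nano": "~4.2.0"
      }
    }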
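
For CouchDB on Cloudant, here’s a minimal sketch using the nano client (one option among many; since CouchDB speaks plain HTTP and JSON, any HTTP client works, which is part of what makes it a good fit for an API-first approach). The account, credentials, and document fields are placeholders:

    // Connect to a Cloudant account over HTTPS (placeholder credentials).
    var nano = require('nano')('https://username:password@username.cloudant.com');
    var posts = nano.db.use('posts');

    // Store a document without imposing a schema up front.
    posts.insert({ type: 'post', title: 'Getting Better', body: '...' }, function (err, body) {
      if (err) return console.error(err);
      console.log('saved', body.id, 'rev', body.rev);
    });

    // Read a document back by ID over the same HTTP API.
    posts.get('some-doc-id', function (err, doc) {
      if (!err) console.log(doc);
    });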
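
For real-time event propagation, socket.io reduces the server side to a few lines. A minimal sketch (the port and event name are arbitrary):

    var io = require('socket.io').listen(8080);

    io.sockets.on('connection', function (socket) {
      // Relay each incoming event to every other connected client.
      socket.on('update', function (data) {
        socket.broadcast.emit('update', data);
      });
    });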
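
For server-side caching with Redis, the usual cache-aside pattern looks roughly like this (a sketch: the key prefix, the 60-second expiry, and the fetchFromDb helper are all hypothetical):

    var redis = require('redis');
    var client = redis.createClient();

    // Check the cache first; on a miss, hit the database and cache the result.
    function getDocument(id, fetchFromDb, callback) {
      client.get('doc:' + id, function (err, cached) {
        if (!err && cached) return callback(null, JSON.parse(cached));
        fetchFromDb(id, function (err, doc) {
          if (err) return callback(err);
          client.setex('doc:' + id, 60, JSON.stringify(doc)); // expire in 60s
          callback(null, doc);
        });
      });
    }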
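
And finally, PouchDB plus Lunr.js together on the client. The sketch below pulls a local replica of a remote database, then builds a Lunr index from it so that searches never touch the network. The URL and field names are placeholders, and the exact replication events vary between PouchDB versions:

    // Assumes pouchdb.js and lunr.js are already loaded on the page.
    var db = new PouchDB('posts');

    // Pull a local copy of the remote database, then index it for search.
    db.replicate.from('https://username.cloudant.com/posts')
      .on('complete', function () {
        db.allDocs({ include_docs: true }, function (err, result) {
          if (err) return console.error(err);

          var index = lunr(function () {
            this.ref('id');
            this.field('title', { boost: 10 });
            this.field('body');
          });

          result.rows.forEach(function (row) {
            index.add({ id: row.id, title: row.doc.title, body: row.doc.body });
          });

          console.log(index.search('deployment')); // never leaves the browser
        });
      });

With the replica and the index both living in the browser, search stays instant even on a flaky mobile connection.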

So, in summary, the goal is to find a happy middle ground between the tools and processes involved in building a static HTML website vs. a CMS-driven website vs. a web application vs. a full-fledged web platform with web and mobile apps connecting to it in real time. This approach would let us standardize around a single set of tools while giving us the flexibility to use the proper combination of them for any project, and to easily expand upon existing projects as required (as opposed to starting over with a new technology).
