Architectural changes in web applications
For an ‘old-timer’ like me, who witnessed the birth of the web and the adoption of the Internet, it’s been a challenge to unlearn some ‘rules of thumb’. I’m listing some of these as food for thought for others following a similar technical path.
Moore’s law has mutated. Technology is no longer about boosting clock speeds and capacities. Gone are the days of the breakneck races between Intel and AMD to squeeze more gigahertz out of their CPUs. The new reality is all about parallelism: multiple cores, caching layers in architectures (typically via memcached), and formal and informal means of splitting data across multiple machines (i.e. sharding, load balancing, map-reduce). Any non-trivial architecture that requires massive scalability has to build in the capability for synchronizing across distributed server components.
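To make the caching and sharding ideas concrete, here is a minimal sketch of the cache-aside pattern plus hash-based sharding. The dict stands in for a memcached client, and `fetch_user_from_db` is a hypothetical placeholder for a real database query:

```python
import hashlib

CACHE = {}     # stand-in for a memcached client
DB_CALLS = []  # tracks how often we hit the "database"

def fetch_user_from_db(user_id):
    # Hypothetical expensive lookup; a real app would query its database here.
    DB_CALLS.append(user_id)
    return {"id": user_id, "name": "user-%d" % user_id}

def get_user(user_id):
    key = "user:%d" % user_id
    user = CACHE.get(key)                   # 1. try the cache first
    if user is None:
        user = fetch_user_from_db(user_id)  # 2. on a miss, go to the database
        CACHE[key] = user                   # 3. populate the cache for next time
    return user

def shard_for(user_id, num_shards):
    # Informal sharding: hash the key to pick one of N database servers.
    digest = hashlib.md5(str(user_id).encode()).hexdigest()
    return int(digest, 16) % num_shards
```

The second call to `get_user` for the same id never touches the database, which is exactly the load reduction those caching layers buy you.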
Assemble battle-tested components, rather than build a proprietary stack. I’m surprised that people who are learning to program are still taught to write linked lists and spend time at the data-structure level. Most developers will never need that granularity of understanding, and will simply plug in the data structures from the C++ Standard Template Library, the Java Collections Framework, or whatever language they prefer to use. Obviously, this low-level knowledge is very useful if you’re working in an area that needs it, but frankly, the majority of developers do not. The existence of Service Oriented Architectures actually makes it possible to ‘plug into’ remote processing capabilities that are no longer even managed by your team. Cloud computing has taken this to another level: Amazon’s EC2 is not a bizarre anomaly, but a celebrated part of the mainstream now.
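As one example of plugging in a ready-made structure instead of hand-rolling one: in Python’s standard library, `collections.deque` gives you a double-ended queue (in CPython, built on a doubly linked list of blocks) with O(1) operations at both ends, with no node classes to write:

```python
from collections import deque

# A simple job queue: O(1) enqueue at the right, O(1) dequeue at the left,
# without ever touching a pointer or a node class.
queue = deque()
queue.append("job-1")    # enqueue
queue.append("job-2")
first = queue.popleft()  # dequeue the oldest job in O(1)
```

The same argument applies to `std::list` in C++ or `ArrayDeque` in Java: the battle-tested version is already there.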
Database normalization is passé. There was a time when people bragged about how normalized their DB was. It was a time when purists reigned. Nowadays, unless you’re tracking the world’s financial data, you don’t need that level of normalization. It’s the sign of a confident developer when they purposefully denormalize parts of their database to speed up access and reduce the burden on their server. It is possible to do this without running into excessive redundancy, stale data, and integrity problems. The art is in knowing how!
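One common flavor of purposeful denormalization is keeping a redundant counter instead of re-aggregating on every read. The sketch below uses hypothetical blog tables in an in-memory SQLite database: rather than running `COUNT(*)` over comments on each page view, a `comment_count` column on `posts` is bumped inside the same transaction that inserts the comment, which is what keeps the redundancy from going stale:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT,
                        comment_count INTEGER NOT NULL DEFAULT 0);
    CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER, body TEXT);
""")

def add_comment(post_id, body):
    # One transaction keeps the redundant counter consistent with the
    # comments table, avoiding the stale-data trap.
    with conn:
        conn.execute("INSERT INTO comments (post_id, body) VALUES (?, ?)",
                     (post_id, body))
        conn.execute("UPDATE posts SET comment_count = comment_count + 1 "
                     "WHERE id = ?", (post_id,))

conn.execute("INSERT INTO posts (id, title) VALUES (1, 'Hello')")
add_comment(1, "First!")
add_comment(1, "Nice post.")
count = conn.execute(
    "SELECT comment_count FROM posts WHERE id = 1").fetchone()[0]
```

Reads now cost a single row lookup instead of an aggregate scan, at the price of a little write-side bookkeeping.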
REST is available almost for free. There are a number of development frameworks that let your web application be offered almost immediately using the SOA model. Sure, you’ll still have the human web interface, but by building on Ruby on Rails, you get the ability for others to query your web application as if it were a remote component in their system. This reduces the interface-rendering processing, and allows collaborators to develop reliable systems that are integrated-via-contracts with yours.
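The convention this buys you: a Rails-style resource answers both the human URL and a machine-readable one (e.g. `GET /products/42.json`), so a collaborator’s client is mostly URL construction plus JSON parsing. The endpoint and payload below are hypothetical; a real client would fetch the body over HTTP with `urllib.request` or a similar library:

```python
import json

def product_url(base, product_id):
    # Rails-style convention: append .json to get the machine-readable form
    # of the same resource the browser sees rendered as HTML.
    return "%s/products/%d.json" % (base, product_id)

def parse_product(json_body):
    # The contract is the JSON shape; hypothetical fields shown here.
    data = json.loads(json_body)
    return data["id"], data["name"]

url = product_url("https://example.com", 42)
canned_response = '{"id": 42, "name": "Widget", "price_cents": 999}'
product_id, name = parse_product(canned_response)
```

No rendering pipeline is involved on either side; the two systems integrate purely through the URL and payload contract.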
These are the most interesting paradigm changes that have taken place in web architectures. Any comments on other shifts that I may have missed?