ABSTRACT: today it is unacceptable to have any application built as an independent island of functionality. It is necessary to expose its functionality via simple protocols (HTTP).
Really, it is unacceptable today to have any application built as an independent piece of functionality. No application is or should be complex enough to do EVERYTHING. Thus, since we do not view our users as the mandatory human servants of the omnipotent computer -- with the noble lifetime goal of clicking through the menus, dialogs, wizards and buttons that the all-mighty programmer has imposed on them -- all application components NEED TO OFFER API-based access to a basic set of services/functions/objects/whathaveyou.
These span a large number of functional areas -- controlling the application remotely, scripting its behavior, input/output/logging etc. -- so we'll take them one by one in future posts.
The point being that applications become interoperable and subject to automation. There's really no other way to automatically turn down the volume of the DVD player because you answered a call on Skype! Or to get a tweet on your cellphone when the torrent has finished downloading the latest CBC show.
--- Interoperability == standardisation?
This thinking is second nature to any Unix hacker, since they always did "find|grep|cut|wc|sort", but it seems too complicated for anybody else to comprehend, especially window-heads.
With physical devices, standardisation is, granted, not easy. Standards arise from a need which is already fulfilled by existing devices, cables and plugs. Then the plugs get standardised and devices become interoperable.
The same was true in the software world (find|grep), but it's been false for a long time, especially since the advent of the stupidifying mouse and the un-parseable pixel.
Command lines have always been around, and HTTP has also been around for a long time. There really is no reason not to combine both, when all you want is for users to use the functionality your code has to offer.
Ideally, all applications would be certified as "open" and "interoperable", and we will get there, in time. DLNA/UPnP is an example of such an interoperable framework, including certification. So is OSS/J (a set of telecom APIs), and there are others.
--- using http and rest to describe and interact with the object-oriented world
So, what should ideally happen? We'd have unified protocols for access and interoperability: HTTP, telnet and command lines come to mind. Formats are also an obvious need: HTML/XML/JSON/text. Let REST dictate how things work, and mandate only a modicum of human intervention.
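To make that concrete, here is one possible mapping of objects onto URLs -- the paths and the "player" object are invented for illustration, not a standard:

```
GET  /dvd/player         -> describe the player object (HTML for humans, XML/JSON for programs)
GET  /dvd/player/volume  -> read the "volume" attribute (content)
POST /dvd/player/mute    -> invoke the "mute" method (control)
```

Safe reads go over GET, state-changing invocations over POST, and the same URLs serve both people and programs depending on the format requested.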
Beyond the needs, there are the wants. We'd like each application to expose its internal models (objects, services, methods, actions, functionality) in a standard way, parseable and understandable by others.
If you extend that to the interoperable web, you get the semantic web. Well, almost, since that's been designed by DBAs - they'd like to call it the "data web" and hide logic in a view that offers more data :). Sorry, couldn't help it!
I live in a world surrounded by objects I'm interacting with. Data/information has a very important role to play, but it's not the be-all and end-all. My newspaper can not only filter the latest developments on that accident for me, but can also manage my account and micro-payments. Having to know about an intermediary PayPal only serves to confuse me and keep my mind busy with concepts I don't care about, since I already told my "gate keeper" that I trust my newspaper somewhat...
The semantic web guys have it right, though. Exposing standardized schemas of an application's objects and functionality is the way to go. Scoping these (my "train" is different from your "train") makes obvious sense (think namespaces).
---- So, vat do you vant?
In the B2B universe (now rebranded SOA) there are lots of WS-* standards, covering security, identity exchange etc. These don't fare well on the web, though. Why write things twice? It has to be simple-over-HTTP. PERIOD, dummy.
So, each modern application has:
- embedded http server
- exposing functionality/internal models etc
- designed to be interoperable
I would like everyone to stop buying and using any application that doesn't meet these criteria. Harsh, but then I'm in a bad mood today...
------ So, what's a developer to do?
Embed an http server. In case you need a small, lightweight embedded web server, check out my public project at http://razpub.googlecode.com . There are lots of others out there -- actually, I recommend you get one that supports the servlet standard and write servlets.
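If a servlet container feels heavy, even a scripting language's standard library is enough to embed a server. Here's a minimal sketch in Python -- the "/player" endpoints and the volume state are hypothetical examples, not razpub's API:

```python
# A toy application embedding an HTTP server that exposes one object.
# The "/player" paths and the volume state are made up for illustration.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

STATE = {"volume": 50}  # pretend application state: a media player

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Content: read an attribute's value.
        if self.path == "/player/volume":
            self._reply(200, {"volume": STATE["volume"]})
        else:
            self._reply(404, {"error": "no such object"})

    def do_POST(self):
        # Control: invoke a method on the object.
        if self.path == "/player/mute":
            STATE["volume"] = 0
            self._reply(200, {"volume": STATE["volume"]})
        else:
            self._reply(404, {"error": "no such method"})

    def _reply(self, code, obj):
        body = json.dumps(obj).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

def start_server(port=0):
    """Run the server on a daemon thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), ApiHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

Once that's running, any other program -- or plain curl -- can read and drive the "application": `curl -X POST http://localhost:<port>/player/mute`. That's the whole point: the volume knob is now scriptable from the outside.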
Think about and define your business model; spell out your domain entities. Use fewer services and more objects when defining your API, and play nicely with REST.
Define your model in whatever format you want - just make sure it's an xml file :). We'll deal with these in a future post, but for now it must be object-oriented: each 'class' has 'attributes' and 'methods'. The methods have a name and a list of arguments. Keep all types to String for now... assume all interaction is via URLs, which can't marshal bytecode - that's something we'll have to deal with later.
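As a sketch, such a model file could look like this -- the element names, the namespace and the "Player" class are all invented for illustration, not a fixed schema:

```xml
<model ns="com.example.media">
  <class name="Player">
    <attr name="volume" type="String"/>
    <method name="mute"/>
    <method name="play">
      <arg name="url" type="String"/>
    </method>
  </class>
</model>
```

Note the `ns` attribute doing the scoping discussed earlier: my "Player" lives in my namespace, so it can't be confused with yours.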
Offer access to the entire business model via http, including content (values) and control (invoked methods).
Document all this very nicely in an embedded set of HTML pages. No smarts necessary. Simple solutions always work better than complex ones.
If you're looking for an asset/modelling/http object interface framework, check out the com.razie.pub.assets package of my razpub project.
Stick to the REST principles for now, until my "REST is bad" post is posted.
Modularity - allow extensions of functionality. Since the future is OSGi, may I suggest an OSGi-compliant server - mine will be, but isn't yet.
And generally, don't forget to have fun!
P.S. Just to give a concrete example, the VLC player is a great one. It has several interfaces, including telnet and http, and you start whichever you want...