Spurred by the emergence of Internet-based messaging services such as Twitter or Facebook's status update feature, the idea of activity streams has recently gained momentum.
[see link to Slideshare at the end of this post]
Activity streams capture the idea of bundling status updates from a wide variety of sources: human-generated messages (as in microblogging or social networking) as well as machine-generated updates, such as software status updates (e.g. from ERP or project software), sensor data, or other machinery that sends information relevant to decision makers. The idea is to bundle much of this information in one stream, available to whoever needs it – one integrated stream of real-time information relevant for decision making. The obvious motivation is to do away with the sheer abundance of systems and locations people have to visit to find the information they need for their daily business.
“One stream to feed them, one stream to serve them, in real-time to inform them.”
But this is as far as the idea goes so far. And it leaves open far too many questions. Having one stream is good, but how do I decide what is important for me in the abundance of information that speeds past my eye every split second? Granted, I no longer have to log on to many systems. But at least in those systems I had my data – what was relevant for me. Now we have created one big stream. How do I tap into it? Will I have to follow stuff, as on Twitter or Facebook? Make friends with the ERP system? But how? Who or what do I follow in order to receive whatever relevant information comes from the various systems? Have we thought this through?
“One stream to overload them, one stream to blind them, in information to bury them.”
This is where we bring in the analogy of cross docking (X-Docking) – not so much as a solution for implementing activity streams, but as a framework that will help us identify the areas that need more thought, more work, more research, and creative solutions.
What is X-Docking?
X-Docking is a retail industry concept, often referred to as the warehouse without inventory. The idea is to speed up distribution between manufacturers and retail outlets. Rather than having manufacturers deliver their merchandise in small quantities directly to the outlets (which is inefficient), or in large quantities to a warehouse where it is stocked and then passed on to the outlets whenever they need it (which is costly), manufacturers deliver only what the outlets need on a day-to-day basis, and distribution happens instantly. This is how it works:
- All manufacturers deliver their products in full truck loads, during a predefined time period (e.g. the morning), to the X-Docking area (this is called the inbound delivery).
- The X-Docking area is a large warehouse (not with shelves but with an automated sorting system using transport belts) where items are not stored but sorted, so that full truck loads can leave to the outlets later on the same day.
- The outbound deliveries consequently contain a mix of products with whatever the respective outlet needs.
How does X-Docking relate to activity streams?
Actually, there are many analogies, and the principle works quite well as a framework. First of all, the goals of X-Docking and Activity Streaming are much the same: 1) consolidate item flows from different sources, 2) centralise the flow of items in one place, 3) facilitate near-time distribution rather than storing items, and 4) ship items to the right customers.
We can take the analogy further and examine the structural correspondences. In doing so, we will expose areas in which we have to find answers to critical, open questions with regard to turning the idea of Activity Streaming into a workable design that can be usefully applied to real-life scenarios:
- Sorting/streaming: In X-Docking, the sorting facility is an essential, defining part of the design, but is generally not where the difficulties for implementation lie. Similarly, the actual activity stream itself is important, but ultimately will not pose the main challenges for implementation. The problems lie at the two ends (inbound and outbound) of the system.
- Inbound delivery: In X-Docking all inbound items are already assigned to a customer (e.g. products are barcoded with information regarding the receiver). Similarly, in activity streams data items need to be enriched with meta information in order to enable distribution to the right receivers. This issue is critical and it is all but trivial to achieve, as we will see.
- Outbound delivery: In X-Docking the retail outlets only receive what they actually need. However, this is only achieved if the outlets have the capability to identify their needs and communicate them in real-time to the manufacturers who prepare the deliveries. Similarly, activity streaming will only work if there is a way for users to articulate their information needs and for these to be translated and propagated through to the information sources.
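As a thought experiment, the inbound "barcoding" idea – enriching every incoming item with metadata before it enters the stream – can be sketched in a few lines of Python. Everything here (the `StreamItem` structure, the `enrich` function, and its per-source rules) is a hypothetical illustration, not a proposed design:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class StreamItem:
    source: str                      # e.g. "twitter", "erp", "sensor"
    body: str
    created_at: datetime
    tags: set = field(default_factory=set)

def enrich(item: StreamItem) -> StreamItem:
    """Attach per-source metadata so the item can be routed later.
    The rules below are invented for illustration; real adapters would
    be far richer (NLP for tweets, schema mappings for ERP events, ...)."""
    item.tags.add(item.source)
    if item.source == "twitter":
        # Lift author-supplied hashtags into the item's tag set.
        item.tags.update(w.lstrip("#").lower()
                         for w in item.body.split() if w.startswith("#"))
    return item

item = enrich(StreamItem("twitter", "Smoke sighted near the freeway #bushfire",
                         datetime.now(timezone.utc)))
# item.tags now contains "twitter" and "bushfire"
```

The point of the sketch is the structural role of `enrich`: like the barcode in X-Docking, the metadata is what later makes delivery to the right receivers possible at all.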
Exposing challenges in achieving the Activity Streaming vision
The analogy proves to be useful; however, implementation will pose far greater challenges in Activity Streaming than in retail X-Docking. For example, while retail outlets today are quite capable of determining their product needs and sending orders (or inventory data) electronically to the various manufacturers, the game becomes far more complicated and messy in the activity streaming world. Here is a list of challenges that become obvious upon critical examination of the analogy:
- Determine information needs: Depending on the usage scenario (e.g. project management, process management, or even highly contingent scenarios such as disaster management), information needs can emerge instantly (e.g. a fire is reported), be very contextual (e.g. focused on a geographical area), and might change rapidly (the fire spreads, the scenario changes from fire containment to evacuation). Consequently, a mechanism is needed to propagate such changing needs (and changing meta data requirements) to the data sources. But how can such information needs be determined or articulated by the users? Manually, automatically? In what way? With what devices? How are they passed to the sources?
- Information tagging and meta data: Inbound data items need to be enriched with meta data (tagging). Depending on the source (software, people on Twitter, sensors delivering status updates etc.) such mechanisms will likely be very different. How is this tagging going to work? How can tagging tie in with articulation of information needs? And will this be achieved in real time in time critical scenarios?
- Data filtering and contextual deliveries: Sorting the stream to derive outbound deliveries is a key activity in X-Docking. In Activity Streaming this means that users need an effective mechanism to filter the stream, which is likely to be massive, in order to be only delivered such data that is relevant, not just in general, but in a specific context (e.g. when evacuating a suburb the decision makers will have to have ready access to and focus only on information relevant to this task). How will filtering work? How does it tie in with tagging? The two are closely related, as the one will not work without the other.
- Heterogeneous nature of receivers: Unlike in X-Docking, what we will ultimately call a receiver or user of information in Activity Streaming is likely to show much more variance. Sure, individual users are what we have in mind, but outbound streams might also be organised for organisational roles, entire teams, divisions, projects, or other contexts. How will information needs be determined for these different user groups? How can such information be represented? Who decides on information needs?
- Heterogeneous nature of senders: Again, the complexity is likely to be much higher than in X-Docking, since all sorts of systems and people can act as information sources. Major challenges here are 1) tapping into external, public sources (e.g. Twitter), which are third-party administered, and 2) making data from legacy systems available in reliable form. As for Twitter, for example: in a disaster scenario, people will spontaneously start sharing valuable information (e.g. regarding fire spread) using hash-tags they invent, which can gain widespread use in a matter of minutes. An Activity Streaming solution needs a mechanism to pick up the emergence of such tags if it is to retrieve relevant, scenario-related information from Twitter and add it to the stream (e.g. information valuable in determining evacuation corridors). How can a socio-technical solution facilitate such information access from external sources? Will learning algorithms facilitate streaming of such information?
- Integration with user environment: In retail X-Docking, once the products have been delivered to the supermarket, they are put on the shelves. That's it. The use of information, however, is very different; it ties in with users' work practices, and so determining information needs, filtering, tagging, and information consumption need to become part of such work practices. And work practices differ widely across contexts – office workers, consultants travelling, fire fighters in the field, decision making teams in a control room. How can activity streams be made available in such different scenarios? And with various devices? How will integration with existing software, applications, and work practices be achieved?
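To make the interplay between tagging and filtering concrete, here is a deliberately minimal Python sketch of the outbound side. The `deliver` function and its subset-of-tags semantics are assumptions made for illustration only – they stand in for whatever filtering mechanism a real design would adopt:

```python
# Each stream item carries tags (attached during inbound enrichment);
# an information need is expressed here as a set of required tags.
stream = [
    {"body": "Fire front crossing Hill Rd", "tags": {"bushfire", "geo:north"}},
    {"body": "Quarterly ERP batch finished", "tags": {"erp", "finance"}},
]

def deliver(stream, need):
    # AND semantics: an item is delivered only if it carries every
    # required tag. Real systems would need richer query languages,
    # ranking, geo filters, and access control on top of this.
    return [item["body"] for item in stream if need <= item["tags"]]

evacuation_need = {"bushfire"}
print(deliver(stream, evacuation_need))  # ['Fire front crossing Hill Rd']
```

Note how the sketch only works because the inbound items were tagged first – filtering and tagging are indeed two halves of one mechanism, as argued above.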
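The emergent-hashtag problem can also be sketched: compare tag frequencies in a recent window against a baseline window and flag sudden surges. The `emerging_tags` function and its thresholds are invented for illustration; real burst-detection algorithms are considerably more sophisticated:

```python
from collections import Counter

def emerging_tags(recent, baseline, min_count=3, ratio=5.0):
    """Flag tags whose frequency in the recent window greatly exceeds
    their baseline frequency -- a crude stand-in for burst detection.
    Unseen baseline tags get a small pseudo-count to avoid division
    issues and still let brand-new tags surface."""
    recent_counts = Counter(recent)
    base_counts = Counter(baseline)
    flagged = []
    for tag, n in recent_counts.items():
        if n >= min_count and n >= ratio * base_counts.get(tag, 0.2):
            flagged.append(tag)
    return flagged

# Hypothetical scenario: "qldfires" appears out of nowhere during a fire.
baseline = ["traffic", "weather"] * 10
recent = ["qldfires"] * 6 + ["traffic"] * 2
print(emerging_tags(recent, baseline))  # ['qldfires']
```

A mechanism of this kind could feed newly flagged tags back into the inbound side, so the stream starts retrieving scenario-relevant tweets within minutes of a tag's invention.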
The above list is certainly not exhaustive, but is intended to provide a first starting point (or many!) for discussion. Of course, many of these questions can only be answered contextually, as answers will differ greatly across use scenarios. Moreover, since the concept requires holistic thinking and complementary competencies, only collaborative efforts between stakeholders – users, system developers, platform providers, and researchers (from both academia and industry) – will yield the right results to facilitate the Activity Streaming vision.
What do you think? Leave your comments.
Get the Slideshare presentation on the topic as well.