Sharing Content Module
At the moment we have one Drupal install holding all our sites (see the previous article). While this allows for easy content sharing, it isn't scalable in the grand scheme of things, so we're planning to move to an Aegir hosting system to deploy multiple different sites in an easily managed way. The problem is that we then won't be able to share content between sites, which is one of our primary requirements. Having looked around, there doesn't seem to be any module that fits our exact requirements, so a new custom module might be the way to go. What follows is a rough outline of how I see it working, but I'm very happy to hear comments and suggestions from others on how best to tackle it.
- Provide a centralised content store of information
- Make that data available to external sources
- Allow external sources to query the content available and download content.
- Maintain a link between the downloaded content and the content store and keep the two in sync.
- We need to be able to recognise when content has been changed manually and not overwrite that when doing the sync.
- Information download should support both automatic and manual modes
- Automatic: Set up parameters (taxonomy terms?) to download information automatically.
- Manual: Search for information and download
- BONUS FEATURE: Allow for syncing of content on a field by field basis. i.e. including/excluding certain fields as required.
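The bonus feature above boils down to a selective merge. The real module would be Drupal/PHP, but as a language-neutral sketch (the function name and data shapes are my own, not anything existing), field-by-field syncing might look like this:

```python
# Sketch of field-by-field syncing: apply remote field values to a
# local node, skipping any fields this site has excluded from syncing.

def merge_fields(local, remote, excluded=()):
    """Return a copy of `local` with remote values applied,
    except for fields listed in `excluded`."""
    merged = dict(local)
    for field, value in remote.items():
        if field not in excluded:
            merged[field] = value
    return merged

local = {"title": "Old title", "body": "Local edit", "tags": ["news"]}
remote = {"title": "New title", "body": "Store version", "tags": ["news", "update"]}

# Excluding "body" keeps the local edit while pulling in the new title.
merged = merge_fields(local, remote, excluded=("body",))
```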
Design the Custom Module
This is perhaps the easiest of the requirements to set up. Using CCK we can build content types for all our different types of information. We set up vocabularies and tag the information as required, thereby creating a searchable database of content. We'd also want to use the UUID module so that our content has unique IDs for the purposes of maintaining a link.
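The point of the UUID is that local node IDs differ per site, while the UUID stays the same across every copy. A quick illustration (in Python rather than Drupal, just to show the idea):

```python
# Each node keeps its local nid, but gains a stable UUID that
# survives export/import and links copies across sites.
import uuid

node = {"nid": 42, "title": "Example article"}
node["uuid"] = str(uuid.uuid4())  # canonical 36-character form
# The local nid may differ on another site; the uuid will not.
```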
Content Store Services
The content store information needs to be made available externally, and the Services module provides an ideal way to do this. We'd need to set up the following services for use by the sync module.
- Retrieve taxonomy terms
- We want to be able to allow for the filtering of content based on taxonomy term. This may just be a user friendly way to cut down content, but we may use it to restrict searching of content to predefined areas.
- Retrieve content types
- We don’t want to download content for which we have no destination. So the resulting list would be filtered depending on which content types are available on the external install.
- Retrieve content summary
- UUID, title, *content type, *taxonomy terms, created, *updated
- The items with * would be filter terms based on what had been selected in the GUI.
- Retrieve nodes
- Send a list of UUIDs and get back the content items.
There may be others that we’ll need, but for the time being that should be enough for our base requirements. As to how the data is presented to the receiver…I’m open to suggestions. I’d like to make them REST calls and return JSON, but that may change further down the development process.
This is the real meat of this project and what is going to take the most time. I'll hopefully expand on this in the future, but for the sake of time, here's a rough idea of how the manual import would work. To keep it simple, we'll assume everyone can see everything.
- Retrieve the list of taxonomy terms in order to create a combo box for the filtering of content.
- Retrieve the list of available content types and filter according to what we have available locally.
- Based on the current taxonomy and content type selection, retrieve a list of nodes from the content store, ordered by last updated date.
- Display form to user. Something along the lines of…
- User selects the content items that they want and presses "Sync Selected".
- The selected nodes are requested from the content store by UUID, and the returned content is inserted into the local database.
- Additionally, the UUIDs are logged in a separate table so we know which items to check against the content store for updates.
- If a content item is changed locally, its UUID is removed from that table and the sync link is broken.
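The steps above can be sketched in plain Python (the real thing would be Drupal hooks and database tables; the fetch function stands in for the still-undecided REST call):

```python
# Sketch of the manual import flow and the "local edit breaks the
# sync" rule. The content store is abstracted behind fetch_nodes so
# the transport details stay out of the logic.

def manual_import(selected_uuids, fetch_nodes, local_db, sync_log):
    """Fetch the chosen nodes and record them as synced."""
    for node in fetch_nodes(selected_uuids):
        local_db[node["uuid"]] = node   # insert into local database
        sync_log.add(node["uuid"])      # remember to poll for updates

def local_edit(node_uuid, changes, local_db, sync_log):
    """A manual local change breaks the sync link."""
    local_db[node_uuid].update(changes)
    sync_log.discard(node_uuid)

# Usage with a fake in-memory content store:
store = {"u1": {"uuid": "u1", "title": "Hello", "updated": 100}}
fetch = lambda uuids: [dict(store[u]) for u in uuids]

local_db, sync_log = {}, set()
manual_import(["u1"], fetch, local_db, sync_log)
local_edit("u1", {"title": "Edited"}, local_db, sync_log)
# "u1" is now in local_db but no longer tracked in sync_log.
```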
For the automatic import, we’d probably need a screen like the following.
We’d need to set up a regular cron job that polled the content store to see whether any information needed to added/updated for both the manual and automatic versions.
There is a slight alternative to the above: instead of the client having to poll the server, we could use the PubSubHubbub protocol to have updated information pushed out automatically.
As you can see, it’s a fairly complex module. I’d be more than happy to hear from anyone who thinks there are modules out there that can do this already. With so many modules available it’s easy to overlook the obvious.