Whatever tools we have available right now for making sitemap diagrams and mapping site architecture just don’t seem to inspire. At least, that’s how I’ve been feeling. How do you take the site architecture design process to a more inspirational level through more creative visualisation?
Hoping to find an answer, and some sweet inspiration, I started checking out different ways to visualise and generate sitemaps on the interwebs. Along my travels I’ve found some really interesting site map diagrams and I’ve been looking at various methods to document site maps, using commonly (and less commonly) available tools.
Why document a site architecture diagram?
A nicely documented site architecture diagram can give you a complete enough view of your current sitemap to identify potential problems with PageRank flow, pages with very few internal links, and content too many clicks away from your homepage. It can also make you think differently, and (for me anyway) anything that inspires me to think differently about something I’ve done the same way for a long time is a good thing. By creating a diagram of your site, you can solve existing problems, add links to flatten your site architecture, and use your work later on to plan a better site redesign.
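Those checks are easy to automate once you have a map of your internal links. Here’s a minimal, hypothetical sketch (the link map below is invented, not from a real crawl) that counts inlinks and measures click depth from the homepage with a breadth-first search – pages with zero inlinks, or no depth entry at all, are exactly the kinds of problems a good diagram surfaces:

```python
from collections import deque

# Hypothetical internal-link map: page -> pages it links to.
links = {
    "home": ["about", "blog"],
    "blog": ["post-1", "post-2"],
    "post-1": ["post-2"],
    "about": [],
    "post-2": [],
    "orphan-page": [],  # nothing links here
}

# Inlink counts: pages with few inlinks receive little PageRank.
inlinks = {page: 0 for page in links}
for targets in links.values():
    for t in targets:
        inlinks[t] = inlinks.get(t, 0) + 1

# Click depth from the homepage, via breadth-first search.
depth = {"home": 0}
queue = deque(["home"])
while queue:
    page = queue.popleft()
    for t in links.get(page, []):
        if t not in depth:
            depth[t] = depth[page] + 1
            queue.append(t)

print(inlinks)  # orphan-page has 0 inlinks
print(depth)    # orphan-page never appears: unreachable from home
```

Swap the invented dictionary for real crawl output and the same few lines flag weakly linked pages and anything buried too many clicks deep.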
One of the first tools I came across was Graphviz, a piece of software (with its own language) designed to represent “structural information as diagrams of abstract graphs and networks”. Bottom line is, if you can get your head round the language, “DOT”, then you should be able to produce some amazing architecture diagrams like this (click to enlarge):
I really liked the way the deeper content surrounds the homepage at the centre, rather than the homepage sitting at the top of the diagram. PageRank doesn’t flow “down” your site architecture; it’s distributed around it. Cool.
Using Graphviz is rock hard though – you’ll need some patience to produce something like the diagram above. There’s a lot of messing around learning DOT, which is a very simple language but requires some further code-driven tomfoolery to get the design / layout of the visual just right. To make life apparently easier, Xenu’s Link Sleuth supports an export file type compatible with Graphviz, thanks to contributor Kevin Niehage – but every export I tried ended in hilarious (crash-related) results.
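To give a flavour of what you’re signing up for, here’s a minimal sketch that writes DOT source by hand from a Python dictionary – the page names are invented for illustration. Feeding the output to Graphviz’s radial “twopi” engine (e.g. `twopi -Tpng sitemap.dot -o sitemap.png`), with the `root` attribute set, is one way to get the homepage-at-the-centre layout rather than a top-down tree:

```python
# Hypothetical sitemap: page -> pages it links to.
links = {
    "home": ["about", "blog", "contact"],
    "blog": ["post-1", "post-2"],
    "about": ["home"],
}

def to_dot(links, root="home"):
    """Build DOT source for a directed site graph.

    The graph-level `root` attribute tells radial layout
    engines like twopi which node to place at the centre.
    """
    lines = ["digraph sitemap {", f'  root="{root}";']
    for src, targets in links.items():
        for dst in targets:
            lines.append(f'  "{src}" -> "{dst}";')
    lines.append("}")
    return "\n".join(lines)

dot = to_dot(links)
print(dot)
```

Generating the DOT programmatically like this spares you most of the hand-editing; the tomfoolery is then confined to layout attributes.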
Perhaps a little less exciting, but far simpler, is Microsoft Visio. Visio actually crawls your site and generates a sitemap, just like that! Admittedly, the first results tend to be a bit useless, but with the discovery of the list view toolbar, you’re set:
Mapping out a sitemap in Visio is simple – here’s the step-by-step:
1) Open Visio and select Networks > Web Site Map – the dialogue that appears looks like this:
2) Click OK (there are some settings to play with, but let’s not cover those – you don’t need to change much, if anything):
On sites of a reasonable size, you can produce something visually appealing for your clients quite quickly.
Some of the most impressive visualisation tools I found out there seem not to exist any more – NicheWorks being a good example. According to this site:
NicheWorks is an interactive tool for visualising massive networks with hundreds of thousands of nodes. It was developed by Graham Wills at Bell Labs. The screenshots here show a NicheWorks visualisation of the network structure of a large Web site.
Check out the graphics produced by NicheWorks (click to enlarge):
By far the coolest visualisation tool of them all (and I really, really hope it gets some more development one day) is WebTracer2. According to its homepage:
WebTracer is a project based on the intention to visualise the structure of the web. There are many applications that analyse websites for structural integrity and diagnostic purposes, but few reveal the visual structure that web hypertext creates. WebTracer represents this structure as a three-dimensional molecular diagram, with pages as nodes (atoms) and links as the strings (atomic forces) that connect those nodes together.
Check out these visuals (click to enlarge):
These images are not static! In fact, they’re interactive. Webtracer is a really exciting piece of visualisation software, so how do you make a pretty picture like this?
The way the tool works is incredibly simple. Download the WebTracer2 ZIP file and extract it to a directory somewhere on your PC. Open the “Spider” application and enter your URL, like this:
The spider sets about crawling, ready to be told when to save its data to the /Maps subdirectory. The crawl isn’t too aggressive – one or two pages per second is quite reasonable. I’d love to take the crawler for a spin on one of the big sites I work with when I have the time to allow for the crawl.
Once you’ve crawled your site, click “Save Current Trace” and go back to the directory you extracted the ZIP file to. In there you’ll find another executable called “Visualiser”. Run the EXE and follow the on-screen prompts to find your newly created map file. Here’s one I made after crawling our site, recorded with Camtasia and uploaded to YouTube:
The two double-helix-like chains of spheres running through the middle of the space are pages linked to frequently, typically navigational elements. There are other, smaller chains of spheres which seem to be the most recent posts. From there, the spheres (nodes) decrease in size the less often they are linked to. Typically the least-linked-to nodes sit on the outermost periphery of the space. These are the pages / posts that I don’t mention very often (if at all), so most likely they’re linked to from one or two category pages at most.
I really like the way you can mouse over the different spheres to find out their URL – great for exploration and a good way to kill some time, too.
Other (surprising) inspirational sources: Flickr
photo credit: netfuel
Peter’s favourite “hard to beat” method of sitemap creation was simply to put your pages up on the wall:
photo credit: jimbola
There’s much inspiration left to discover
Just by looking around the internet and discovering some of these applications, I quickly began to realise the gap in the market for an up-to-date architecture visualisation tool. Imagine the power of a tool that could crawl and recrawl your site, visually identifying areas of your architecture that need work from the perspective of PageRank flow. Extra points for anyone who can build chronological crawl data to check for orphaned content – still a huge problem with large-scale dynamic sites. Inspiring stuff.
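No such tool exists yet, but the chronological-crawl idea is easy to sketch: keep the set of URLs each crawl reaches and diff it against the pages you know exist (from a CMS export or your XML sitemap, say). All the URLs below are invented for illustration:

```python
# Hypothetical data: URLs the CMS says exist, versus what each crawl reached.
known_pages = {"/", "/about", "/blog", "/blog/post-1", "/blog/old-post"}
crawled_this_week = {"/", "/about", "/blog", "/blog/post-1"}
crawled_last_week = {"/", "/about", "/blog", "/blog/post-1", "/blog/old-post"}

# Orphans: pages that exist but the crawler can no longer reach via links.
orphans = known_pages - crawled_this_week

# Newly orphaned: reachable in the last crawl, unreachable in this one.
newly_orphaned = crawled_last_week - crawled_this_week

print(sorted(orphans))         # ['/blog/old-post']
print(sorted(newly_orphaned))  # ['/blog/old-post']
```

Two set subtractions are the whole trick – the hard part, on a large dynamic site, is the crawling and record-keeping around them.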