We've represented the dreams using what we call an 'oneirogram', developed by David Patman and Noah Pedrini. This is a force-directed diagram that uses AlchemyAPI's natural language processor to analyse themes and concepts in the dream text.

We aggregate the text of all dreams matching the selected parameters (date, location, tags) and determine thematic tags for the 'combined dream'. The combined dream's tags are then compared with the tags for each individual dream. If a dream shares a tag with another dream, a line (or 'edge') is drawn between them. The more connections a dream or tag has, the larger its node is drawn.
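The edge-building step above can be sketched in plain JavaScript. This is an illustrative reconstruction, not the project's actual code; the dream objects and tags are hypothetical examples:

```javascript
// Hypothetical dreams, each with thematic tags (e.g. from AlchemyAPI).
const dreams = [
  { id: 'dream1', tags: ['water', 'flying'] },
  { id: 'dream2', tags: ['water', 'falling'] },
  { id: 'dream3', tags: ['flying'] },
];

// Draw an edge between any two dreams that share at least one tag.
const edges = [];
for (let i = 0; i < dreams.length; i++) {
  for (let j = i + 1; j < dreams.length; j++) {
    const shared = dreams[i].tags.filter(t => dreams[j].tags.includes(t));
    if (shared.length > 0) {
      edges.push({ source: dreams[i].id, target: dreams[j].id, tags: shared });
    }
  }
}

// Node size grows with the number of connections (the node's degree).
const degree = {};
for (const e of edges) {
  degree[e.source] = (degree[e.source] || 0) + 1;
  degree[e.target] = (degree[e.target] || 0) + 1;
}
```

Here `dream1` ends up with the largest node, since it shares 'water' with `dream2` and 'flying' with `dream3`.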

Node colours are associated with colours mentioned in the dreams. The pitch of each musical tone 'played' by an individual dream is proportional to the dream's 'sentiment' score as measured by AlchemyAPI.
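The sentiment-to-pitch mapping might look like the sketch below. It assumes AlchemyAPI's sentiment score lies in [-1, 1]; the pitch range (A3 to A5 in Hz) is our illustrative choice, not a value from the project:

```javascript
// Map a sentiment score in [-1, 1] linearly onto a pitch range in Hz.
// The bounds 220 Hz and 880 Hz are illustrative defaults.
function sentimentToPitch(score, minHz = 220, maxHz = 880) {
  const t = (score + 1) / 2; // normalise score to [0, 1]
  return minHz + t * (maxHz - minHz);
}
```

A strongly negative dream would then sound near 220 Hz, a neutral one at 550 Hz, and a strongly positive one near 880 Hz.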

We've designed an algorithm to generate a 'collective dream' in real time (shown at the centre of the diagram), which is also analysed using AlchemyAPI. Using the stored tags, we create a JSON file representing node-link relationships between all of the dreams. This data file is then used to drive the visualization.
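The exact schema of that JSON file isn't given above, but a node-link structure of the kind D3's force layout consumes might be assembled like this. Field names and values here are illustrative assumptions:

```javascript
// Assemble a hypothetical node-link graph: dreams and tags as nodes,
// shared-tag relationships as links, then serialise it for the client.
const graph = {
  nodes: [
    { id: 'dream1', type: 'dream', color: 'blue', sentiment: 0.4 },
    { id: 'tag:water', type: 'tag' },
  ],
  links: [
    { source: 'dream1', target: 'tag:water' },
  ],
};

const json = JSON.stringify(graph, null, 2);
```

On the client side, the parsed `nodes` and `links` arrays can be handed directly to a D3 force layout.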

The visualization itself is a force-directed layout built with the D3.js JavaScript library, and the image export feature leverages the canvg library. Other JavaScript libraries we use include jQuery, jQuery UI, and tipsy.