Modelling study

The aim of this study is to produce a website simulation in Eden. It will be a tool for modelling a website's structure and the characteristics of individual pages within the website. It will be able to simulate people or bots visiting the website and show how changes to the site affect their behaviour.

The website will be represented visually as a graph where nodes represent pages and directed edges between them indicate the presence of a link from one page to another. The user will be able to select individual nodes (i.e. pages) with the mouse and modify their properties via various input fields. Page properties will include links to other pages, version and style of HTML, presence of JavaScript or CSS, file size and more.
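The node-and-edge representation described above can be sketched as a simple data structure. The sketch below is in Python rather than Eden, and the specific property names (`html_version`, `has_javascript`, and so on) are illustrative assumptions drawn from the list of page properties, not the study's actual definitions.

```python
from dataclasses import dataclass, field

@dataclass
class Page:
    """A node in the website graph; `links` holds the names of pages
    this page links to (the outgoing directed edges)."""
    name: str
    html_version: str = "HTML 4.01"     # assumed example property
    has_javascript: bool = False
    has_css: bool = False
    file_size_kb: int = 10
    links: list = field(default_factory=list)

def add_link(pages, src, dst):
    """Add a directed edge: a link from page `src` to page `dst`."""
    if dst not in pages[src].links:
        pages[src].links.append(dst)

# Build a tiny three-page site.
pages = {name: Page(name) for name in ("home", "about", "contact")}
add_link(pages, "home", "about")
add_link(pages, "home", "contact")
add_link(pages, "about", "home")
```

In the actual tool these properties would be observables that the user edits through input fields after selecting a node with the mouse.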

Users will be able to build a model of a website interactively via a point-and-click interface. Whether they base their model on an existing website or invent a fictional one is up to them. There are no plans to add an import facility that would build a representation of a site automatically from a website or its web server log files, but such features could in theory be added through additional Eden scripts.

Once the model of the website has been built, visitors and bots can be simulated on it. These agents will visit a page and, based on the page's properties, may choose to follow a link to another page or leave. The visits that each page receives will be counted and displayed to the user. The hope is that, using the simulation, the user can easily identify weak points of the website (e.g. pages that receive few or no visits). If weak points are present, the user may attempt to alleviate them by tweaking the properties of pages that link to them and re-running the simulation to observe the effect of the changes.
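The agent behaviour described above amounts to a random walk over the page graph with a per-page chance of leaving. The following is a hedged sketch of that idea in Python: the leave probability here is a fixed placeholder, whereas in the real model it would be derived from the page's properties (file size, presence of JavaScript, and so on).

```python
import random

def simulate_visit(pages, start, leave_prob, visit_counts, rng):
    """Walk one agent through the site, incrementing visit_counts.

    pages:       dict mapping page name -> list of linked page names
    leave_prob:  function page name -> probability of leaving after the visit
    """
    current = start
    while current is not None:
        visit_counts[current] = visit_counts.get(current, 0) + 1
        links = pages[current]
        if not links or rng.random() < leave_prob(current):
            current = None              # agent leaves the site
        else:
            current = rng.choice(links) # agent follows a random link

# Simulate 1000 visitors on a tiny site; a constant 40% leave chance
# stands in for the property-derived probability.
pages = {"home": ["about", "contact"], "about": ["home"], "contact": []}
counts = {}
rng = random.Random(42)
for _ in range(1000):
    simulate_visit(pages, "home", lambda page: 0.4, counts, rng)
```

After such a run, pages whose counts are low relative to the rest are exactly the "weak points" the tool would highlight; the user would then adjust the linking pages and re-run.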

If the model is based on a real website, it is hoped that it can provide the user with an understanding of why parts of the website are under-performing and, through experimentation with the model, perhaps suggest some solutions that can be applied to the real website.