NGSView runs under Linux (developed and tested on 32- and 64-bit Fedora, Ubuntu, Debian, openSUSE and CentOS platforms) and requires Qt (version 3) and Berkeley DB (version 4.3 or later). The latest stable version of Qt 3 can be downloaded from the Qt website.
If Berkeley DB is not present on the system and has to be built from source, the "--enable-cxx" flag must be passed to its configure script; this builds the C++ API that NGSView needs in order to use BDB.
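As a sketch (the version number and install prefix below are illustrative, not requirements), a source build of Berkeley DB with the C++ API enabled looks like this:

```shell
# Unpack and build Berkeley DB from source; substitute the release
# you actually downloaded for the example version below.
tar -xzf db-4.7.25.tar.gz
cd db-4.7.25/build_unix
../dist/configure --enable-cxx --prefix=/usr/local
make
make install   # usually requires root privileges
```

The C++ library (libdb_cxx) is only produced when --enable-cxx is given; a plain C-only build will not satisfy NGSView.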
On some systems, the X11 development headers must be installed before NGSView (and/or Qt 3) will compile. Here are some examples of installs that were recently (July 2009) required on some common systems:
Fedora / CentOS:
yum groupinstall "X Software Development"
yum install libXinerama-devel

Debian / Ubuntu:
apt-get install libx11-dev
apt-get install libxext-dev

openSUSE:
zypper install xorg-x11-devel
The speed and low memory usage of NGSView come at a price: additional time spent constructing the database when data is first loaded into the software. The next time the same project is opened for visualization, however, the alignments appear instantaneously, whereas non-DB based tools incur the same loading time and memory usage on every open. This is illustrated in the following table:
| ||NGSView||Tool A (non-DB)||Tool B (non-DB)|
|First open, time||59 min||12 min 30 sec||16 min|
|Second open, time||1 sec||12 min 30 sec||16 min|
|First open, RAM||5.1 GB*||16.8 GB||10.5 GB|
|Second open, RAM||185 MB**||16.8 GB||10.5 GB|
The above benchmarks were run on a Quad Core Xeon X7350 2.9GHz computer with 128GB RAM running CentOS 5, on a C. elegans dataset consisting of 19,566,095 Illumina reads (Chromosome V assembly, obtained from the EagleView website).
* Database cache size set to 4GB.
** Default database cache size used (100 MB).
Initial load time depends mainly on cache size, disk speed, and filesystem -- ext4 is much faster than ext3, which is why NGSView loads faster on a default Fedora 11 system (ext4 by default) than on e.g. a Kubuntu 9.04 system (ext3 by default). Large datasets can be loaded on systems with little RAM, but at the cost of longer load times. The Berkeley DB files are portable across platforms, so if loading speed is an issue, the data can be imported on a machine with more memory and the database files then transferred to a less powerful computer for visualization.
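For example (the host name and file locations below are hypothetical; consult the manual for the exact files NGSView creates for a project), the import-then-transfer workflow could look like:

```shell
# Import on a well-provisioned server first, then copy the resulting
# Berkeley DB files to the visualization workstation. The paths and
# file names here are illustrative, not NGSView's actual layout.
scp bigserver:/data/projects/celegans/*.db ~/ngsview-projects/celegans/
```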
Depending on the system and dataset size, some experimentation with the cache size may be necessary for optimal load times (see the manual for details on how to set the cache). When initially loading a new dataset, it is generally a good idea to set the cache as high as the system permits. On subsequent runs of the program on the same data, the default cache size (100 MB) or lower is usually sufficient.
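As general Berkeley DB background (whether NGSView opens its databases inside a BDB environment, and where that environment lives, are assumptions here -- the manual describes the supported way to set the cache), Berkeley DB itself reads a DB_CONFIG file from the environment's home directory:

```shell
# Create a DB_CONFIG file in the (hypothetical) project directory.
# set_cachesize takes gigabytes, bytes, and the number of cache segments;
# this requests a single 4 GB cache, as used in the benchmark above.
cat > /path/to/project/DB_CONFIG <<'EOF'
set_cachesize 4 0 1
EOF
```

The DB_CONFIG settings take effect the next time the database environment is opened, so no rebuild of the database is needed to change the cache size.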