//How to install 3d Brain Atlas Reconstructor on Ubuntu?//

= 3d Brain Atlas Reconstructor Installation (Ubuntu) =

----
**Note**: This procedure is valid for //Ubuntu 9.10//, //Ubuntu 10.04 LTS//, //Ubuntu 10.10// and //Ubuntu 11.04//. Installation on other Ubuntu versions or other Linux distributions is similar, but the package versions may differ slightly.
----

[[PageOutline(2-4,, inline)]]

== Installing required packages ==

**Installation in Ubuntu 9.10**

1. Install the Visualization Toolkit and other graphics libraries:
{{{
sudo apt-get install \
libvtk5.2 libvtk5-dev libvtk5.2-qt4 libvtk5-qt4-dev \
tk8.5 tk8.5-dev \
python-vtk libgtkgl2.0-1 libgtkgl2.0-dev libgtkglext1 librsvg2-2 python-nifti
}}}
2. Install Python-related packages:
{{{
sudo apt-get install \
python-gtkglext1 python-rsvg python-opengl python-numpy python-scipy python-wxgtk2.6
}}}
3. Install the remaining packages:
{{{
sudo apt-get install \
potrace pstoedit python-setuptools python-epydoc
}}}

If you are a developer, you may also want to install optional packages with documentation:
{{{
sudo apt-get install vtkdata vtk-doc vtk-examples
}}}

**Installation in Ubuntu 10.04**

Install the following packages:
{{{
sudo apt-get install \
libvtk5.2 libvtk5-dev libvtk5.2-qt4 libvtk5-qt4-dev \
tk8.5 tk8.5-dev \
python-vtk libgtkgl2.0-1 libgtkgl2.0-dev libgtkglext1 librsvg2-2 python-nifti
}}}
{{{
sudo apt-get install \
python-gtkglext1 python-rsvg python-opengl python-numpy python-scipy python-wxgtk2.6
}}}
{{{
sudo apt-get install \
potrace pstoedit python-setuptools python-epydoc
}}}

**Installation in Ubuntu 10.10 and Ubuntu 11.04**

Install the following packages:
{{{
sudo apt-get install \
libvtk5.4 libvtk5-dev libvtk5.4-qt4 libvtk5-qt4-dev \
tk8.5 tk8.5-dev \
python-vtk libgtkgl2.0-1 libgtkgl2.0-dev libgtkglext1 librsvg2-2 python-nifti
}}}
{{{
sudo apt-get install \
python-gtkglext1 python-rsvg python-opengl python-numpy python-scipy python-wxgtk2.8
}}}
{{{
sudo apt-get install \
potrace pstoedit python-setuptools python-epydoc
}}}

Once all the packages are installed, it is time to set up the directory structure.

== Getting the code ==

It is assumed that the main directory dedicated to 3dBAR is `/home/$USERNAME/3dbar`. If you want to install it in another directory, replace `3dbar` with the desired path.

To get the latest stable version of 3dBAR, fill out [http://service.3dbar.org/downloadForm the following form], then download 3dBAR using the link provided via email. Unzip the file to your home directory and go to the 3dBAR directory:
{{{
mkdir ~/3dbar; unzip 3dbar_latest.zip -d ~/3dbar ; cd ~/3dbar;
}}}

The created directories have the following purposes:
 * **bin**: Holds all executable files, atlas parsers and auxiliary scripts
 * **lib**: Holds the 3dBAR API
 * **atlases**: Directory where the source data, //CAF datasets// and reconstructed models are stored. Each dataset (denoted as DATASET_NAME) contains the following subdirectories:
   * atlases/DATASET_NAME/src: Location of the source data. Depending on the particular parser, it is either placed there manually by the user or downloaded from the Internet.
   * atlases/DATASET_NAME/caf: The directory where a CAF dataset is generated by the specific parser.
   * atlases/DATASET_NAME/reconstructions: The directory for reconstructed models.
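As an optional sanity check, you can get a quick overview of the unpacked tree by listing its top-level directories. This is only a convenience sketch; the exact output depends on which datasets are included in your download:
{{{
# List the top two levels of the 3dBAR directory tree (optional check)
find ~/3dbar -maxdepth 2 -type d | sort
}}}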
== Initial build ==

=== Documentation ===

To generate the documentation, execute:
{{{
make doc
}}}
The API documentation can then be viewed by opening '~/3dbar/doc/api/html/index.html'; the documentation for the 3dBAR graphical interface is generated in the corresponding subdirectory of '~/3dbar/doc/'.

=== CAF datasets ===

==== sba_DB08 ====

To generate the CAF dataset sba_DB08, execute:
{{{
source setbarenv.sh
make sba_DB08
}}}
The generated dataset can be found in the '~/3dbar/atlases/sba_DB08/caf/' directory.

==== sba_PHT00 ====

To generate the CAF dataset sba_PHT00, execute:
{{{
source setbarenv.sh
make sba_PHT00
}}}
The generated dataset can be found in the '~/3dbar/atlases/sba_PHT00/caf/' directory.

==== sba_WHS09 ====

To generate the CAF dataset sba_WHS09, execute:
{{{
source setbarenv.sh
make sba_WHS09
}}}
The generated dataset can be found in the '~/3dbar/atlases/sba_WHS09/caf/' directory.

==== sba_WHS10 ====

To generate the CAF dataset sba_WHS10, execute:
{{{
source setbarenv.sh
make sba_WHS10
}}}
The generated dataset can be found in the '~/3dbar/atlases/sba_WHS10/caf/' directory.

==== sba_RM_on_F99 ====

To generate the CAF dataset sba_RM_on_F99, execute:
{{{
source setbarenv.sh
make sba_RM_on_F99
}}}
The generated dataset can be found in the '~/3dbar/atlases/sba_RM_on_F99/caf/' directory.

==== sba_LPBA40_on_SRI24 ====

To generate the CAF dataset sba_LPBA40_on_SRI24, execute:
{{{
source setbarenv.sh
make sba_LPBA40_on_SRI24
}}}
The generated dataset can be found in the '~/3dbar/atlases/sba_LPBA40_on_SRI24/caf/' directory.

==== whs_0.5 ====

To generate the CAF dataset whs_0.5, execute:
{{{
source setbarenv.sh
make whs_0.5
}}}
The generated dataset can be found in the '~/3dbar/atlases/whs_0.5/caf/' directory.

==== whs_0.51 ====

To generate the CAF dataset whs_0.51, execute:
{{{
source setbarenv.sh
make whs_0.51
}}}
The generated dataset can be found in the '~/3dbar/atlases/whs_0.51/caf/' directory.

== Testing the 3dBAR GUI ==

Once a CAF dataset has been created, you can test the GUI for structure creation. To do so, run the following in the '~/3dbar' directory:
{{{
./3dbar.sh
}}}
then choose //Atlas/Open// from the menu and select the 'index.xml' file of the chosen CAF dataset. To test the reconstruction, click the topmost label of the tree in the left panel and press the 'Perform reconstruction' button in the right panel. The reconstruction process will start. When it is finished, choose //Edit/Save Model// from the menu; this allows you to put the model in context later by right-clicking on the ontology tree.

{{{#!comment
== Initial build ==

To create the initial CAF datasets and generate the documentation, use the following command in the /home/$USERNAME/3dbar/ directory:
{{{
make -B -j N all
}}}
where N is the number of parallel processes you want to use. If everything is installed correctly, processing should finish without any errors. Then the 3dBAR GUI should be launched
{{{
./3dbar.sh
}}}
and used to perform reconstructions. If everything went fine, you may proceed to:

== Getting parsers for additional datasets ==
}}}
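If you are unsure which CAF datasets have already been generated (and therefore which 'index.xml' files can be opened in the GUI), you can list them from the shell. The snippet below is only a convenience check and assumes the default '~/3dbar' location used throughout this page:
{{{
# Each generated CAF dataset provides an index.xml file in its caf/ directory
ls ~/3dbar/atlases/*/caf/index.xml 2> /dev/null || echo "No CAF datasets generated yet"
}}}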