Setting up Galaxy 15.10 or later
These instructions have been tested with Galaxy versions 15.10 and 16.01.
- Install and configure a PostgreSQL server.
- Clone the `galaxy` repository that is currently in use by CloudMan. Make sure to check out the right release (see https://wiki.galaxyproject.org/Develop/SourceCode):

  ```
  git clone https://github.com/galaxyproject/galaxy
  cd galaxy
  git checkout release_16.01
  ```
### Note:
To update an existing Galaxy instance, simply `git checkout release_XX.XX` and then run `sh manage_db.sh upgrade`. CloudMan currently (Mar 28, 2016) requires release_16.01.
- Create a PostgreSQL database for use with Galaxy:

  ```
  $ createdb galaxy
  ```
- Create a user account for the `galaxy` database:

  ```
  $ psql galaxy
  CREATE USER galaxy WITH PASSWORD 'galaxy';
  GRANT ALL PRIVILEGES ON DATABASE "galaxy" to galaxy;
  ```
As of commit https://github.com/parklab/refinery-platform/commit/d9d6d81b8d834daff5763d9392fe6f85d2be0f25 there is a setting, `REFINERY_GALAXY_ANALYSIS_CLEANUP`, that allows you to specify how and when Galaxy libraries, histories, and workflows should be deleted or purged. The default option is `"ON_SUCCESS"`, which performs deletion upon a successful analysis. The other available options are `"ALWAYS"` and `"NEVER"`.
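In Refinery's Django settings this might look like the following sketch (the exact settings module name depends on your deployment; the value shown is the documented default):

```
# Delete Galaxy libraries, histories, and workflows only after a
# successful analysis (the default). Other values: "ALWAYS", "NEVER".
REFINERY_GALAXY_ANALYSIS_CLEANUP = "ON_SUCCESS"
```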
The following steps assume that you are in your Galaxy root directory.
- Create a directory named `tool-dependencies`:

  ```
  $ mkdir tool-dependencies
  ```
- Copy the sample configs:

  ```
  cp config/galaxy.ini.sample config/galaxy.ini
  cp config/tool_sheds_conf.xml.sample config/tool_sheds_conf.xml
  ```
- Open `galaxy.ini` in a text editor.
- Locate `database_connection`, uncomment the line and set `database_connection = postgres://galaxy:galaxy@localhost:5432/galaxy`.
- Locate `tool_dependency_dir`, uncomment the line and set `tool_dependency_dir = <BASEPATH>/galaxy/tool-dependencies`, where `<BASEPATH>` is the absolute path to your `galaxy` directory.
- Locate `host`, uncomment the line and set `host = 0.0.0.0`.
- If purging of histories is desired, locate `allow_user_dataset_purge`, uncomment the line and set `allow_user_dataset_purge = True`.
- Optional, to help with debugging job failures: set `cleanup_job = onsuccess`.
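The uncomment-and-set edits above can also be scripted with `sed`. The sketch below operates on a tiny stand-in for `galaxy.ini` so it can run anywhere; in a real checkout, point `ini` at `config/galaxy.ini` instead, and replace the example `tool_dependency_dir` path with your own absolute path.

```shell
# Create a small stand-in file with the commented-out defaults
# (in a real checkout, use: ini=config/galaxy.ini).
ini=$(mktemp)
cat > "$ini" <<'EOF'
#database_connection = sqlite:///./database/universe.sqlite?isolation_level=IMMEDIATE
#tool_dependency_dir = None
#host = 127.0.0.1
#allow_user_dataset_purge = False
EOF

# Uncomment each key and set the value from the steps above.
# /home/user/galaxy is an example <BASEPATH>; substitute your own.
sed -i \
  -e 's|^#database_connection.*|database_connection = postgres://galaxy:galaxy@localhost:5432/galaxy|' \
  -e 's|^#tool_dependency_dir.*|tool_dependency_dir = /home/user/galaxy/tool-dependencies|' \
  -e 's|^#host.*|host = 0.0.0.0|' \
  -e 's|^#allow_user_dataset_purge.*|allow_user_dataset_purge = True|' \
  "$ini"

cat "$ini"
```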
- Run `sh run.sh` in the root directory of the `galaxy` repository.
- Go to http://127.0.0.1:8080 and create a new Galaxy user account using `admin@example.com` as the email address and `test123` as the password.
- Go back to `galaxy.ini`, locate `admin_users`, uncomment the line and set `admin_users = admin@example.com`.
- Restart Galaxy.
- Log into your Galaxy instance with your admin user account and create an API Key in the User menu.
- Uncomment the test tool shed in `config/tool_sheds_conf.xml.sample`.
- Copy `config/tool_sheds_conf.xml.sample` to `config/tool_sheds_conf.xml`.
- Open http://127.0.0.1:8080 in your web browser.
- Log in with your user name and password.
- In the top menu bar click `Admin -> Search Tool Shed -> Galaxy test tool shed` (see https://wiki.galaxyproject.org/Tool%20Shed).
- Search for `refinery` and click `refinery_test -> Preview and install -> Install to Galaxy`.
- Optionally specify a new category, e.g. `Refinery Platform`, and hit `Install`.
- Click `Admin` in the top menu bar, then `Manage installed tools`.
- Click on `refinery_test`, then proceed to the test workflows and install each one by clicking `Repository Actions -> Import workflow to Galaxy`.
- When you switch back to the home view of Galaxy, you should now see your installed tools in the left navigation bar under their category, as well as the imported workflows under `Workflows`.
These instructions assume that you know how to create environment variables on your OS.
- Optional: Create an environment variable called `REFINERY_VM_TRANSFER_DIR` and set it to a directory of your choice. This directory will be mapped into the VM and can be used to transfer data between the host and the VM and vice versa, e.g. for bulk import of datasets residing on the host into your Refinery instance.
- Restart your VM with `vagrant reload` to make the above changes effective.
Pro tip: If you don't want to pollute your global environment, you can use virtualenv's postactivate hook (`$VIRTUAL_ENV/bin/postactivate`).
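For instance, the postactivate hook could export the variable like this (a sketch; the directory path is just an example):

```shell
# $VIRTUAL_ENV/bin/postactivate -- runs each time the virtualenv is
# activated. The transfer directory below is an example path.
export REFINERY_VM_TRANSFER_DIR="$HOME/refinery-transfer"
```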
NOTE: You will not need port `8080` as shown in these instructions if you are configuring this for a CloudMan instance; `http://<master_node_private_ip>` will suffice.
- Log into the Refinery admin interface and create a new `Instance` in the `galaxy_connector` app:

  ```
  Base URL:    http://192.168.50.1:8080
  Data url:    datasets
  Api url:     api
  Api key:     <APIKEY>
  Description: admin account
  ```
Troubleshooting: If you have problems importing workflows using `./manage.py import_workflows` because the importer doesn't seem to be able to connect to Galaxy, your gateway's address might be wrong. To find out the actual address of your gateway, run `netstat -rn`. This will print something like this:

```
Kernel IP routing table
Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface
0.0.0.0         10.0.2.2        0.0.0.0         UG        0 0          0 eth0
10.0.2.0        0.0.0.0         255.255.255.0   U         0 0          0 eth0
192.168.50.0    0.0.0.0         255.255.255.0   U         0 0          0 eth1
```

Look for the entry with the destination `0.0.0.0` and copy the gateway address. Next, edit your `galaxy_connector` instance's base URL, for example changing http://192.168.50.1:8080 to http://10.0.2.2:8080.
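The gateway lookup above can be done in one step with `awk`. For illustration, this sketch parses a copy of the sample routing table; in practice, pipe `netstat -rn` straight into the `awk` command.

```shell
# Sample `netstat -rn` output (stand-in for the live command).
routing_table='Kernel IP routing table
Destination     Gateway         Genmask         Flags   MSS Window  irtt Iface
0.0.0.0         10.0.2.2        0.0.0.0         UG        0 0          0 eth0
192.168.50.0    0.0.0.0         255.255.255.0   U         0 0          0 eth1'

# Print the Gateway column of the default (0.0.0.0) route.
gateway=$(printf '%s\n' "$routing_table" | awk '$1 == "0.0.0.0" { print $2; exit }')
echo "$gateway"   # → 10.0.2.2
```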
Before you can import workflows from a Galaxy installation into Refinery, the following requirements have to be met:

- You have to add a Galaxy Instance for the Galaxy installation in question to Refinery through the admin UI.
- You have to create a Workflow Engine for this Galaxy Instance using the `create_workflowengine` command, which requires a Galaxy Instance id and the name of a group that should own the workflow engine, e.g. "Public". Galaxy Instance ids can be found at http://192.168.50.50:8000/admin/galaxy_connector/instance/

  ```
  $ manage.py create_workflowengine <instance_id> "<group_name>"
  ```

Alternatively, you can create a workflow engine through the admin UI; in that case, however, you have to manually assign ownership to the managers of the group that should own the workflow engine.

- Now you are ready to import workflows into Refinery and execute them on data stored in the Refinery data repository.
- You will need to annotate or import your own workflows before being able to run an analysis within Refinery.