elixir-luxembourg/data-catalog

Data Catalogue

A tool for advertising biomedical projects and their associated datasets.

The catalogue enhances visibility and accessibility of biomedical datasets, fostering collaboration and accelerating research. It supports data sharing, compliance, and reproducibility across the research ecosystem.

Developed through lean, user-centered design, the catalogue integrates community-accepted metadata models and supports diverse data types. It continues to grow through contributions from ELIXIR-LU and partner projects.

Instances

This software is behind the following instances:

Acknowledgement

Initially launched as the Translational Data Catalogue under IMI and H2020 initiatives, it was jointly developed through collaborations between ELIXIR Luxembourg, IMI-FAIRplus and IMI-eTRIKS. Over time, its development and support have evolved into a broader service now provided by ELIXIR Luxembourg.

License

The code is available under the AGPL-3.0 license.


Local installation

The local installation of a development environment and the procedure for the Docker version are described below.

Requirements

Python ≥ 3.10
Solr ≥ 8.2
npm ≥ 7.5.6

On Ubuntu, first install the required system libraries:

sudo apt-get install libsasl2-dev libldap2-dev libssl-dev

Procedure

  1. Install python requirements with:

    python -m pip install .
    
  2. Install the Less compiler, which is needed to generate the CSS files:

    sudo npm install less -g
    
  3. Create the settings.py file by copying the template:

    cp datacatalog/settings.py.template datacatalog/settings.py
    
  4. Modify the settings file (in the datacatalog folder) according to your local environment. The SECRET_KEY parameter needs to be filled with a random string. For maximum security, generate it with Python:

    python -c "import os; print(os.urandom(24))"
  5. Install the npm dependencies with:

    cd datacatalog/static/vendor
    npm ci
    npm run build
  6. Create a Solr core:

    $SOLR_INSTALLATION_FOLDER/bin/solr start
    $SOLR_INSTALLATION_FOLDER/bin/solr create_core -c datacatalog
  7. Back in the application folder, build the assets:

    flask assets build
    
  8. Initialize the solr schema:

    flask indexer init
    
  9. Index the provided studies, projects and datasets. For local development, change JSON_FILE_PATH from 'data/imi_projects' to 'tests/data/imi_projects_test' or use data from dats-elixir-files.

    flask import entities Dats study
    flask import entities Dats project
    flask import entities Dats dataset
    
  10. [Optional] Automatically generate the sitemap while indexing the datasets:

    flask import entities Dats study --sitemap
    flask import entities Dats project --sitemap
    flask import entities Dats dataset --sitemap
    
  11. Generate the sitemap:

    flask generate_sitemaps
    
  12. [Optional] Extend Index for studies, projects and datasets:

    flask indexer extend project
    flask indexer extend study
    flask indexer extend dataset
    
  13. [Optional] Drop connector entities - removes connector entities from solr:

    flask indexer drop_connector_entities Daisy dataset
    
  14. [Optional] Customize the About and Help pages to reflect your services.

  15. Run the development server:

    flask run
    

The application should now be available at http://localhost:5000.
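As a quick sanity check (assuming the development server from the last step is still running), the endpoint can be probed with curl:

```shell
# Hypothetical smoke test: curl exits non-zero if the dev server is not reachable
curl -fsS http://localhost:5000/ -o /dev/null && echo "Data Catalogue is up"
```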

Testing

To run the unit tests:

pytest --cov .

Note that a different Solr core is used for the tests and has to be created first. By default, it is expected to be called datacatalog_test.
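With a local Solr installation (the same $SOLR_INSTALLATION_FOLDER as in the installation steps above), the test core can be created the same way as the main one:

```shell
# Create the Solr core used by the test suite; "datacatalog_test"
# is the default name expected by the tests
$SOLR_INSTALLATION_FOLDER/bin/solr create_core -c datacatalog_test
```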

Docker-compose build

docker-compose makes it easy to manage all the components (Solr and the web server) required to run the application.

Requirements for docker-compose build

Docker and git must be installed.

Building

(local) and (web container) indicate context of execution.

  1. First, generate the certificates that will be used to enable HTTPS in the reverse proxy. To do so, change directory to docker/nginx/ and execute generate_keys.sh (relies on OpenSSL). If you don't plan to use HTTPS or just want to see a demo running, you can skip this step (warning: this will cause the HTTPS connection to be unsafe!).

  2. Then, copy datacatalog/settings.py.template to datacatalog/settings.py. Edit settings.py to set SECRET_KEY to a random string of characters. For maximum security, generate it with Python:

    python -c "import os; print(os.urandom(24))"

    Then build and start the Docker containers by running:

    (local) $ docker-compose up --build
    

    That will create a container with the datacatalog web application and a container for Solr (the data will be persisted between runs).

  3. Then, to create solr cores, execute in another console:

    (local) $ docker-compose exec solr solr create_core -c datacatalog
    (local) $ docker-compose exec solr solr create_core -c datacatalog_test
    
    
  4. Then, to load the data into Solr:

    (local) $ docker-compose exec web /bin/bash
    (web container) $ flask indexer init
    (web container) $ flask import entities Dats study
    (web container) $ flask import entities Dats project
    (web container) $ flask import entities Dats dataset
    
    (PRESS CTRL+D or type: "exit" to exit)
    
  5. The web application should now be available, with data loaded, via http://localhost and, over SSL, https://localhost (beware that most browsers display a warning for, or block, self-signed certificates).
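For reference, the kind of self-signed certificate that generate_keys.sh (step 1) produces can also be created by hand. This is a sketch assuming OpenSSL is installed; the file names are illustrative, so check generate_keys.sh for the ones nginx actually expects:

```shell
# Generate a self-signed certificate for localhost, valid for one year
# (nginx.key / nginx.crt are illustrative names chosen here)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=localhost" \
  -keyout nginx.key -out nginx.crt
```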

Maintenance of docker-compose

A Docker container keeps the application in the state it was in when the image was built. Therefore, if you change any files in the project, the container has to be rebuilt for the changes to show up:

docker-compose up --build

If you want to delete the Solr data, run the following (this removes all persisted data; you will have to redo solr create_core):

docker-compose down --volumes
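Putting the commands from the sections above together, a full reset looks like this (the -d flag, an assumption here, runs the containers in detached mode):

```shell
# Drop the containers and all persisted Solr data, then rebuild from scratch
docker-compose down --volumes
docker-compose up --build -d
# Recreate the cores (required after --volumes removed them)
docker-compose exec solr solr create_core -c datacatalog
docker-compose exec solr solr create_core -c datacatalog_test
```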

Modifying the datasets

The datasets, projects and studies are all defined in the files located in the data/imi_projects folder. Those files can be modified to add, delete or change those entities. After saving the files, rebuild and restart docker-compose:

Press CTRL+D to stop all the containers, then rebuild and restart them with:

docker-compose up --build

To reindex the entities:

(local) $ docker-compose exec web /bin/bash
(web container) $ flask import entities Dats study
(web container) $ flask import entities Dats project
(web container) $ flask import entities Dats dataset

(PRESS CTRL+D or type: "exit" to exit)

Single Docker deployment

In some cases, you might not want Solr and Nginx to run (for example, if there are multiple instances of the Data Catalogue running). In that case, simply use:

(local) $ docker build . -t "data-catalog"
(local) $ docker run --name data-catalog --entrypoint "gunicorn" -p 5000:5000 -t data-catalog -t 600 -w 2 datacatalog:app --bind 0.0.0.0:5000

Development

Install needed dependencies with:

pip install .[testing]

Configure pre-commit hook for black and flake8:
see https://dev.to/m1yag1/how-to-setup-your-project-with-pre-commit-black-and-flake8-183k
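The linked guide boils down to roughly the following commands (assuming a .pre-commit-config.yaml with black and flake8 hooks is present in the repository root):

```shell
pip install pre-commit
pre-commit install          # registers the hook in .git/hooks/pre-commit
pre-commit run --all-files  # run black and flake8 once over the whole tree
```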
