Commit b9ecb74: Initial commit

DavisRayM committed Apr 23, 2021 (0 parents)

Showing 66 changed files with 4,098 additions and 0 deletions.
4 changes: 4 additions & 0 deletions .coveragerc
@@ -0,0 +1,4 @@
[run]
omit =
    # Omit any tests
    */tests*
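
For reference, a test run that picks up this configuration could look like the following (a sketch; it assumes `coverage` and `pytest` are installed and that tests live under `app/tests`, as the README later suggests):

```sh
# Run the suite under coverage; .coveragerc excludes */tests* from the report
coverage run -m pytest app/tests
coverage report
```
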
7 changes: 7 additions & 0 deletions .flake8
@@ -0,0 +1,7 @@
[flake8]
extend-ignore = E203, E266, E501
# line length is intentionally set to 80 here because black uses Bugbear
# See https://github.com/psf/black/blob/master/docs/the_black_code_style.md#line-length for more details
max-line-length = 80
max-complexity = 18
select = B,C,E,F,W,T4,B9
48 changes: 48 additions & 0 deletions .github/workflows/docker-hub-image-build.yml
@@ -0,0 +1,48 @@
name: Build image for Docker Hub

on:
  release:
    types:
      - "released"

jobs:
  main:
    runs-on: ubuntu-20.04
    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: Set up QEMU
        uses: docker/setup-qemu-action@v1

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v1

      - name: Get the version
        id: get_version
        run: echo ::set-output name=VERSION::${GITHUB_REF/refs\/tags\//}

      - name: Login to DockerHub
        uses: docker/login-action@v1
        with:
          username: ${{ secrets.DOCKER_USERNAME }}
          password: ${{ secrets.DOCKER_TOKEN }}

      - name: Build and push
        id: docker_build
        uses: docker/build-push-action@v2
        with:
          context: .
          file: ./Dockerfile.prod
          platforms: linux/amd64
          build-args: |
            release_version=${{ steps.get_version.outputs.VERSION }}
          push: true
          cache-from: type=local,src=/tmp/.buildx-cache
          cache-to: type=local,dest=/tmp/.buildx-cache
          tags: |
            onaio/duva:latest
            onaio/duva:${{ steps.get_version.outputs.VERSION }}

      - name: Image digest
        run: echo ${{ steps.docker_build.outputs.digest }}
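
Since the workflow above only fires on a published release, a release that triggers it can be created with the GitHub CLI (a sketch; assumes the `gh` CLI is installed and authenticated, and the tag name is illustrative):

```sh
# Tag and publish a release; the "Get the version" step derives VERSION from the tag ref
git tag v0.0.1
git push origin v0.0.1
gh release create v0.0.1 --title "v0.0.1" --notes "Initial release"
```
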
10 changes: 10 additions & 0 deletions .gitignore
@@ -0,0 +1,10 @@
__pycache__/
media/*
.vscode/
hyperd*.log
.DS_Store
*.db
MANIFEST
.tox/
.pytest_cache/
.coverage
9 changes: 9 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,9 @@
repos:
  - repo: https://github.com/psf/black
    rev: 20.8b1
    hooks:
      - id: black
  - repo: https://gitlab.com/pycqa/flake8
    rev: 3.8.4
    hooks:
      - id: flake8
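
With this configuration in place, the hooks can be installed and exercised locally (a sketch; assumes `pre-commit` is available in the active environment):

```sh
pip install pre-commit
pre-commit install          # run black and flake8 automatically on each commit
pre-commit run --all-files  # or lint the whole tree once
```
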
19 changes: 19 additions & 0 deletions .travis.yml
@@ -0,0 +1,19 @@
sudo: required
dist: focal
language: python
jobs:
  include:
    - python: 3.7
      env: TOXENV=py37
    - python: 3.8
      env: TOXENV=py38
    - python: 3.7
      env: TOXENV=lint
services:
  - redis-server
install:
  - pip install -U pip
  - pip install tox
script: tox
notifications:
  slack: onaio:snkNXgprD498qQv4DgRREKJF
6 changes: 6 additions & 0 deletions Dockerfile
@@ -0,0 +1,6 @@
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.7

RUN mkdir -p /root/.aws
COPY . /app

RUN mkdir -p /app/media && pip install --no-cache-dir -r /app/requirements.pip
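
As a rough usage sketch (the image name and port mapping are assumptions; the base image serves the app on port 80 by default), the development image can be built and run with:

```sh
docker build -t duva:dev .
docker run --rm -p 8000:80 duva:dev
```

In practice the `docker-compose.yml` mentioned in the README below is the easier path, since the application also expects a Redis server.
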
20 changes: 20 additions & 0 deletions Dockerfile.prod
@@ -0,0 +1,20 @@
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.7
ARG release_version=v0.0.1

# Create application user
RUN useradd -m duva

# Create directory for AWS Configurations
RUN mkdir -p /home/duva/.aws

# Clone Duva application source code
RUN git clone -b ${release_version} https://github.com/onaio/duva.git /app-cloned &&\
    mv -f /app-cloned/* /app &&\
    chown -R duva:duva /app

# Install application requirements
RUN pip install --no-cache-dir -U pip && pip install --no-cache-dir -r /app/requirements.pip

EXPOSE 8000

CMD ["/start.sh"]
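
This image mirrors what the Docker Hub workflow above builds; built manually, the invocation might look like this (a sketch; the version tag is illustrative):

```sh
# Build a production image for a given released version of Duva
docker build -f Dockerfile.prod \
  --build-arg release_version=v0.0.1 \
  -t onaio/duva:v0.0.1 .
```
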
86 changes: 86 additions & 0 deletions README.md
@@ -0,0 +1,86 @@
# Duva

[![Build Status](https://travis-ci.com/onaio/duva.svg?branch=main)](https://travis-ci.com/github/onaio/duva)

Duva is an API built using the [FastAPI](https://github.com/tiangolo/fastapi) framework that provides functionality to create & periodically update Tableau [Hyper](https://www.tableau.com/products/new-features/hyper) databases from CSV files. Currently, the application supports connecting to an [OnaData](https://github.com/onaio/onadata) server, from which it pulls data from an XLSForm and periodically exports it to a Tableau Hyper database.

## Requirements

- Python 3.6+
- Redis

## Installation

### Via Docker

The application comes with a `docker-compose.yml` file to facilitate easier installation of the project. _Note: The `docker-compose.yml` file is tailored for development environments_

To start up the application via [Docker](https://www.docker.com/products/docker-desktop) run the `docker-compose up` command.

### Alternative Installation

1. Clone repository

```sh
$ git clone https://github.com/onaio/duva.git
```

2. Create and activate [a virtual environment](https://virtualenv.pypa.io/en/latest/installation.html) in which to install the dependencies

```sh
$ virtualenv duva
$ source duva/bin/activate
```

3. Install base dependencies

```sh
$ pip install -r requirements.pip
```

4. (Optional: For developer environments) Install development dependencies.

```sh
$ pip install -r dev-requirements.pip
```

At this point the application can be started. _Note: Ensure the Redis server is running._

```sh
$ ./scripts/start.sh
```

## Configuration

The application can be configured either by manually editing the `app/settings.py` file or via environment variables, e.g. `export APP_NAME="Duva"`. More information on this is available [here](https://fastapi.tiangolo.com/advanced/settings).
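
For illustration, here is a minimal sketch of how such a settings module typically looks with FastAPI's recommended pydantic `BaseSettings` pattern; the field names below, other than `app_name`, are assumptions rather than the project's actual settings:

```python
# Illustrative sketch of an app/settings.py module, not the project's actual file
from pydantic import BaseSettings


class Settings(BaseSettings):
    # Each field can be overridden by an environment variable,
    # e.g. `export APP_NAME="Duva"` overrides app_name.
    app_name: str = "Duva"
    redis_url: str = "redis://localhost:6379/0"  # assumed field name


settings = Settings()
```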

## API Documentation

Documentation on the API endpoints provided by the application can be viewed by running the application and visiting the `/docs` route.
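
Once the server is up, FastAPI also serves the raw OpenAPI schema at its standard route (the port below assumes the production image's `EXPOSE 8000`; adjust to your setup):

```sh
curl http://localhost:8000/openapi.json  # machine-readable schema
# Browse http://localhost:8000/docs for the interactive documentation
```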

## Testing

This project utilizes [tox](https://tox.readthedocs.io/en/latest/) for testing. To run the full test suite, run the following commands:

```sh
$ pip install tox
$ tox
```

Alternatively, if you'd like to test the application with only the Python version currently installed on your machine, follow these steps:

1. Install the developer dependencies

```sh
$ pip install -r dev-requirements.pip
```

2. Run the test suite using [pytest](https://docs.pytest.org/en/stable/)

```sh
$ ./scripts/run-tests.sh
```
OR
```sh
$ PYTHONPATH=. pytest -s app/tests
```
43 changes: 43 additions & 0 deletions RELEASE_NOTES.md
@@ -0,0 +1,43 @@
# Release Notes

All release notes for this project will be documented in this file; this project follows [Semantic Versioning](https://semver.org/).

## v0.0.1 - 2021-03-15

This is the first release :confetti_ball:.

Project Breakdown: Duva is a RESTful API that allows users to easily create and manage [Tableau Hyper](https://www.tableau.com/products/new-features/hyper) databases.

### Key Features as of v0.0.1:

- Supports automatic creation and updates of Hyper databases from an [OnaData](https://github.com/onaio/onadata) server; the application utilizes OnaData's export functionality to create and update the database.
- Supports creation of Hyper databases from a CSV File.

### Sample Flows:

#### One-off Hyper database creation from CSV File:

As mentioned above, the application supports one-off creation of a Hyper database from a CSV file; these databases are not updated after creation.

![one-off hyper database creation](./docs/flow-diagrams/one-off-hyper-database-flow.png)

This flow is ideal for one-off Hyper databases or for servers where automatic creation and updates are not supported. *NB: As of v0.0.1 the application only supports OnaData servers.*

#### Automatic creation and updates of Hyper Databases for OnaData servers

In order to use this flow with a desired server, a user first has to register a new `Server` object, which is used to authenticate the application and its users, allowing the application to pull data on the user's behalf on a scheduled basis and update the managed Hyper database.

Server registration flow (one-time flow for new servers):

![server registration flow](./docs/flow-diagrams/server-registration-flow.png)

After a new server is registered, users from that server are able to create managed Hyper database files.

![managed hyper database flow](./docs/flow-diagrams/managed-hyper-database-flow.png)

*During the creation of a managed Hyper database, users can specify a Tableau server to which the Hyper database should be published after every update. For more information on how to configure this, please view the API docs on a deployed instance of the application (`/docs`).*

### Known Limitations of v0.0.1

- The application currently uses session cookies to authenticate users; there are plans to phase out session cookies in favor of API tokens. For now, users may need to clear their cookies in order to log out.
82 changes: 82 additions & 0 deletions alembic.ini
@@ -0,0 +1,82 @@
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = app/alembic

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false

# set to 'true' to allow .pyc and .pyo files without
# a source .py file to be detected as revisions in the
# versions/ directory
# sourceless = false

# version location specification; this defaults
# to app/alembic/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat app/alembic/versions

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
hooks = black
black.type = console_scripts
black.entrypoint = black
black.options = -l 79

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
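
Given the `script_location = app/alembic` setting above, migrations would typically be generated and applied with the standard Alembic CLI (a sketch; the revision message is illustrative):

```sh
# Generate a new revision; the black post-write hook formats the generated script
alembic revision --autogenerate -m "create initial tables"
# Apply all pending migrations
alembic upgrade head
```
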
Empty file added app/__init__.py
Empty file.
1 change: 1 addition & 0 deletions app/alembic/README
@@ -0,0 +1 @@
Generic single-database configuration.