Investment prototype fake API
This project provides a lightweight JSON API over a PostgreSQL database to allow the Data Hub investment prototype to store and search data. For more details on the frontend, go here.
You can see the API in action here.
The project is a Node/Express application that uses Knex.js to read and write data to a PostgreSQL database and searches for data in an Elasticsearch index. As such, the application requires Node, PostgreSQL and Elasticsearch.
If you don't want to install all the various components you can simply use docker-compose to bring up the service, and then use a command such as `docker-compose exec <service> bash` to enter a container and run migrations etc.
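For example, once the stack is up, a sketch of entering the API container and applying migrations (the service name `api` and the container working directory are assumptions; check docker-compose.yml for the real values):

```bash
# enter the running API server container (service name "api" is an assumption)
docker-compose exec api bash

# inside the container, apply the latest migrations and load the seed data
./node_modules/.bin/knex migrate:latest
./node_modules/.bin/knex seed:run
```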
To start the backend using docker-compose, make sure you have Docker and docker-compose installed and then simply:
docker-compose up
This will build and start the required containers.
You can now access the API via http://localhost:3010/
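As a quick connectivity check (the actual endpoints depend on the API's routes, so this only confirms the server is answering):

```bash
# simple check that the API server is responding
curl -i http://localhost:3010/
```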
Note: if you make changes to the DB migration scripts or seed data files you will either need to go into the API server container and use knex to apply them, or destroy the docker-compose containers and start them up again.
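A sketch of the destroy-and-recreate option (the `-v` flag also removes the data volumes, which assumes you want a completely fresh database):

```bash
# tear down the containers (and their volumes), then rebuild and restart the stack
docker-compose down -v
docker-compose up --build
```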
If you wish to bring up the entire stack, including the frontend component, you should check out the frontend project into the same parent folder as the backend and then start the stack from that folder, using the instructions in the frontend README.
If you have Node installed locally and want to run it that way then you must make sure you have an Elasticsearch instance and a PostgreSQL database running.
Clone the repository:
git clone https://github.com/UKTradeInvestment/data-hub-invest-fe
Install node dependencies:
npm install
Run the server in either production mode or development mode.
Production mode builds static assets and runs the server using Node:
npm run build
npm start
Develop mode watches for changes and rebuilds Sass or compiles JS using webpack as needed. Changes to server-side code will cause the server to restart automatically. The server runs with the Node debug flag so you can debug with WebStorm or Visual Studio Code:
npm run develop
Examples of how to run the required Docker containers without using docker-compose:
docker run --name zorg-postgres -p 5432:5432 -e POSTGRES_DB=investment -e POSTGRES_USER=datahub -e POSTGRES_PASSWORD=password -d postgres
docker run --name zorg-redis -p 6379:6379 -d redis
docker run --name zorg-elasticsearch -d -p 9200:9200 -p 9300:9300 jeanberu/elasticsearch-head
Run the following command if you want to restart the containers (for instance, after restarting your machine):
docker restart zorg-postgres zorg-redis zorg-elasticsearch
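To confirm the containers came back up, you can filter the running containers by the names used above:

```bash
# list running containers whose names contain "zorg"
docker ps --filter "name=zorg"
```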
After installing or restarting the containers you will need to run the following knex commands (if you don't have knex installed globally, you may need to run it as `./node_modules/.bin/knex`):
knex migrate:latest
followed by:
knex seed:run
You should now have data for running your local instance of the project.
You must provide some basic configuration to the server via environment variables. Set the following variables to tell the server how to connect to the database and search index.
Name | Description |
---|---|
DATABASE_URL | A valid database URL for the PostgreSQL DB, e.g. postgres://user:password@hostname:5432/dbname |
BONSAI_URL | A URL to your Elasticsearch instance, e.g. http://localhost:9200/ |
ESINDEX | The name to use for the Elasticsearch index; defaults to 'datahub' if not provided |
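For a local setup using the example containers above, the configuration might look like this (values taken from the example docker run commands and the stated default; adjust them to your own setup):

```bash
# connection details matching the example zorg-postgres container above
export DATABASE_URL=postgres://datahub:password@localhost:5432/investment
# local Elasticsearch instance
export BONSAI_URL=http://localhost:9200/
# index name; 'datahub' is also the default if unset
export ESINDEX=datahub
```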
The package.json file includes a number of useful scripts for other tasks, such as linting the code and running the tests.
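Assuming conventional script names (an assumption; check the scripts section of package.json for the real ones), they are run in the usual npm way:

```bash
# hypothetical script names; check package.json for the actual ones
npm run lint
npm test
```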
When working on a new feature the convention is to follow
GitHub Flow.
Branch from master and work on changes in your branch. Once you are happy the feature is ready, make sure you have linted the code and run the tests. Use an interactive rebase to make sure your commits don't contain extraneous entries (such as 'wip'), then create a pull request. The pull request title should briefly say what the change is, and the description should explain how you made the change and why you chose to do it that way.
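A minimal sketch of that workflow (the branch name is illustrative):

```bash
# branch from an up-to-date master
git checkout master
git pull origin master
git checkout -b my-feature-branch

# ...commit your work, lint and run the tests...

# tidy up the history (squash any "wip" commits) before opening the pull request
git rebase -i master
git push origin my-feature-branch
```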
Once a pull request is made it will be tested using CircleCI and, if successful, deployed to a Heroku instance. Links to the CircleCI build and the deployed address will be shown in the GitHub pull request.
When a pull request is approved it can be merged to master.
All changes merged to master are automatically deployed to Heroku and almost instantly available here.