This repository was archived by the owner on Mar 12, 2025. It is now read-only.

First prod push #1

Merged
merged 10 commits into from
Aug 2, 2021
Changes from 1 commit
Initial code with permission model changes incorporated
Vikas Agarwal committed Aug 2, 2021
commit 6449d60e0cc2f70cacf9c8a2e2a0776d4fcd3d87
159 changes: 157 additions & 2 deletions README.md
100644 → 100755
@@ -1,2 +1,157 @@
# skills-api
V5 Skills API
# Skills API

* [Prerequisites](#prerequisites)
* [Configuration](#configuration)
* [Local deployment](#local-deployment)
* [Migrations](#migrations)
* [Local Deployment with Docker](#local-deployment-with-docker)
* [NPM Commands](#npm-commands)
* [JWT Authentication](#jwt-authentication)
* [Documentation](#documentation)

## Prerequisites

- Node.js 12.x+
- npm 6.x+
- Docker
- Elasticsearch 7.7+
- PostgreSQL

## Configuration

Configuration for the application is at `config/default.js` and `config/production.js`. The following parameters can be set in the config files or in environment variables (an example follows the list):

- LOG_LEVEL: the log level
- PORT: the server port
- AUTH_SECRET: TC Authentication secret
- VALID_ISSUERS: valid issuers for TC authentication
- PAGE_SIZE: the default pagination limit
- MAX_PAGE_SIZE: the maximum pagination size
- API_VERSION: the API version
- DB_NAME: the database name
- DB_USERNAME: the database username
- DB_PASSWORD: the database password
- DB_HOST: the database host
- DB_PORT: the database port
- ES_HOST: Elasticsearch host
- ES_REFRESH: whether Elasticsearch should refresh after write operations. Default is 'true'. Values can be 'true', 'wait_for', 'false'
- ELASTICCLOUD_ID: the Elastic Cloud id, if your Elasticsearch instance is hosted on Elastic Cloud. DO NOT provide a value for ES_HOST if you are using this
- ELASTICCLOUD_USERNAME: the Elastic Cloud username for basic authentication. Provide this only if your Elasticsearch instance is hosted on Elastic Cloud
- ELASTICCLOUD_PASSWORD: the Elastic Cloud password for basic authentication. Provide this only if your Elasticsearch instance is hosted on Elastic Cloud
- ES.DOCUMENTS: Elasticsearch index, type and id mapping for resources
- SKILL_INDEX: the Elasticsearch index for skills. Default is `skill`
- SKILL_ENRICH_POLICYNAME: the enrich policy for skills. Default is `skill-policy`
- TAXONOMY_INDEX: the Elasticsearch index for taxonomies. Default is `taxonomy`
- TAXONOMY_PIPELINE_ID: the pipeline id for enrichment with taxonomy. Default is `taxonomy-pipeline`
- TAXONOMY_ENRICH_POLICYNAME: the enrich policy for taxonomy. Default is `taxonomy-policy`
- MAX_BATCH_SIZE: restricts the number of records held in memory during bulk insert (used by the db-to-ES migration script)
- MAX_BULK_SIZE: the maximum bulk indexing size. Default is `100` (used by the db-to-ES migration script)
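
For example, the database and Elasticsearch settings can be overridden through environment variables before starting the app. The values below are illustrative placeholders taken from the sample env files, not real credentials:

```bash
# illustrative only; adjust to your local setup
export DB_NAME=skills-db
export DB_USERNAME=postgres
export DB_PASSWORD=password
export DB_HOST=localhost
export DB_PORT=5432
export ES_HOST=http://localhost:9200
export LOG_LEVEL=debug
```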


## Local deployment

Set up your PostgreSQL database and Elasticsearch instance and ensure that they are up and running.

- Follow the *Configuration* section to update config values, such as the database and ES host.
- Go to the *skills-api* directory and run `npm i`.
- Create the database using `npm run create-db`.
- Run the migrations with `npm run migrations up`. This will create the tables.
- Then run `npm run insert-data` to insert mock data into the database.
- Run `npm run migrate-db-to-es` to sync data with ES.
- Start the server with `npm run start`.
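
In short, a first-time local setup runs the following commands from the *skills-api* directory:

```bash
npm i
npm run create-db
npm run migrations up
npm run insert-data
npm run migrate-db-to-es
npm run start
```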

## Migrations

Migrations are located under the `./scripts/db/` folder. Run `npm run migrations up` to apply the migrations and `npm run migrations down` to roll back the ones applied earlier.
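
For illustration only, a migration in the common up/down style could look like the sketch below; the table name, SQL, and module shape are hypothetical and may not match the actual scripts in `./scripts/db/`:

```js
// Hypothetical migration sketch; not one of the project's actual scripts.
// `client` is assumed to be a connected node-postgres client.

// Apply: create an illustrative table
async function up (client) {
  await client.query(`
    CREATE TABLE IF NOT EXISTS example_table (
      id UUID PRIMARY KEY,
      name TEXT NOT NULL
    )
  `)
}

// Revert: drop the illustrative table
async function down (client) {
  await client.query('DROP TABLE IF EXISTS example_table')
}

module.exports = { up, down }
```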

## Local Deployment with Docker

- Navigate to the `docker-pgsql-es` folder. Rename `sample.env` to `.env` and change any values if required.
- Run `docker-compose up -d` to start Docker instances of PostgreSQL and Elasticsearch for use with the API.

- Create database using `npm run create-db`.
- Run the migrations - `npm run migrations up`. This will create the tables.
- Then run `npm run insert-data` to insert mock data into the database.
- Run `npm run migrate-db-to-es` to sync data with ES.

- Navigate to the directory `docker`

- Rename the file `sample.env` to `.env`

- Set the required DB configuration and Elasticsearch host in the `.env` file

- Once that is done, run the following command

```bash
docker-compose up
```

- When you run the application for the first time, it will take some time to download the image and install the dependencies
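
Putting the Docker steps together, a first run might look like this (a sketch; it assumes you start from the repository root and keep the default values in both `sample.env` files):

```bash
# start PostgreSQL and Elasticsearch
cd docker-pgsql-es
cp sample.env .env        # edit if required
docker-compose up -d
cd ..

# prepare the database and Elasticsearch indices
npm i
npm run create-db
npm run migrations up
npm run insert-data
npm run migrate-db-to-es

# build and start the API container
cd docker
cp sample.env .env        # set the DB and ES hosts
docker-compose up
```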

## NPM Commands

| Command                    | Description |
|--------------------|--|
| `npm run start` | Start app |
| `npm run start:dev` | Start the app and restart it on any changes (useful during development). |
| `npm run lint` | Check for lint errors. |
| `npm run lint:fix` | Check for lint errors and fix them automatically when possible. |
| `npm run create-db` | Create the database |
| `npm run insert-data` | Insert data into the database |
| `npm run migrate-db-to-es` | Migrate data from the database into Elasticsearch |
| `npm run delete-data` | Delete the data from the database |
| `npm run migrations up` | Run up migration |
| `npm run migrations down` | Run down migration |
| `npm run generate:doc:permissions` | Generate [permissions.html](docs/permissions.html) |
| `npm run generate:doc:permissions:dev` | Generate [permissions.html](docs/permissions.html) on any changes (useful during development). |

## JWT Authentication
Authentication is handled via the Authorization (Bearer) token header field. The token is a JWT.

Here is a sample user token, valid for a very long time, for a user with the administrator role.

```
<provide_in_forums>

# here is the payload data decoded from the token
{
  "roles": [
    "Topcoder User",
    "administrator"
  ],
  "iss": "https://api.topcoder.com",
  "handle": "tc-Admin",
  "exp": 1685571460,
  "userId": "23166768",
  "iat": 1585570860,
  "email": "[email protected]",
  "jti": "0f1ef1d3-2b33-4900-bb43-48f2285f9630"
}
```

And here is a sample M2M token with scopes `all:connect_project`, `all:projects` and `write:projects`.

```
<provided_in_forums>

# here is the payload data decoded from the token
{
  "iss": "https://topcoder-dev.auth0.com/",
  "sub": "enjw1810eDz3XTwSO2Rn2Y9cQTrspn3B@clients",
  "aud": "https://m2m.topcoder-dev.com/",
  "iat": 1550906388,
  "exp": 2147483648,
  "azp": "enjw1810eDz3XTwSO2Rn2Y9cQTrspn3B",
  "scope": "all:connect_project all:projects write:projects",
  "gty": "client-credentials"
}
```
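
Whichever token you use, it is passed in the `Authorization` header, for example as follows (a sketch; the host, port and endpoint path are assumptions based on the default `PORT` and `API_VERSION` values, not a documented endpoint):

```bash
# replace <token> with a valid JWT
curl -H "Authorization: Bearer <token>" \
  http://localhost:3001/api/1.0/skills
```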

These tokens have been signed with the secret `CLIENT_SECRET`. This secret should match the `AUTH_SECRET` entry in `config/default.js`. You can modify the payload of these tokens to generate tokens with different roles or different scopes using https://jwt.io
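
Alternatively, a token with a custom payload can be generated programmatically. Below is a minimal sketch using the `jsonwebtoken` package (an assumption, since it is not necessarily a dependency of this project), signing with the default `AUTH_SECRET`:

```js
// sketch only; payload fields mirror the sample user token above
const jwt = require('jsonwebtoken')

const payload = {
  roles: ['Topcoder User', 'administrator'],
  iss: 'https://api.topcoder.com',
  handle: 'tc-Admin',
  userId: '23166768'
}

// sign with the secret configured as AUTH_SECRET (CLIENT_SECRET by default)
const token = jwt.sign(payload, 'CLIENT_SECRET', { expiresIn: '30d' })
console.log(token)
```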

**Note** See `src/constants.js` for all available user roles and M2M scopes.

## Documentation

- [permissions.html](docs/permissions.html) - the list of all permissions in Skills API.
- [swagger.yaml](docs/swagger.yaml) - the Swagger API Definition.
88 changes: 88 additions & 0 deletions app.js
@@ -0,0 +1,88 @@
/**
 * The application entry point
 */

require('./src/bootstrap')
const config = require('config')
const express = require('express')
const cross = require('cors')
const bodyParser = require('body-parser')
const _ = require('lodash')
const http = require('http')
const swaggerUi = require('swagger-ui-express')
const jsyaml = require('js-yaml')
const fs = require('fs')
const path = require('path')
const logger = require('./src/common/logger')
const errorMiddleware = require('./src/common/error.middleware')
const routes = require('./src/route')
const { permissions, jwtAuthenticator } = require('tc-core-library-js').middleware
const app = express()
const httpServer = http.Server(app)
const models = require('./src/models')
const initPermissions = require('./src/permissions')

app.set('port', config.PORT)
app.use(bodyParser.json())
app.use(bodyParser.urlencoded({ extended: true }))
app.use(cross())
const apiRouter = express.Router({})

// load all routes
_.each(routes, (verbs, url) => {
  _.each(verbs, (def, verb) => {
    if (!def.method) {
      throw new Error(`${verb.toUpperCase()} ${url} method is undefined`)
    }
    if (def.auth && def.auth !== 'jwt') {
      throw new Error(`auth type "${def.auth}" is not supported`)
    }

    const actions = []
    // Authentication
    if (def.auth) {
      actions.push((req, res, next) => {
        jwtAuthenticator(_.pick(config, ['AUTH_SECRET', 'VALID_ISSUERS']))(req, res, next)
      })
    }
    // Authorization
    if (def.permission) {
      actions.push(permissions(def.permission))
    }
    // main middleware
    actions.push(async (req, res, next) => {
      try {
        await def.method(req, res, next)
      } catch (e) {
        next(e)
      }
    })

    logger.info(`Endpoint discovered : ${verb.toLocaleUpperCase()} /${config.API_VERSION}${url}`)
    apiRouter[verb](`/${config.API_VERSION}${url}`, actions)
  })
})
app.use('/', apiRouter)
const spec = fs.readFileSync(path.join(__dirname, 'docs/swagger.yaml'), 'utf8')
const swaggerDoc = jsyaml.safeLoad(spec)

app.use('/docs', swaggerUi.serve, swaggerUi.setup(swaggerDoc))

app.use(errorMiddleware())
app.use('*', (req, res) => {
  const pathKey = req.baseUrl.substring(config.API_VERSION.length + 1)
  const route = routes[pathKey]
  if (route) {
    res.status(405).json({ message: 'The requested method is not supported.' })
  } else {
    res.status(404).json({ message: 'The requested resource cannot be found.' })
  }
});

(async () => {
  await models.init()
  initPermissions() // initialize permission policies
  httpServer.listen(app.get('port'), () => {
    logger.info(`Express server listening on port ${app.get('port')}`)
  })
})()
51 changes: 51 additions & 0 deletions config/default.js
@@ -0,0 +1,51 @@
/**
 * the default config
 */

module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'debug',
  PORT: process.env.PORT || 3001,

  AUTH_SECRET: process.env.AUTH_SECRET || 'CLIENT_SECRET',
  VALID_ISSUERS: process.env.VALID_ISSUERS ? process.env.VALID_ISSUERS.replace(/\\"/g, '')
    : '["https://topcoder-dev.auth0.com/", "https://api.topcoder.com"]',

  PAGE_SIZE: process.env.PAGE_SIZE || 20,
  MAX_PAGE_SIZE: parseInt(process.env.MAX_PAGE_SIZE) || 100,
  API_VERSION: process.env.API_VERSION || 'api/1.0',

  DB_NAME: process.env.DB_NAME || 'skills-db',
  DB_USERNAME: process.env.DB_USERNAME || process.env.DB_USER || 'postgres',
  DB_PASSWORD: process.env.DB_PASSWORD || 'password',
  DB_HOST: process.env.DB_HOST || 'localhost',
  DB_PORT: process.env.DB_PORT || 5432,

  // ElasticSearch
  ES: {
    HOST: process.env.ES_HOST || 'http://localhost:9200',
    ES_REFRESH: process.env.ES_REFRESH || 'true',

    ELASTICCLOUD: {
      id: process.env.ELASTICCLOUD_ID,
      username: process.env.ELASTICCLOUD_USERNAME,
      password: process.env.ELASTICCLOUD_PASSWORD
    },

    // es mapping: _index, _type, _id
    DOCUMENTS: {
      skill: {
        index: process.env.SKILL_INDEX || 'skill',
        type: '_doc',
        enrichPolicyName: process.env.SKILL_ENRICH_POLICYNAME || 'skill-policy'
      },
      taxonomy: {
        index: process.env.TAXONOMY_INDEX || 'taxonomy',
        type: '_doc',
        pipelineId: process.env.TAXONOMY_PIPELINE_ID || 'taxonomy-pipeline',
        enrichPolicyName: process.env.TAXONOMY_ENRICH_POLICYNAME || 'taxonomy-policy'
      }
    },
    MAX_BATCH_SIZE: parseInt(process.env.MAX_BATCH_SIZE, 10) || 10000,
    MAX_BULK_SIZE: parseInt(process.env.MAX_BULK_SIZE, 10) || 100
  }
}
7 changes: 7 additions & 0 deletions config/production.js
@@ -0,0 +1,7 @@
/**
 * The production configuration file.
 */

module.exports = {
  LOG_LEVEL: process.env.LOG_LEVEL || 'info'
}
23 changes: 23 additions & 0 deletions docker-pgsql-es/docker-compose.yml
@@ -0,0 +1,23 @@
version: '3'
services:
  postgres:
    image: "postgres:12.4"
    volumes:
      - database-data:/var/lib/postgresql/data/
    ports:
      - ${DB_PORT}:${DB_PORT}
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      POSTGRES_USER: ${DB_USERNAME}
      POSTGRES_DB: ${DB_NAME}
  esearch:
    image: elasticsearch:7.7.1
    container_name: skills-data-processor-es_es
    ports:
      - ${ES_PORT}:${ES_PORT}
    environment:
      - discovery.type=single-node

volumes:
  database-data:

5 changes: 5 additions & 0 deletions docker-pgsql-es/sample.env
@@ -0,0 +1,5 @@
DB_NAME=skills-db
DB_USERNAME=postgres
DB_PASSWORD=password
DB_PORT=5432
ES_PORT=9200
17 changes: 17 additions & 0 deletions docker/Dockerfile
@@ -0,0 +1,17 @@
# Use the base image with Node.js 12
FROM node:12

# Set working directory for future use
WORKDIR /skills_api

# Copy the current directory into the Docker image
COPY . /skills_api

# Install the dependencies from package.json
RUN npm install

# Expose port
EXPOSE ${PORT}

# start api
CMD npm start
12 changes: 12 additions & 0 deletions docker/docker-compose.yml
@@ -0,0 +1,12 @@
version: '3'
services:
  skills_api:
    image: skills_api:latest
    build:
      context: ../
      dockerfile: docker/Dockerfile
    env_file:
      - .env
    ports:
      - ${PORT}:${PORT}

8 changes: 8 additions & 0 deletions docker/sample.env
@@ -0,0 +1,8 @@
DB_NAME=skills-db
DB_USERNAME=postgres
DB_PASSWORD=password
DB_HOST=host.docker.internal
DB_PORT=5432

ES_HOST=http://host.docker.internal:9200
PORT=3001