AWS Deployment

Here are the details of the complete deployment process.

Prerequisites to install cQube on AWS machine

  • Ubuntu 22.04 (supported)

  • 16 GB of System RAM (minimum requirement)

  • 4 core CPU (minimum requirement)

  • Domain name, ex: https://cqube-demo-cert.tibilprojects.com

  • SSL certificate files, ex: certificate.crt and private.key

  • AWS access_key and secret_key

  • 250 GB Storage

Step 1: Use the following command to connect to the AWS instance

ssh -i <path_to_the_pem_file> <user_name>@<public_ip_of_the_instance>

Ex: ssh -i poc_key.pem -o ServerAliveInterval=60 ubuntu@13.200.12.31

Step 2: Clone the cqube-devops repository using the following command

git clone https://github.com/Sunbird-cQube/cqube-devops.git

Step 3: Navigate to the directory where cqube-devops was cloned or downloaded and check out the desired release branch

cd cqube-devops/

git checkout release-v5.0.5

(release-v5.0.5 was the latest release branch at the time of writing; check the repository for newer release branches.)

Step 4: After checking out the latest branch, move the SSL keys (.crt and .key) to the path mentioned below.

Path: cqube-devops/ansible/ssl_certificates

Command for moving ssl keys:

cp certificate.crt private.key /home/ubuntu/cqube-devops/ansible/ssl_certificates

Step 5: After copying the SSL keys, similarly copy the VSK dimension files (state, district, grade, subject, medium) to the path mentioned below.

Command for moving dimensions:

(Note: Step 5 applies only when pulling data from the NVSK.)

cp state-dimension.data.csv grade-dimension.data.csv subject-dimension.data.csv /home/ubuntu/cqube-devops/ansible/dimension_files

(Copy the remaining dimension files the same way.)

Path: cqube-devops/ansible/dimension_files
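The copy commands above can be wrapped in a small loop so no dimension file is missed. This is a sketch that assumes all dimension files follow the `<name>-dimension.data.csv` naming shown above; adjust the paths to your clone location.

```shell
# copy_dimensions SRC DEST: copy every *-dimension.data.csv file from SRC
# into the ansible dimension_files directory DEST.
copy_dimensions() {
  local src="$1" dest="$2"
  mkdir -p "$dest"
  local f
  for f in "$src"/*-dimension.data.csv; do
    [ -e "$f" ] || continue      # no matching files: glob stays literal, skip
    cp "$f" "$dest/"
    echo "copied $(basename "$f")"
  done
}

# Example, using the paths from this guide:
# copy_dimensions . /home/ubuntu/cqube-devops/ansible/dimension_files
```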

Step 6: Give the following permissions to the install.sh file

sudo chmod u+x install.sh

Step 7: Install cQube as a non-root user with sudo privileges

sudo ./install.sh

  • Access_type ( Enter NVSK, VSK or Others )

  • state_code ( Enter the required state code by referring to the state list provided )

  • Do you want to enable the login screen for the cQube instance? ( Enter true or false )

  • Do you want to pull the data from the NVSK server? ( Enter true or false )

  • Please enter the end point to pull the data ( Ex: cqube-demo-nvsk.tibilprojects.com )

Step 8: User Input Variables - these variables must be entered by the user by following the hints provided.

The install.sh file is a shell script that runs further shell scripts and Ansible playbooks to set up cQube.

  • Mode of installation: Public

  • Storage_type : aws

  • API_Endpoint ( Enter the domain name, ex: cqube-demo-cert.tibilprojects.com )

  • Please enter the name of cert file: ( ex:certificate.crt)

  • Please enter the name of key file: (ex: private.key)

Step 9: Once you have entered the above user inputs, a config file is created. Preview the config file and confirm that everything is correct. If it is correct, type "no" to proceed; otherwise type "yes" to re-enter and correct the values.

  • db_user_name ( Enter the postgres database username )

  • db_name ( Enter the postgres database name )

  • db_password ( Enter the postgres password )

  • read_only_db_user( Enter the read only db user)

  • read_only_db_password( Enter the read only db password)

  • keycloak_adm_name( Enter the keycloak admin name)

  • keycloak_adm_password( Enter the keycloak password)
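The config preview will contain entries such as the following. The field names are taken from the prompts above; the values are purely illustrative, and the exact file format may differ between releases.

```yaml
db_user_name: cqube_user
db_name: cqube_db
db_password: "<postgres-password>"
read_only_db_user: cqube_ro
read_only_db_password: "<read-only-password>"
keycloak_adm_name: admin
keycloak_adm_password: "<keycloak-password>"
```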

Step 10: Optional variables - the database credentials have default values. If the user wishes to enter their own credentials, they should answer 'yes' when the question pops up; otherwise they can answer 'no'.

Step 11: Once the config file is generated, a preview of it is displayed, followed by a question where the user gets the option to re-enter the configuration values by choosing 'yes'. If 'no' is selected, install.sh moves on to the next section.

Step 12: A preview of the program_selector.yml file is displayed, followed by a question where the user gets the option to enable or disable programs by choosing 'yes'. If 'no' is selected, install.sh moves on to the next section.

Step 13: Once the installation is completed, you will be prompted with the following messages and reference URLs.

cQube Installed Successfully

The cQube ingestion API can be accessed using <domain_name>

  • After installation, check whether the docker containers are up:

  • sudo docker ps -a
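The `docker ps -a` check can be scripted to flag any container that is not running. This sketch parses `docker ps` output fed on stdin, so the actual `sudo docker ps` invocation (shown commented below) is an assumption about your environment.

```shell
# check_containers: read lines of the form "<name> <status...>" (as produced
# by `docker ps -a --format '{{.Names}} {{.Status}}'`) on stdin and report
# any container whose status does not start with "Up".
check_containers() {
  local rc=0
  local name status _rest
  while read -r name status _rest; do
    if [ "$status" = "Up" ]; then
      echo "OK   $name"
    else
      echo "DOWN $name"
      rc=1
    fi
  done
  return $rc
}

# Usage on the instance:
# sudo docker ps -a --format '{{.Names}} {{.Status}}' | check_containers
```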

Ingestion Flow:

Setting up Postman:

  • Download the Postman application and import the collection.

  • Select the import option in Postman, then select upload files to import the collection.

  • Upload the cQube_latest.postman_collection.json and VSK_Schema.postman_collection.json files:

https://drive.google.com/drive/u/0/folders/12Wn7UIHgUhq6U3GzlRN-zdrJMnj1hrPO

  • After installation, first create the JWT token in Postman.

API_Endpoint: https://cqube-ssl-test.tibilprojects.com/api/ingestion/generatejwt
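The same JWT request can be issued from the command line instead of Postman. This sketch only prints the curl command rather than executing it; the domain is the example from the prerequisites, and the request body file is a placeholder -- copy the actual body fields from the Postman collection.

```shell
# Build (and print, rather than execute) the JWT-generation request.
# DOMAIN is the example domain from this guide; jwt_body.json is a
# placeholder for the request body shown in the Postman collection.
DOMAIN="cqube-demo-cert.tibilprojects.com"

JWT_CMD="curl -s -X POST https://$DOMAIN/api/ingestion/generatejwt \
  -H 'Content-Type: application/json' \
  -d @jwt_body.json"

echo "$JWT_CMD"
```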

  • After creating the JWT token, upload the vsk_schema and dimension schema.

API Endpoint: https://cqube-ssl-test.tibilprojects.com/api/spec/event

Run both schemas at once (click on the 3 dots and select Run collection).

  • After uploading the grammar files, copy the generated token and add it to the national_programs and new programs requests.

  • Go to Authorization, select Bearer Token, then paste the token.

  • Then upload the VSK data (NCERT).

API Endpoint: https://cqube-ssl-test.tibilprojects.com/api/ingestion/national_programs
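A command-line equivalent of the ingestion request can be sketched as below. It only prints the curl command; the domain is the example from the prerequisites, and the token value, file name, and form field are placeholders -- check the Postman collection for the exact request shape.

```shell
# Build (and print, rather than execute) one data-ingestion request.
# TOKEN comes from the generatejwt step; pgi_data.csv and the 'file'
# form field are placeholders, not confirmed by this guide.
DOMAIN="cqube-demo-cert.tibilprojects.com"
TOKEN="<jwt-from-generatejwt>"

INGEST_CMD="curl -s -X POST https://$DOMAIN/api/ingestion/national_programs \
  -H \"Authorization: Bearer $TOKEN\" \
  -F 'file=@pgi_data.csv'"

echo "$INGEST_CMD"
```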

  • Only three programs (pgi, diksha, nishtha) need their data ingested through the API. The other three programs automatically pull their data from the NVSK server.

  • For the point above, upload the schemas of the three programs (PGI, DIKSHA, NISHTHA) and ingest the raw files into the AWS emission bucket.

  • Then schedule the below-mentioned processor groups one by one using the schedule API

    • Run_adapters

    • data_moving_aws

  • After that, schedule the programs one by one using the schedule API, with a request body like the following.

    Body:

    {

    "processor_group_name": "ingest_data",

    "scheduled_at": "0 31 14 * * ?",

    "program_name": "nishtha"

    }

    (The Quartz cron expression "0 31 14 * * ?" runs the job daily at 14:31:00.)
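The schedule request body shown above can be composed in the shell as below. This sketch only builds and prints the body; the schedule endpoint itself is not stated in this guide, so take it from the Postman collection (shown here only as a commented placeholder).

```shell
# Compose the schedule request body from the guide. The values for
# processor_group_name, scheduled_at and program_name are the ones shown
# in the body above.
SCHEDULE_BODY='{
  "processor_group_name": "ingest_data",
  "scheduled_at": "0 31 14 * * ?",
  "program_name": "nishtha"
}'

echo "$SCHEDULE_BODY"

# On the instance (endpoint placeholder -- take it from the Postman
# collection; token from the generatejwt step):
# curl -s -X POST "https://<domain_name>/<schedule-endpoint>" \
#   -H "Authorization: Bearer <token>" \
#   -H "Content-Type: application/json" \
#   -d "$SCHEDULE_BODY"
```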

  • Check the visualization in the UI dashboard.
