Sunbird cQube

SDC Deployment

Prerequisites to install cQube on local machine
  • Ubuntu 22.04 (supported)
  • 16 GB of System RAM (minimum requirement)
  • 4 core CPU (minimum requirement)
  • Domain name (with SSL), ex:
  • 250 GB Storage
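Before starting, the prerequisites above can be verified on the target machine with a quick shell sketch (standard Linux tools only; the thresholds mirror the list above):

```shell
# Pre-flight check for the prerequisites listed above.
cores=$(nproc)                                                        # CPU cores
ram_gb=$(awk '/MemTotal/ {printf "%d", $2/1024/1024}' /proc/meminfo)  # RAM in GB
disk_gb=$(df --output=avail -BG / | tail -n 1 | tr -dc '0-9')         # free GB on /
echo "CPU cores : $cores (minimum 4)"
echo "RAM       : ${ram_gb} GB (minimum 16)"
echo "Disk on / : ${disk_gb} GB (250 GB storage required)"
. /etc/os-release && echo "OS        : $PRETTY_NAME (Ubuntu 22.04 supported)"
```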
Step 1: Use the following command to connect to the On-premise instance
ssh -i <path_to_the_pem_file> <user_name>@<public_ip_of_the_instance>
Ex: ssh -i poc_key.pem -o ServerAliveInterval=60 [email protected]
Step 2: Clone the cqube-devops repository using the following command
git clone https://github.com/Sunbird-cQube/cqube-devops.git
Step 3: Navigate to the directory where cqube-devops was cloned or downloaded and check out the desired branch (release branch)
cd cqube-devops/
git checkout release-v5.0.5
(release-v5.0.5 is the latest release branch at the time of writing)
Step 4: After checking out the latest branch, move the SSL keys (the .crt and .key files) to the path mentioned below, as shown in the below screenshot.
Command for moving ssl keys:
cp certificate.crt private.key /home/ubuntu/cqube-devops/ansible/ssl_certificates
Path: cqube-devops/ansible/ssl_certificates
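Before copying, it is worth confirming that the certificate and private key actually belong together. A minimal sketch using openssl (the self-signed pair generated here is throwaway, for demonstration only; on the server you would run the two modulus commands against your real certificate.crt and private.key):

```shell
# Demo pair (throwaway, self-signed) so the check below is runnable as-is.
openssl req -x509 -newkey rsa:2048 -nodes -keyout private.key -out certificate.crt \
  -days 1 -subj "/CN=demo.local" 2>/dev/null

# The actual check: a matching cert/key pair has the same RSA modulus.
cert_mod=$(openssl x509 -noout -modulus -in certificate.crt | openssl md5)
key_mod=$(openssl rsa -noout -modulus -in private.key | openssl md5)
[ "$cert_mod" = "$key_mod" ] && echo "certificate and key match" \
                             || echo "certificate and key DO NOT match"
```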
Step 5: Similarly, after copying the SSL keys, copy the VSK dimension files (state, district, grade, subject, medium) to the path mentioned below, as shown in the screenshot below.
Command for moving dimensions:
(Note: This Step 5 is applicable only when pulling the data from NVSK)
cp <dimension_files> /home/ubuntu/cqube-devops/ansible/dimension_files
Path: cqube-devops/ansible/dimension_files
Step 6: Give execute permission to the install script using the following command
sudo chmod u+x install.sh
Step 7: Install cQube as a non-root user with sudo privileges
sudo ./install.sh
The install.sh file is a shell script which runs further shell scripts and an ansible-playbook to set up cQube.
Step 8: User Input Variables - These are the variables which need to be entered by the user by following the hint provided
  • Access_type ( Enter NVSK or VSK or Others )
  • state_code ( Enter the required state code by referring to the state list provided )
  • Do you want to enable the login screen for the cQube instance? ( Enter true or false )
  • Do you want to pull the data from the NVSK server? ( Enter true or false )
  • Please enter the end point to pull the data (Ex:
Step 9: Once you enter the above user inputs, a config file is created. Preview the config file and confirm that everything is correct. If it is correct, type “no” to proceed; otherwise type “yes” and correct it.
  • Mode of installation: Public
  • Storage_type : local
  • API_Endpoint(Enter the domain name ex:
  • Please enter the name of cert file( ex:certificate.crt)
  • Please enter the name of key file (ex: private.key)
Step 10: Optional variables - Database credentials contain default values. If the user wishes to enter their own credentials, the user should opt for ‘yes’ when the question pops up and enter them; otherwise the user can opt for ‘no’.
  • db_user_name ( Enter the postgres database username )
  • db_name ( Enter the postgres database name )
  • db_password ( Enter the postgres password )
  • read_only_db_user( Enter the read only db user)
  • read_only_db_password( Enter the read only db password)
  • keycloak_adm_name( Enter the keycloak admin name)
  • keycloak_adm_password( Enter the keycloak password)
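For orientation only, here is a hypothetical sketch of how the values collected in Steps 8-10 might look once written to the generated config file. The key names and YAML layout are illustrative assumptions, not the actual generated format; always verify against the preview the installer shows.

```yaml
# Illustrative only - the real generated config may use different keys/layout
access_type: VSK
state_code: "29"              # from the state list provided
login_screen: true
mode_of_installation: Public
storage_type: local
db_user_name: cqube_user      # defaults apply unless overridden in Step 10
db_name: cqube_db
db_password: "<secret>"
read_only_db_user: cqube_ro
read_only_db_password: "<secret>"
keycloak_adm_name: admin
keycloak_adm_password: "<secret>"
```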
Step 11: Once the config file is generated, a preview of the config file is displayed, followed by a question where the user gets an option to re-enter the configuration values on choosing ‘yes’. If option ‘no’ is selected, then the installation moves to the next section.
Step 12: A preview of the program_selector.yml file is displayed, followed by a question where the user gets an option to enable or disable the programs on choosing ‘yes’. If option ‘no’ is selected, then the installation moves to the next section.
Step 13: Once the installation is completed, you will be prompted with the following messages and the required reference URLs.
cQube Installed Successfully
cQube ingestion api can be accessed using <domain_name>
  • After installation, check whether the docker containers are up or not
  • sudo docker ps -a
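To spot a failed container quickly, the `docker ps` output can be filtered for anything that is not "Up". A sketch, shown against sample output so the parsing is visible (the container names here are hypothetical, not the actual cQube service names; on the server, replace `sample` with the real output of `sudo docker ps -a --format '{{.Names}} {{.Status}}'`):

```shell
# Sample of what `sudo docker ps -a --format '{{.Names}} {{.Status}}'` might print
# (hypothetical container names, for demonstration only)
sample='ingestion-api Up 5 minutes
spec-ms Exited (1) 2 minutes ago'

# Any line whose status does not start with "Up" points at a container to inspect
down=$(printf '%s\n' "$sample" | awk '$2 != "Up"')
echo "containers needing attention:"
echo "$down"
```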
Ingestion Flow:
Setting up Postman:
  • Download the Postman application and import the collection.
  • Select the import option in Postman, then select the upload files option to import the collection. Please refer to the below screenshot.
  • Upload the cQube_latest.postman_collection.json and VSK_Schema.postman_collection.json files.
  • After installation, first create the JWT token in Postman as shown in the below screenshot.
  • After creating the JWT token, upload the vsk_schema and dimension schema as shown in the below screenshot.
Run both schemas at a time (click on the 3 dots and select Run collection)
  • After uploading the grammar files, copy the generated token and add it to the national_programs and new programs APIs.
  • Go to Authorization and select Bearer Token, then paste the token.
  • Then upload the VSK data (NCERT) as shown in the below screenshot.
  • We need to ingest only three programs’ (pgi, diksha, nishtha) data through the API. The other three programs automatically pull the data from the NVSK server.
  • For the above point, we need to upload the schemas of the three programs (PGI, DIKSHA, NISHTHA) and ingest the raw files to the aws emission bucket.
  • Then we can schedule the below mentioned processor groups one by one by using schedule API as shown in the below screenshot
    • Run_adapters
    • data_moving_local
  • After that, you can schedule the programs one by one by using the schedule API, as mentioned in the below screenshot.
{
  "processor_group_name": "ingest_data",
  "scheduled_at": "0 31 14 * * ?",
  "program_name": "nishtha"
}
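Outside Postman, the same schedule request body can be built and sent from the shell. A sketch (the endpoint path is a hypothetical placeholder — check your cQube ingestion API reference; $TOKEN is the Bearer token generated earlier):

```shell
# Build the schedule-API request body shown above.
body='{
  "processor_group_name": "ingest_data",
  "scheduled_at": "0 31 14 * * ?",
  "program_name": "nishtha"
}'
echo "$body"
# Then POST it with the Bearer token from the earlier step
# (the /<schedule_endpoint> path below is a placeholder, not the confirmed route):
# curl -X POST "https://<domain_name>/<schedule_endpoint>" \
#   -H "Authorization: Bearer $TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$body"
```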
  • Check the visualization in the UI dashboard, as shown in the below screenshot.