DevOps Interview Questions and Answers

Interview Questions November 2, 2019

DevOps interview questions and answers for beginners and experts: a list of frequently asked DevOps interview questions with answers from Besant Technologies. We hope these questions and answers are useful and help you get the best job in the industry. They are prepared by DevOps professionals based on MNC companies' expectations. Stay tuned, as we will update this list with new DevOps interview questions and answers frequently. If you want practical DevOps training, please go through our DevOps Training in Chennai and DevOps Training in Bangalore.

Best DevOps Interview Questions and answers

Besant Technologies supports students by providing DevOps interview questions and answers for job placement purposes. DevOps is a leading course at present because of the large number of job openings and the high salaries paid for DevOps and related roles. We also provide DevOps online training for students around the world through the GangBoard medium. These top DevOps interview questions and answers were prepared by our institute's experienced trainers.

DevOps Interview Questions and answers for the job placements

Here is the list of the most frequently asked DevOps interview questions and answers in technical interviews. These questions and answers are suitable for both freshers and experienced professionals at any level. The questions are aimed at intermediate to somewhat advanced DevOps professionals, but even if you are a beginner or fresher, you should be able to understand the answers and explanations given here.

These DevOps Interview Questions and Answers will guide you to clear the following Certifications

  • DevOps Foundation
  • DevOps Leader (DOL)
  • DevOps Test Engineering (DTE)
Q1. What is Infrastructure as Code?

Answer: Infrastructure as Code (IaC) means the configuration of the servers, toolchains, or application stacks required by an organization is expressed as descriptive code, which can then be used to provision and manage infrastructure elements such as virtual machines, software, and network elements. It differs from scripts written in any language, which are a series of static coded steps, in that IaC is declarative and version control can be used to track environment changes. Example tools are Ansible and Terraform.
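As a minimal sketch (the provider, region, AMI ID, and resource names below are hypothetical placeholders), a Terraform file describes a virtual machine declaratively, so the same definition can be version-controlled and re-applied:

```hcl
# Hypothetical example: provision one EC2 instance declaratively.
provider "aws" {
  region = "us-east-1"
}

resource "aws_instance" "web" {
  ami           = "ami-0abcdef1234567890"  # placeholder AMI ID
  instance_type = "t2.micro"

  tags = {
    Name = "web-server"
  }
}
```

Running terraform apply against this file creates the instance; committing the file to version control tracks every environment change.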

Q2. In which areas can version control be introduced for an efficient DevOps practice?

Answer: The main area of version control is obviously source code management, where every developer's code is pushed to a common repository to maintain builds and releases in CI/CD pipelines. Another area is version control for administrators, when they use Infrastructure as Code (IaC) tools and practices to maintain environment configuration. A third area is artifact management, using repositories such as Nexus and DockerHub.

Q3. Why do open-source tools boost DevOps?

Answer: Open-source tools are predominantly used by organizations that are adopting (or have adopted) DevOps pipelines, because DevOps arrived with a focus on automation across several areas of the organization: build and release, change management, and infrastructure management.

Developing a single in-house tool for all of this is impossible, much of the landscape is still in a trial-and-error phase of development, and agile timelines cut down the luxury of developing such a tool. The open-source tools available on the market cover pretty much every purpose, and they also give organizations the option to evaluate each tool based on their needs.

Q4. What is the difference between Ansible and Chef (or Puppet)?

Answer: Ansible is an agentless configuration management tool, whereas Puppet and Chef need an agent running on each managed node. Chef and Puppet are based on a pull model: the agent pulls the cookbook or manifest (for Chef and Puppet respectively) from the master. Ansible instead uses SSH to communicate and pushes data-driven instructions to the nodes being managed, more like RPC execution. Ansible uses YAML for its playbooks, whereas Puppet and Chef are built in Ruby and use their own DSLs.

Q5. What is the folder structure of roles in Ansible?

Answer:

roles/
  common/
    tasks/
    handlers/
    files/
    templates/
    vars/
    defaults/
    meta/
  webservers/
    tasks/
    defaults/
    meta/

Here common and webservers are role names. Under each role: tasks/ holds the tasks (or plays), handlers/ holds the handlers for any tasks, files/ holds static files for copying (or moving) to remote systems, templates/ holds Jinja-based templates, vars/ holds variables used by the role's playbooks, defaults/ holds default variables (lowest precedence), and meta/ holds role metadata such as dependencies.

Q6. What is Jinja2 templating in Ansible playbooks, and what is its use?

Answer: Jinja2 is the standard templating engine in the Python world; think of it as a sed editor for Ansible. It is used when a config file for an application needs to be altered dynamically. Consider mapping a MySQL application's configuration to the IP address of the machine where it is running: that value cannot be static, and it needs to be altered dynamically at runtime.

Format

{{ foo.bar }}

The variables within the {{ }} braces are replaced by Ansible at runtime when the template module is used.
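For instance (the file names and paths here are illustrative), a template can reference a host fact, and the template module renders it at runtime:

```yaml
# Suppose templates/my.cnf.j2 contains the line:
#   bind-address = {{ ansible_default_ipv4.address }}
# This task renders the template with the target host's actual IP:
- name: Render MySQL config with the host's IP address
  template:
    src: my.cnf.j2
    dest: /etc/mysql/my.cnf
```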

Q7. What is the need for organizing playbooks as roles? Is it necessary?

Answer: Organizing playbooks as roles gives more readability and reusability to any plays. Consider a task where MySQL should be installed after the removal of Oracle DB, and another requirement where MySQL should be installed after a Java installation. In both cases we need to install MySQL, but without roles the playbooks would need to be written separately for each use case. With roles, once the MySQL installation role is created, it can be reused any number of times by invoking it with the appropriate logic in site.yaml.

No, it is not necessary to create roles for every scenario, but creating roles is a best practice in Ansible.
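A sketch of how a single mysql role might be reused (the role and host-group names below are hypothetical):

```yaml
# site.yaml: the same mysql role is invoked in two different plays
- hosts: legacy_db_servers
  roles:
    - remove_oracle
    - mysql

- hosts: app_servers
  roles:
    - java
    - mysql
```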

Q8. What is the main disadvantage of Docker containers?

Answer: The lifetime of data inside a container is tied to the container: once a container is destroyed, you cannot retrieve the data inside it, and it is lost forever. Persistent storage for data inside containers can, however, be achieved by mounting volumes backed by an external source such as the host machine or an NFS driver.
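For example (the host path and image tag are illustrative), mounting a host directory as a volume keeps the data even after the container is removed:

```shell
# Data written to /var/lib/mysql inside the container lands in
# /opt/mysql-data on the host and survives container deletion.
docker run -d -v /opt/mysql-data:/var/lib/mysql mysql:5.7
```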

Q9. What are Docker Engine and Docker Compose?

Answer: Docker Engine contacts the Docker daemon on the machine and creates the runtime environment and process for any container. Docker Compose links several containers to form a stack, which is used for creating application stacks such as LAMP, WAMP, and XAMPP.

Q10. In what different modes can a container be run?

Answer: A Docker container can be run in two modes:

Attached: the container runs in the foreground of the system you are running it on, and provides a terminal inside the container when the -t option is used; every log is redirected to the stdout screen.

Detached: this mode is usually used in production, where the container runs as a detached background process; all output inside the container is redirected to log files under /var/lib/docker/containers/<container-id>/, and it can be viewed with the docker logs command.
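A quick sketch of both modes (the image names are just examples):

```shell
# Attached mode: foreground, with an interactive terminal inside the container
docker run -it ubuntu /bin/bash

# Detached mode: runs in the background, as typically done in production
docker run -d nginx

# Inspect the logs of a detached container
docker logs <container-id>
```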

Q11. What will the output of the docker inspect command be?

Answer: docker inspect <container-id> gives output in JSON format, which contains details such as the IP address of the container inside the Docker virtual bridge, volume mount information, and every other host- or container-specific piece of information, such as the underlying storage driver and log driver used.

docker inspect [OPTIONS] NAME|ID [NAME|ID…]

Options

Option      Shorthand   Description
--format    -f          Format the output using the given Go template
--size      -s          Display total file sizes if the type is container
--type                  Return JSON for the specified type

Q12. What command can be used to check resource utilization by Docker containers?

Answer: The docker stats command can be used to check the resource utilization of any Docker container. It gives output analogous to the top command in Linux, and it forms the base for container resource monitoring tools like cAdvisor, which gets its output from the docker stats command.

docker stats [OPTIONS] [CONTAINER…]

Options

Option        Shorthand   Description
--all         -a          Show all containers (default shows just running)
--format                  Pretty-print stats using a Go template
--no-stream               Disable streaming stats and only pull the first result
--no-trunc                Do not truncate output

Q13. What is the major difference between continuous deployment and continuous delivery?

Answer: Continuous deployment is fully automated: deploying to production needs no manual intervention. In continuous delivery, the deployment to production involves some manual intervention for the organization's change management, and it needs to be approved by a manager or higher authority before it is deployed to production. The continuous deployment (or) delivery approach is chosen according to the risk the application poses for the organization.

Q14. How do you execute a task (or play) on localhost only, while executing playbooks on different hosts in Ansible?

Answer: Ansible provides a directive called delegate_to; on any task, specify the particular host (or hosts) where that task (or those tasks) needs to run.
tasks:
  - name: "Elasticsearch hitting"
    uri: url="{{ url2 }}_search?q=status:new" headers='{"Content-type":"application/json"}' method=GET return_content=yes
    register: output
    delegate_to: 127.0.0.1

Q15. What is the difference between set_fact and vars in ansible?

Answer: set_fact sets the value for a variable once, and it then remains static for the play even if the underlying value is dynamic, whereas a var is re-evaluated every time it is used, so its value keeps changing as the underlying value changes.

tasks:
  - set_fact:
      fact_time: "Fact: {{ lookup('pipe', 'date \"+%H:%M:%S\"') }}"
  - debug: var=fact_time
  - command: sleep 2
  - debug: var=fact_time

- name: lookups in variables vs. lookups in facts
  hosts: localhost
  vars:
    var_time: "Var: {{ lookup('pipe', 'date \"+%H:%M:%S\"') }}"

Even though the date lookup is used in both cases, the var is re-evaluated on every use and so changes from time to time within the playbook's lifetime, while the fact always remains the same once the lookup is done.

Q16. What is a lookup in Ansible, and what lookup plugins are supported?

Answer: Lookup plugins allow access to data in Ansible from outside sources. These plugins are evaluated on the Ansible control machine and can include reading the filesystem as well as contacting external datastores and services.

The format is {{ lookup('<plugin>', '<source (or) connection_string>') }}

Some of the lookup plugins supported by Ansible are:

  • file
  • pipe
  • redis
  • template (Jinja templates)
  • etcd (key-value store), among others
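As a small illustration, a task can use the file lookup to read a file on the control machine into a variable:

```yaml
# The file is read on the Ansible control machine, not the managed node
- debug:
    msg: "{{ lookup('file', '/etc/hostname') }}"
```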

Q17. How can you delete the docker images stored on your local machine and how can you do it for all the images at once?

Answer: The command docker rmi <image-id> can be used to delete a Docker image from the local machine, though some deletions may need to be forced because the image may be in use by a container (or by another image). To delete all images at once you can combine commands: docker rmi $(docker images -q). Here docker images lists the images, and the -q switch makes it print only the image IDs, which are then fed to docker rmi.

Q18. What are the folders in the Jenkins installation and their uses?

Answer: JENKINS_HOME – usually $JENKINS_USER/.jenkins – is the root folder of any Jenkins installation, and it contains subfolders, each for a different purpose.

jobs/ – contains the information about all the jobs configured in the Jenkins instance.

Inside jobs/, a folder is created for each job, and inside those are build folders, one per build number; each build folder holds its log files, which are what we see in the Jenkins web console.

plugins/ – where all your plugins are stored.

workspace/ – holds all the workspace files, like the source code pulled from SCM.

Q19. What are the ways to configure Jenkins system?

Answer: Jenkins can be configured in two ways

Web: there is an option called Configure System under Manage Jenkins; in that section you can make all configuration changes.

Manually on the filesystem: every change can also be made directly in the config.xml file under the Jenkins installation directory. After you make changes on the filesystem, you need to restart Jenkins: either directly from the terminal, via Reload Configuration from Disk under the Manage Jenkins menu, or by hitting the /restart endpoint directly.

Q20. What is the role of the HTTP REST API in DevOps?

Answer: DevOps focuses on automating your infrastructure and moving changes through pipeline stages: each CI/CD pipeline has stages like build, test, sanity test, UAT, and deployment to the production environment. Each stage uses different tools and technology stacks, and there needs to be a way to integrate the different tools to complete the toolchain. That is where the HTTP API comes in: every tool communicates with the others using its API, and users can also use SDKs, such as Boto for Python, to contact AWS APIs and automate based on events. Nowadays it is not batch processing anymore; pipelines are mostly event-driven.
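As a sketch of tool-to-tool integration over HTTP (the Jenkins URL, job name, and credentials below are made-up placeholders), a script can construct an authenticated POST to the standard Jenkins /job/<name>/build REST endpoint; only the request object is built here, nothing is actually sent:

```python
import base64
import urllib.request

# Hypothetical Jenkins instance and credentials -- replace with real values.
JENKINS_URL = "http://jenkins.example.com:8080"
USER = "admin"
API_TOKEN = "secret-token"

def build_trigger_request(job_name):
    """Build (but do not send) the POST request that would trigger a
    Jenkins job via its REST endpoint /job/<name>/build."""
    url = "%s/job/%s/build" % (JENKINS_URL, job_name)
    req = urllib.request.Request(url, data=b"", method="POST")
    # Jenkins accepts HTTP Basic auth with a user API token.
    creds = "%s:%s" % (USER, API_TOKEN)
    token = base64.b64encode(creds.encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    return req

req = build_trigger_request("my-pipeline")
print(req.full_url)      # http://jenkins.example.com:8080/job/my-pipeline/build
print(req.get_method())  # POST
```

Sending the request with urllib.request.urlopen(req) (against a real server) would queue the build; the same pattern applies to any tool that exposes an HTTP API.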

Q21. What are microservices, and how do they power efficient DevOps practices?

Answer: In traditional architecture, every application is a monolith: it is developed by a group of developers, deployed as a single application on multiple machines, and exposed to the outer world using load balancers. Microservices means breaking your application into small pieces, where each piece serves a different function needed to complete a single transaction. Developers can then also be organized into groups, and each piece of the application may follow different guidelines for an efficient development phase, since agile development can be phased appropriately; each service uses a REST API (or message queues) to communicate with other services. With this approach, the build and release of a non-robust version does not affect the whole architecture; instead, only some functionality is lost. This provides assurance for efficient and faster CI/CD pipelines and DevOps practices.

Q22. What are the ways that a pipeline can be created in Jenkins?

Answer: There are two ways a pipeline can be created in Jenkins

Scripted Pipelines:

More like a programming approach

Declarative pipelines:

DSL approach specifically for creating Jenkins pipelines.

The pipeline should be defined in a Jenkinsfile, whose location can be either in SCM or on the local system.

Declarative and Scripted Pipelines are constructed fundamentally differently. Declarative Pipeline is a more recent feature of Jenkins Pipeline which:

Provides richer syntactical features over Scripted Pipeline syntax, and

is designed to make writing and reading Pipeline code easier.

Q23. What are labels in Jenkins, and where can they be utilised?

Answer: A CI/CD solution needs to be centralized, so every application in the organization is built by a single CI/CD server. An organization may have different kinds of applications (Java, C#, .NET, and so on), and with a microservices approach the programming stack is loosely coupled per project. So you can assign a label to each node and select the option "Only build jobs with label expressions matching this node". When a build is scheduled with the label of a node on it, it waits for the next available executor on that node, even though there are free executors on other nodes.

Q24. What is the use of Blueocean in Jenkins?

Answer: Blue Ocean rethinks the user experience of Jenkins. Designed from the ground up for Jenkins Pipeline, but still compatible with freestyle jobs, Blue Ocean reduces clutter and increases clarity for every member of the team.

It provides sophisticated UI to identify each stage of the pipeline and better pinpointing for issues and very rich Pipeline editor for beginners.

Q25. What are callback plugins in Ansible? Give some examples.

Answer: Callback plugins enable adding new behaviors to Ansible when responding to events. By default, callback plugins control most of the output you see when running the command-line programs, but they can also be used to add additional output, integrate with other tools, and marshal the events to a storage backend. Whenever a play is executed it produces events, and those events are printed onto the stdout screen; a callback plugin can route them to any storage backend for log processing.

An example callback plugin is ansible-logstash, where every playbook execution is fetched by Logstash in JSON format and can be integrated with another backend source like Elasticsearch.

Q26. What scripting languages can be used in DevOps?

Answer: Basic shell scripting is used for build steps in Jenkins pipelines. Python scripts can be used with other tools like Ansible and Terraform, or as wrapper scripts for complex decision-making tasks in any automation, since Python is superior to shell scripts for deriving complex logic. Ruby scripts can also be used as build steps in Jenkins.

Q27. What is continuous monitoring, and why is monitoring so critical in DevOps?

Answer: DevOps makes every organization's build and release cycle much shorter with the concept of CI/CD, where every change is reflected in production environments quickly, so production needs to be tightly monitored to get customer feedback. The concept of continuous monitoring is used to evaluate each application's performance in real time (or at least near real time). Each application is built to be compatible with application performance monitoring agents, so granular metrics are extracted, like JVM stats, and even function-level metrics inside the application can be streamed in real time to the agents, which in turn feed a backend store; monitoring teams can then use that data in dashboards and alerts to continuously monitor the application.

Q28. Give some examples of continuous monitoring tools.

Answer: Many continuous monitoring tools are available on the market, used for different kinds of applications and deployment models.

Docker containers can be monitored by the cAdvisor agent, which can feed Elasticsearch to store metrics, or you can use the TICK stack (Telegraf, InfluxDB, Chronograf, Kapacitor) for systems monitoring in NRT (near real time). You can use Logstash or Beats to collect logs from systems, which in turn can use Elasticsearch as a storage backend, with Kibana or Grafana as the visualizer.

System monitoring can also be done with Nagios and Icinga.

Q29. What is docker swarm?

Answer: A group of virtual machines with Docker Engine can be clustered and maintained as a single system; the resources are shared by the containers, and the Docker Swarm master schedules each Docker container onto one of the machines in the cluster according to resource availability.

docker swarm init can be used to initiate a Docker Swarm cluster, and docker swarm join with the master IP, run from a client, joins that node into the swarm cluster.
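The commands look roughly like this (the manager IP and token are placeholders):

```shell
# On the manager node:
docker swarm init --advertise-addr 10.0.0.10

# On each worker, using the join token printed by the init command:
docker swarm join --token <worker-token> 10.0.0.10:2377
```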

Q30. What are the ways to create Custom Docker images?

Answer: Docker images can broadly be created in two ways:

Dockerfile: the most used method, where a base image is specified, files can be copied into the image, and installation and configuration are done using a declarative file, which is given to the docker build command to produce a new Docker image.

Docker commit: here a Docker image is spun up as a Docker container, every command executed inside the container forms a read-only layer, and after the changes are done, docker commit <container-id> saves it as an image. This method is not suitable for CI/CD pipelines, as it requires manual intervention.

Q31. Give some important directives in Dockerfile and an example Dockerfile?

Answer: FROM – gives the base image to use.

RUN – used to run a command directly while building the image.

CMD – also runs a command, but the command specification is more argument-based (exec form) than a single shell command like RUN, and it provides the default command for the running container.

ADD (or COPY) – copies files from your local machine into the Docker image you create.

ENTRYPOINT – holds the command without executing it at build time; when a container is spawned from the image, the command in the entrypoint runs first.

Example Dockerfile
FROM python:2
MAINTAINER janakiraman
RUN mkdir /code
ADD test.py /code
ENTRYPOINT ["python", "/code/test.py"]

Q32. Give some important Jenkins Plugins

Answer:

  • SSH slaves plugin
  • Pipeline Plugin
  • GitHub Plugin
  • Email notifications plugin
  • Docker publish plugin
  • Maven plugin
  • Greenball plugin
Q33. What is the use of vaults in Ansible?

Answer: Vault files are encrypted files containing variables used by Ansible playbooks; they can be decrypted only with the vault password. If any vault-encrypted file supplies a variable used inside a playbook, you need to pass the --ask-vault-pass argument while running the playbook.
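Typical usage looks like this (the file and playbook names are illustrative):

```shell
# Encrypt a variables file; you will be prompted for a vault password
ansible-vault encrypt group_vars/all/secrets.yml

# Run a playbook that references the encrypted variables
ansible-playbook site.yml --ask-vault-pass
```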

Q34. How does docker make deployments easy ?

Answer: Docker is a containerization technology, an advance over virtualization. In virtualization, an OS has to be spun up before an application can be installed on the machine, and spinning up a virtual machine takes a lot of time; it carves space out of the physical hardware, and the hypervisor layer wastes a vast amount of resources just running virtual machines. After a VM is provisioned, every application needs to be installed with all of its dependencies, and dependencies can be missed even if you double-check; migrating applications from machine to machine is painful. Docker, by contrast, shares the underlying OS resources, the Docker engine is lightweight, and every application can be packaged with its dependencies: once tested, it works the same everywhere. Migration of an application, or spinning up a new one, becomes easy, because you only need to install Docker on the other machine; docker image pull and run do all the magic of spinning it up in seconds.

Q35. How can .NET applications be built using Jenkins?

Answer: .NET applications need Windows nodes to build on. Jenkins can use the Windows slave plugin to connect a Windows node as a Jenkins slave, using a DCOM connector for the master-to-slave connection, or you can use the Jenkins JNLP connector. The build tools and SCM tools used by the .NET application's pipeline need to be installed on the Windows slave; the MSBuild tool can be used to build the .NET application, which can then be deployed to a Windows host using a PowerShell wrapper inside Ansible playbooks.

Q36. How can you make a highly available Jenkins master-master solution without using any Jenkins plugin?

Answer: Jenkins stores all build information in the JENKINS_HOME directory, which can be mapped to NFS or SAN storage drivers or common network filesystems. You can implement a monitoring solution, using Nagios for example, to check that the node is alive; when it goes down, it can trigger an Ansible playbook or Python script to create a new Jenkins master on a different node and reload at runtime, or fail over to a passive Jenkins master already kept silent on another instance with the same JENKINS_HOME network file store.

Q37. Give the structure of a Jenkinsfile.

Answer: A Jenkinsfile starts with the pipeline directive. Inside the pipeline directive is the agent directive, which specifies where the build should run, and the next directive is stages, which contains a list of stage directives; each stage contains different steps. There are several optional directives, like options, which configures custom plugins used by the project or other triggering mechanisms, and the environment directive to provide environment variables.

Sample Jenkins file

pipeline {
    agent any
    stages {
        stage('Dockerbuild') {
            steps {
                sh "sudo docker build . -t pyapp:v1"
            }
        }
    }
}

Q38. What are the uses of integrating cloud with DevOps?

Answer: The centralized nature of cloud computing provides DevOps automation with a standard, centralized platform for testing, deployment, and production. Most cloud providers even offer DevOps technologies such as CI and deployment tools as a service, like CodeBuild, CodePipeline, and CodeDeploy in AWS, which makes DevOps practice easier and faster.

Q39. What is orchestration of containers, and what are the different tools used for orchestration?

Answer: When deploying to production, you cannot use a single machine, as that is not robust for any deployment. When an application is containerized, the application stack may run on a single Docker host in the development environment to check functionality, but on production servers that is not the case: you should deploy your applications across multiple nodes, with the stack connected between nodes. To ensure network connectivity between different containers, you would need shell scripts or Ansible playbooks across the nodes. Another disadvantage of such tooling is that you cannot run an efficient stack: an application may take up most resources on one node while another node sits idle most of the time, so the deployment strategy needs to be planned around resources, and load balancing of the applications must also be configured. To clear all these obstacles, the concept of orchestration arose: Docker containers are orchestrated across the different nodes in the cluster based on available resources and a scheduling strategy, and everything is expressed as DSL-specific files rather than scripts. Different orchestration tools are available on the market, among them Kubernetes, Swarm, and Apache Mesos.
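As an example of the DSL-style files mentioned above, a minimal (hypothetical) Kubernetes Deployment asks the orchestrator to keep three replicas running, scheduled across the cluster according to the declared resource requests:

```yaml
# Hypothetical Deployment: the Kubernetes scheduler places 3 replicas
# across cluster nodes based on the resources each pod requests.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          image: nginx:1.21
          resources:
            requests:
              cpu: "100m"
              memory: "128Mi"
```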

Q40. What is ansible tower?

Answer: Ansible, developed by Red Hat, provides IT automation and configuration management. Ansible Tower is the extended management layer created to organize playbooks using roles, to manage their execution, and even to chain any number of playbooks into workflows. The Ansible Tower dashboard provides a NOC-style UI showing the status of all Ansible playbooks and hosts.

Q41. Applications in which programming languages can be built by Jenkins?

Answer: Jenkins is a CI/CD tool and does not depend on any programming language for building an application; if there is a build tool to build the language, that is enough. Even when a plugin for the build tool is not available, you can use scripting (shell, PowerShell, or Python scripts) in your build stage to build an application in any language.

Q42. Why does almost every tool in DevOps have a DSL (Domain Specific Language)?

Answer: DevOps is a culture developed to address the needs of agile methodology, where the development rate is faster, so deployment should match its speed, and that needs the operations team to coordinate and work with the dev team. Everything could be automated with raw scripts, but that gets messy: the more use cases there are, the more scripts need to be written, and the pipelines become hard to organize. So the use cases adequate to cover agile needs were identified, tools were created around them, and customization happens on top of each tool through its DSL to automate DevOps practice and infrastructure management.

Q43. Which clouds can be integrated with Jenkins, and what are the use cases?

Answer: Jenkins can be integrated with different cloud providers for different use cases like dynamic Jenkins slaves, Deploy to cloud environments.

Some of the clouds can be integrated are

  • AWS
  • Azure
  • Google Cloud
  • OpenStack
Q44. What are Docker volumes and what type of volume should be used to achieve persistent storage?

Answer: Docker volumes are filesystem mount points created by the user for a container, and a volume can be shared by many containers. Different types of mounts are available: host bind mounts, named volumes, and volumes backed by external drivers such as AWS EBS, Azure disks, Google Cloud persistent disks, or even NFS and CIFS filesystems. A volume should be mounted from an external store to achieve persistent storage, because the lifetime of files inside a container lasts only as long as the container exists; if the container is deleted, the data is lost.

Q45. What are the Artifacts repository can be integrated with Jenkins?

Answer: Any kind of artifact repository can be integrated with Jenkins using either shell commands or dedicated plugins; examples include Nexus and JFrog Artifactory.

Q46. What are some of the testing tools that can be integrated with Jenkins, and what are their plugins?

Answer: Sonar plugin – can be used to integrate testing of Code quality in your source code.

Performance plugin – this can be used to integrate JMeter performance testing.

Junit – to publish unit test reports.

Selenium plugin – can be used to integrate with selenium for automation testing.

Q47. What are the build triggers available in Jenkins?

Answer: Builds can be run manually, or they can be triggered automatically by different sources, such as:

Webhooks – API calls from the SCM, fired whenever code is committed into the repository, or for specific events on specific branches.

Gerrit code review trigger – Gerrit is an open-source code review tool; whenever a code change is approved after review, a build can be triggered.

Trigger Build Remotely – remote scripts on any machine, or even AWS Lambda functions, can make a POST request to trigger builds in Jenkins.

Schedule Jobs – jobs can also be scheduled like cron jobs.

Poll SCM for changes – Jenkins looks for any changes in SCM at a given interval; if there is a change, a build is triggered.

Upstream and Downstream Jobs – a build can be triggered by another job that executed previously.

Q48. How to Version control Docker images?

Answer: Docker images can be version-controlled using tags; you can assign a tag to any image using the docker tag <image-id> command. If you push to a registry without tagging, the default tag latest is assigned; even if an image with the latest tag is already present, that image is demoted to an untagged image, and latest is reassigned to the newly pushed image.
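For example (the image ID and repository name are placeholders):

```shell
# Assign an explicit version tag to an image, then push that tag
docker tag 3f2a1b4c5d6e myrepo/pyapp:v1.2
docker push myrepo/pyapp:v1.2
```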

Q49. What is the use of Timestamper plugin in Jenkins?

Answer: It adds Timestamp to every line to the console output of the build.

Q50. Why should you not execute a build on the master?

Answer: You can run a build on the master in Jenkins, but it is not advisable. The master already has the responsibility of scheduling builds and collecting build outputs into the JENKINS_HOME directory, so if we also run builds on it, it additionally needs the build tools and workspaces for source code, which puts a performance overload on the system; and if the Jenkins master crashes, it increases the downtime of your build and release cycle.

Q51. Why devops?

Answer: DevOps is the market trend now; it follows a systematic approach to getting an application live to market. DevOps is all about tools that help in building the development platform as well as the production platform. Product companies are now looking at a code-as-a-service concept, in which development skills are used to create a production architecture with almost no downtime.

Q52. Why Ansible?

Answer: Ansible is a configuration management tool that is agentless. It works with key-based or password-based SSH authentication. Since it is agentless, we have complete control over how data is manipulated. Ansible is also used for architecture provisioning, as it has modules that can talk to the major cloud platforms. I have mainly used it for AWS provisioning and application/system configuration changes.

Q53. Why do you think a Version control system is necessary for DevOps team?

Answer: An application is all about code: if the UI is not behaving as expected, there could be a bug in the code. In order to track code updates, versioning is a must.
If a bug breaks the application, we should be able to revert to the last working codebase; versioning makes this possible.
Also, by keeping track of individual code commits, it is much easier to find the source of a bug in the code.
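The "revert to a working codebase" scenario above can be demonstrated with git in a throwaway repository (the file name and commit messages are illustrative):

```shell
# Create a temporary repo, commit a good and then a bad version, and revert.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev
echo "working code" > app.txt
git add app.txt
git commit -qm "good: initial version"
echo "broken code" > app.txt
git commit -qam "bad: introduces a bug"
git revert --no-edit HEAD        # undo the bad commit with a new commit
content=$(cat app.txt)
```

After the revert, `app.txt` contains the working version again, and the full history (including the bad commit) is preserved for debugging.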

Q54. What role would you prefer to be in the DevOps team?

Answer: Basically, the following roles are prominent in DevOps, depending on the skillset:
1. Architect
2. Version control personnel
3. Configuration control team
4. Build and integration management
5. Deployment team
6. Testing team
7. QA
8. Architecture monitoring team

Q55. Which of these roles would you take up?

Answer: In my opinion, everyone should aim to be an architect. With this course, I would fit roles 2 to 5. Everyone should understand the working of each role; DevOps is a collective effort rather than an individual one.

Q56. Suppose you are put in to a project where you have to implement devops culture, what will be your approach?

Answer: Before thinking of DevOps, there should be a clear-cut idea of what needs to be implemented, and this should be defined by the senior architect.
If we take the simple example of a shopping market:
The output of this business will be a website that displays shopping items online, plus a payment platform for easy payment.
Even though it looks simple, the background work is not that easy, because a shopping site must offer:
– 99.99% uptime
– Easy and fast processing of shopping items
– An easy and fast payment system
– Quick reporting to the shopkeeper
– Quick inventory management
– Fast customer interaction
and many more.
DevOps has to be implemented in each process and phase. Next come the tools used to bring the latest items to the website in a minimal time span. Git, Jenkins, Ansible/Chef, and AWS are familiar tools that help with continuous delivery to market.

Q57. Whether continuous deployment is possible practically?

Answer: Of course it is possible if we bring agility into every phase of development and deployment. The release, testing, and deployment automation has to be accurately fine-tuned for this to work.

Q58. What is Agility in devops basically?

Answer: Agile is an iterative process that finalizes the application by fulfilling a checklist. For any process there should be a set of checklists in order to standardize the code as well as the build and deployment process. The list depends on the architecture of the application and the business model.

Q59. Why scripting using Bash, Python or any other language is a must for a DevOps team?

Answer: Even though we have numerous tools in DevOps, there will be certain custom requirements in a project. In such cases we have to make use of scripting and then integrate it with the tools.
Q60. In AWS, how do you implement high availability of websites?

Answer: The main concept of high availability is that the website should be live all the time. We should avoid a single point of failure; to achieve this, a load balancer can be used. In AWS, we can implement HA with a load balancer combined with Auto Scaling.

Q61.How to debug inside a docker container ?

Answer: The “docker exec” command allows users to run commands (for example a shell) inside a running container to debug it.

Q62.What do you mean by Docker Engine ?

Answer: It is an open-source tool for building and managing containers.

Q63.Why we need Docker?

Answer: Applications are now built and deployed iteratively, following Agile methodology.
Docker helps deploy the same binaries, along with their dependencies, across different environments in seconds.

Q64.What do you mean by Docker daemon ?

Answer: The Docker daemon receives and processes incoming API requests from the CLI.

Q65.What do you mean by Docker client ?

Answer: The Docker client is a command-line tool – the docker binary – that communicates with the Docker daemon through the Docker API.

Q66.what do you mean by Docker Hub Registry ?

Answer: It is a public image registry maintained by Docker itself; the Docker daemon talks to it through the registry API.

Q67.How do you install docker on a debian Linux OS ?

Answer: sudo apt-get install docker.io

Q68.What access does docker group have ?

Answer: Users in the docker group have root-like access, so membership should be restricted just as we protect root.

Q69.How to list the packages installed in Ubuntu container ?

Answer: dpkg -l lists the packages installed in an Ubuntu container.

Q70.How can we check status of the latest running container?

Answer: The “docker ps -l” command lists the latest (most recently created) container.

Q71.How to Stop a container?

Answer: The “docker kill” command kills a container immediately (SIGKILL).

The “docker stop” command stops a container gracefully (SIGTERM first, then SIGKILL after a timeout).

Q72.How to list the stopped containers?

Answer: docker ps -a (the -a flag means “all”, so stopped containers are listed too)
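The container-lifecycle commands from the last few questions can be sketched together; the snippet is guarded because Docker may not be installed, and `<container-id>` is a placeholder:

```shell
# Inspect and stop containers (commands shown for illustration).
if command -v docker >/dev/null 2>&1; then
  docker ps -l || true   # most recently created container
  docker ps -a || true   # all containers, including stopped ones
  # docker stop <container-id>   # graceful: SIGTERM, then SIGKILL after a timeout
  # docker kill <container-id>   # immediate SIGKILL
fi
done_flag=yes
```

Prefer `docker stop` in normal operation so the process inside the container can shut down cleanly.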

Q73.What do you mean by docker image?

Answer: An image is a collection of files plus their metadata; those files form the root filesystem of the container.

An image is made up of layers; each change is recorded as an additional layer.

Q74.What is the differences between containers and images

Answer: An image is a read-only filesystem, whereas a container is a running instance of an image.
An image is non-editable; in a container we can make changes as we wish and save them as a new image.

Q75.How to do changes in a docker image?

Answer: You can’t change an image directly. Instead, you make changes in a Dockerfile, or in an existing container, to create a new layered image.

Q76.Different ways to create new images ?

Answer: docker commit: to create an image from a container
docker build: to create an image using a Dockerfile

Q77.Where do you store and manage images?

Answer: Images can be stored in your local docker host or in a registry .

Q78.How do we download the images?

Answer: Using “docker pull” command we can download a docker image

Q79. What are Image tags?

Answer: Image tags identify variants of a Docker image; “latest” is the default tag of an image.

Q80.What is a Dockerfile.?

Answer: A Dockerfile is a series of instructions used to build a Docker image.

The docker build command is used to build it.

Q81.How to build a docker file?

Answer: docker build -t &lt;image_name&gt; &lt;path-to-build-context&gt; (for example, docker build -t myimage .)
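The Dockerfile-and-build flow can be sketched end to end; the base image and instructions below are illustrative, and the build itself is guarded since Docker may be absent:

```shell
# Write a minimal Dockerfile, then build it if Docker is available.
dir=$(mktemp -d)
cat > "$dir/Dockerfile" <<'EOF'
FROM ubuntu:22.04
RUN apt-get update && apt-get install -y curl
CMD ["curl", "--version"]
EOF
if command -v docker >/dev/null 2>&1; then
  docker build -t myimage "$dir" || true   # -t names the resulting image
fi
```

Each instruction in the Dockerfile (FROM, RUN, CMD) produces or configures a layer of the final image.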

Q82.How to view history of a docker image?

Answer: The docker history command lists all the layers in an image, along with each layer’s creation date, size, and the command used to create it.

Q83.What are CMD and ENTRYPOINT?

Answer: Both define the default command executed when a container starts. ENTRYPOINT fixes the executable, while CMD supplies the default command or default arguments, which can be overridden at docker run time.
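The ENTRYPOINT/CMD interaction can be sketched in a tiny Dockerfile (the base image and commands are illustrative):

```shell
# ENTRYPOINT fixes the executable; CMD supplies overridable default arguments.
dir=$(mktemp -d)
cat > "$dir/Dockerfile" <<'EOF'
FROM alpine:3.19
ENTRYPOINT ["echo"]   # the container always runs echo
CMD ["hello"]         # default argument; "docker run img world" would print "world"
EOF
```

With this image, a plain `docker run` prints "hello", while any extra arguments replace CMD but still go through the fixed ENTRYPOINT.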

Q84.EXPOSE instruction is used for?

Answer: The EXPOSE instruction documents the ports on which the container listens; it does not publish them. Ports are actually published with the -p/-P flags of docker run.

Q85.What is Ansible?

Answer: A configuration management tool, similar to Puppet, Chef, etc.

Q86.Why to choose Ansible?

Answer: Ansible is simple and lightweight; it needs only SSH and Python as dependencies.

It does not require an agent to be installed on the managed nodes.

Q87.What are the ansible modules?

Answer: Ansible “modules” are predefined, small units of code that perform specific actions,

e.g. copy a file, start a service.

Q88.What are Ansible Tasks ?

Answer: Tasks are Ansible modules invoked with their arguments.

Q89.What are Handlers in ansible?

Answer: Handlers are triggered when a task reports a change of state,

e.g. restart a service when a property file has changed.

Q90.What are Roles in ansible?

Answer: Roles are re-usable tasks or handlers.

Q91.What is YAML?

Answer: YAML (“YAML Ain’t Markup Language”) is a way of storing data in a structured text format, like JSON.

Q92.What are Playbooks ?

Answer: Playbooks are Ansible’s “recipes”: YAML files describing which tasks run on which hosts.
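The task/handler/playbook concepts above can be sketched in one minimal playbook; the host group "webservers", the file paths, and the nginx service are placeholders, and the actual run is left commented since Ansible may not be installed:

```shell
# Write a minimal playbook: one task that notifies one handler.
cat > /tmp/site.yml <<'EOF'
- hosts: webservers
  become: yes
  tasks:
    - name: Copy nginx config
      copy:
        src: nginx.conf
        dest: /etc/nginx/nginx.conf
      notify: restart nginx
  handlers:
    - name: restart nginx
      service:
        name: nginx
        state: restarted
EOF
# ansible-playbook -i inventory /tmp/site.yml   # run against real hosts (requires Ansible)
```

The handler runs only if the copy task actually changed the file, which is exactly the change-of-state trigger described above.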

Q93.What is MAVEN ?

Answer: Maven is a Java build tool, so you must have Java installed to proceed.

Q94.What do you mean by validate in maven ?

Answer: validate checks whether the project information is correct and everything necessary is available.

Q95.What do you mean by compile in maven?

Answer: It is to compile the source code of the project

Q96.What do you mean by test in maven?

Answer: It is to test the compiled source code using a suitable testing framework.

Q97.What do you mean by package in maven?

Answer: It is to bundle the compiled code into its distributable binary format (e.g. a JAR or WAR).
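The four phases above run cumulatively in Maven's default lifecycle (`mvn package` implies validate, compile, and test). A guarded sketch, since Maven may not be installed:

```shell
# Maven lifecycle phases, shown as they would be invoked.
if command -v mvn >/dev/null 2>&1; then
  mvn -q validate 2>/dev/null || true   # fails harmlessly without a pom.xml
  # mvn package   # compile + test + package the binary under target/
fi
phases="validate compile test package"
```

In practice a CI job usually just calls `mvn package` (or `mvn verify`) and lets the earlier phases run implicitly.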

Q98.What is docker-compose?

Answer: Compose is to define and run a multi-container application

Q99.What is Continuous integration?

Answer: CI is about giving immediate feedback to the developer by building, testing, and analyzing the code on every commit.

Q100. What is Continuous delivery?

Answer: Continuous delivery is a continuation of CI which aims at automatically delivering the software all the way to pre-production.

Q101.What is Continuous deployment?

Answer: Continuous deployment is the next step after CI and CD, where the tested software is delivered to end customers automatically, after validation and change-management activities.

Q102.What is git?

Answer: git is a source code version management system .

Q103.What is git commit?

Answer: git commit records changes made to files in the local repository.

Q104.what is git push?

Answer: git push uploads local commits to the remote repository.

Q105.What’s git fetch?

Answer: git fetch pulls only the data from the remote repo but does not merge it with the repo on your local system.

Q106.What is git pull?

Answer: git pull downloads the changes from the remote repo and merges them with the files in your local system.

Q107.How to reset the Last git Commit ?

Answer: The “git reset” command can be used to undo the last commit (e.g. git reset --soft HEAD~1 drops the commit but keeps the changes staged).
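The git commands from the last few questions can be exercised in a throwaway repository (file name and messages are illustrative):

```shell
# Commit twice, then undo the last commit with a soft reset.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev
echo v1 > file
git add file
git commit -qm "first"
echo v2 > file
git commit -qam "second"
git reset --soft HEAD~1            # drop the "second" commit, keep "v2" staged
count=$(git rev-list --count HEAD) # only "first" remains in history
```

`--soft` keeps the working tree and index intact; `--hard` would also discard the "v2" change, so use it with care.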

Q108.What is the need for DevOps ?

Answer: Start the answer by explaining the general market trend: how releasing small features frequently compares to releasing big features rarely, and the advantages of releasing small features at high frequency.

Discuss about the topics such as

  • Increase deployment frequency
  • Lower failure rate of newer releases
  • Reduced time for bug fixes
  • Time to recovery
Q109. Write the key components of DevOps?

Answer: These are the key components of DevOps:

  • Continuous Integration
  • Continuous Testing
  • Continuous Delivery
  • Continuous Monitoring
Q110. What are the various tools used in DevOps?

Answer: DevOps consists of various stages, and each stage can be achieved with various tools. Below are the tools popularly used in DevOps:

  • Version Control : Git , SVN
  • CI/CD : Jenkins
  • Configuration Management Tools : Chef, Puppet, Ansible
  • Containerization Tool : Docker

Also mention any other tools you have worked on that helped you automate the existing environment.

Q111. What is Version Control?

Answer: A Version Control System records the changes that are made to files or documents over a period of time, so that specific versions can be recalled later.

Q112. What are the types of Version Control Systems?

Answer: There are two types of Version Control Systems:

  • Centralized Version Control Systems, e.g. SVN, CVS
  • Distributed/Decentralized Version Control Systems, e.g. Git, Mercurial
Q113. What is Jenkins? In Jenkins, which programming language is used?

Answer: Jenkins is an open-source automation tool whose purpose is Continuous Integration and Continuous Delivery.

Jenkins is written in the Java programming language.

Q114. Give an explanation about DevOps.

Answer: DevOps is a practice that emphasizes the collaboration and communication of software developers and the operations team. It focuses on delivering software products faster and lowering the failure rate of releases.

Q115. What are the key Principles or Aspects behind DevOps?

Answer: The key Principles or Aspects are

  • Infrastructure as code
  • Continuous deployment
  • Automation
  • Monitoring
  • Security
Q116. Describe the core operations of DevOps with Infrastructure and with application.

Answer: The core operations of DevOps are

Infrastructure

  • Provisioning
  • Configuration
  • Orchestration
  • Deployment

Application development

  • Code building
  • Code coverage
  • Unit testing
  • Packaging
  • Deployment
Q117. How “Infrastructure code” is processed or executed in AWS?

Answer: In AWS,

The infrastructure code is written in simple JSON format.
This JSON code is organized into files called templates.
The templates can be deployed on AWS and then managed as stacks.
Finally, the creating, deleting, and updating operations on the stacks are done by CloudFormation.
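The template-to-stack flow above can be sketched with a minimal hypothetical template declaring a single S3 bucket; the stack name and resource name are placeholders, and the AWS CLI call is left commented since it needs credentials:

```shell
# Write a minimal CloudFormation template in JSON.
cat > /tmp/stack.json <<'EOF'
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Resources": {
    "AppBucket": {
      "Type": "AWS::S3::Bucket"
    }
  }
}
EOF
# aws cloudformation create-stack --stack-name demo --template-body file:///tmp/stack.json
```

CloudFormation reads the Resources section of the template and creates, updates, or deletes the corresponding AWS resources as one stack.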

Q118. Which scripting language is most important for a DevOps engineer?

Answer: It is important to choose the simplest, most widely supported language; Python is the most suitable language for DevOps.

Q119. How DevOps helps developers?

Answer: Developers can fix bugs and implement new features quickly with the help of DevOps. DevOps also helps build clear communication among all team members.

Q120. Which are popular tools for DevOps?

Answer: Popular tools for DevOps are

  • Jenkins
  • Nagios
  • Monit
  • ELK (Elasticsearch, Logstash, Kibana)
  • Docker
  • Ansible
  • Git
Q121. What is the usefulness of SSH?

Answer: SSH is used to log into a remote machine and work on its command line; it also enables secure, encrypted communication between two untrusted hosts over an insecure network.
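Key-based SSH authentication, as used by tools like Ansible, can be sketched as follows; the remote host `user@remote-host` is hypothetical, so the connection itself is left commented:

```shell
# Generate an ed25519 keypair with no passphrase in a temporary directory.
dir=$(mktemp -d)
ssh-keygen -t ed25519 -N "" -f "$dir/id_ed25519" -q
# The public key ($dir/id_ed25519.pub) would be appended to
# ~/.ssh/authorized_keys on the remote host, after which:
# ssh -i "$dir/id_ed25519" user@remote-host
```

Key-based login avoids interactive passwords, which is what makes agentless automation over SSH practical.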

Q122. How you would handle revision (version) control?

Answer: I would keep the code in a version control host such as GitHub or SourceForge so that it is visible to everyone. I would also keep the checklist from the last revision to make sure any unsolved issues are resolved.

Q123. How many types of HTTP requests are there?

Answer: The types of HTTP requests are

  • GET
  • HEAD
  • PUT
  • POST
  • PATCH
  • DELETE
  • TRACE
  • CONNECT
  • OPTIONS
Q124.If a Linux-build-server suddenly starts getting slow what will you check?

Answer: If a Linux build server suddenly starts getting slow, I would check the following three areas:

System-level troubleshooting: issues related to RAM, disk I/O read/write, disk space, etc.

Application-level troubleshooting: check the application log file or application server log file, system performance issues, and web server logs – HTTP, Tomcat, JBoss, or WebLogic logs – to see whether application server response/receive times explain the slowness; also check for memory leaks in any application.

Dependent-services troubleshooting: issues related to antivirus, firewall, network, SMTP server response times, etc.

Q125. Describe the key components of DevOps.

Answer: The most important DevOps components are:

  • Continuous Integration
  • Continuous Testing
  • Continuous Delivery
  • Continuous Monitoring
Q126. Give example of some popular cloud platform used for DevOps Implementation.

Answer: For DevOps implementation popular Cloud platforms are:

  • Google Cloud
  • Amazon Web Services
  • Microsoft Azure
Q127. Describe benefits of using Version Control system.

Answer: A Version Control System allows team members to work on any file at any time.
All previous versions and variants are stored inside the VCS.
With a distributed VCS, the complete project history is stored on every clone, so if the central server breaks down you can use any team member’s local repository.
You can see the actual changes made to a file’s contents.

Q128. How Git Bisect helps?

Answer: git bisect helps you find the commit that introduced a bug by binary-searching the commit history.

Q129. What is the build?

Answer: A build is the process of putting source code together and checking that it works as a single unit. During the build process, the source code undergoes compilation, inspection, testing, and packaging.

Q130. What is Puppet?

Answer: Puppet is a configuration management tool that helps you automate administration tasks.

Q131.What is two-factor authentication?

Answer: Two-factor authentication is a security method in which the user provides two ways of identification from separate categories.

Q132. What is ‘Canary Release’?

Answer: It is a pattern that lowers the risk of introducing a new software version into the production environment: the new version is rolled out to a small subset of users in a controlled manner before being made available to the complete user base.

Q133.What are the important types of testing required to ensure new service is ready for production?

Answer: You need to run continuous testing – covering functional, performance, and security aspects – to make sure the new service is ready for production.

Q134. What is Vagrant?

Answer: Vagrant is a tool used to create and manage a virtual version of computing environments for tests and software development.

Q135. Usefulness of PTR in DNS.

Answer: PTR or Pointer record is used for reverse DNS lookup.

Q136. What is Chef?

Answer: Chef is a powerful automation platform for transforming infrastructure into code. With this tool you write scripts (recipes) that automate processes.

Q137. Prerequisites for the implementation of DevOps.

Answer: Following are the useful prerequisites for DevOps Implementation:

  • At least one Version Control Software (VCS).
  • Establish communication between the team members
  • Automated testing
  • Automated deployment
Q138. For DevOps success which are the best practices?

Answer: Here, are essential best practices for DevOps implementation:

  • The speed of delivery, meaning the time taken for any task to get into the production environment.
  • Track how many defects are found in the various phases.
  • It’s important to measure the actual or average time taken to recover in case of a failure in the production environment.
  • Get feedback from customers on bug reports, because this also affects the quality of the application.
Q139. How SubGit tool helps?

Answer: SubGit helps you migrate from SVN to Git. Using SubGit you can build a writable Git mirror of a local or remote Subversion repository.

Q140. Name some of the prominent network monitoring tools.

Answer: Some most prominent network monitoring tools are:

  • Splunk
  • Icinga 2
  • Wireshark
  • Nagios
  • OpenNMS
Q141. How do you know if your video card can run Unity?

Answer: Run the command
/usr/lib/nux/unity_support_test -p
It gives detailed output about Unity’s requirements; if they are met, your video card can run Unity.

Q142. How to enable startup sound in Ubuntu?

Answer: To enable startup sound

Click the control gear, then click on Startup Applications
In the Startup Application Preferences window, click Add to add an entry
Then fill in the information boxes: Name, Command, and Comment

/usr/bin/canberra-gtk-play --id="desktop-login" --description="play login sound"

Log out and then log in once you are done

You can use the shortcut key Ctrl+Alt+T to open a terminal.

Q143. Which is the fastest way to open an Ubuntu terminal in a particular directory?

Answer: To open an Ubuntu terminal in a particular directory, you can use a custom keyboard shortcut.

To do that, in the command field of the new custom shortcut, type gnome-terminal --working-directory=/path/to/dir.

Q144. How could you get the current colour of the current screen on the Ubuntu desktop?

Answer: Open the background image in GIMP (the image editor) and use the dropper tool to select the colour at a chosen point. It gives you the RGB value of the colour at that point.

Q145. How can you create launchers on a desktop in Ubuntu?

Answer: Press ALT+F2 and type “gnome-desktop-item-edit --create-new ~/Desktop”; this launches the old GUI dialog and creates a launcher on your desktop in Ubuntu.

Q146. Explain what Memcached is?

Answer: Memcached is a free, open-source, high-performance, distributed memory object caching system. Its primary objective is to improve response times for data that would otherwise have to be constructed or fetched from another source or database. Memcached avoids hitting a SQL database (or another source) repeatedly to collect data for simultaneous requests.

Memcached can be used for

  • Social Networking->Profile Caching
  • Content Aggregation -> HTML/ Page Caching
  • Ad targeting -> Cookie/profile tracking
  • Relationship -> Session caching
  • E-commerce -> Session and HTML caching
  • Location-based services -> Database query scaling
  • Gaming and entertainment -> Session caching

Memcache helps in

  • Makes application processes much faster
  • Simplifies the selection and eviction of cached objects
  • Reduces the number of retrieval requests to the database
  • Cuts down on I/O (input/output) access to the hard disk

Drawback of Memcached is

  • It is not a persistent data store
  • It is not a database
  • It is not application-specific
  • It cannot cache large objects
Q147. Mention some important features of Memcached?

Answer: Important features of Memcached includes

  • CAS Tokens: A CAS token is attached to an object retrieved from the cache. You can use that token to save your updated object.
  • Callbacks: They simplify the code.
  • getDelayed: It reduces the time your script spends waiting for results to come back from a server.
  • Binary protocol: You can use the binary protocol instead of ASCII with the newer client.
  • Igbinary: Previously, a client always had to serialize values with complex data; with Memcached you can use the igbinary option instead.
Q148. Is it possible to share a single instance of a Memcache between multiple projects?

Answer: Yes, it is possible to share a single instance of Memcache between multiple projects. Memcache is a memory store and can run on one or more servers. You can also configure your client to speak to a particular set of instances, so you can run two different Memcache processes on the same host, each independent of the other.

Q149. You have multiple Memcached servers, and one of them – holding your data – fails. Can you recover the key data from the failed server?

Answer: The data on the failed server cannot be recovered, but there is a solution for automatic failover, which you can configure across multiple nodes. Failover can be triggered on socket-level or Memcached-server-level errors, not on standard client errors such as adding an existing key.

Q150. How can you minimize the Memcached server outages?

Answer:

  • If you write the code to minimize cache stampedes, a single outage will have minimal impact
  • Another way is to bring up an instance of Memcached on a new machine using the lost machine’s IP address
  • Structuring the code so that the Memcached server list can be changed with minimal work is another option
  • Setting a timeout value is a further option that some Memcached clients implement; when a Memcached server goes down, the client keeps trying to send requests until the timeout limit is reached
Q151. How can you update Memcached when data changes?

Answer: When data changes you can update Memcached by

Clearing the cache proactively: clear the cache when an insert or update is made.
Resetting the cache: similar to the previous method, but instead of deleting the keys and waiting for the next request for the data to refresh the cache, reset the values right after the insert or update.

Q152. What is Dogpile effect? What is the prevention of this effect?

Answer: The Dogpile effect occurs when a cache entry expires and a website is hit by multiple client requests for it at the same time. It can be prevented with a semaphore lock: when the value expires, the first process acquires the lock and starts generating the new value, while the others wait or serve stale data.
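The semaphore-lock idea can be sketched with `flock` standing in for the semaphore; the lock file path is a placeholder, and only the lock-acquisition logic (not a real cache) is shown:

```shell
# Only the process that wins the non-blocking lock regenerates the value;
# losers would serve stale data instead of stampeding the backend.
lock=/tmp/cache.lock
if flock -n 9; then
  result="regenerated"    # we hold the lock: rebuild the cached value here
else
  result="served stale"   # another process is already rebuilding it
fi 9>"$lock"
```

With a single process and no contention, the lock is acquired and the "regenerated" branch runs; under load, all but one concurrent process take the stale branch.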

Q153. How Memcached should not be used?

Answer:

  • Use Memcached as a cache; don’t use it as a data store.
  • Don’t use Memcached as the only source of the information you need to run your application; you must always have another data source at hand.
  • Memcached is basically a key-value store and can’t perform queries over the data or iterate over the contents to extract information.
  • Memcached offers no security, either in encryption or authentication.
Q154. When a server gets shut down does data stored in Memcached is still available?

Answer: No. When a server shuts down and then restarts, the data stored in Memcached is deleted, because Memcached does not persist data.

Q155. What are the difference between Memcache and Memcached?

Answer:

  • Memcache: an extension that lets you work through handy object-oriented (OOP) and procedural interfaces. It is designed to reduce database load in dynamic web applications.
  • Memcached: an extension that uses the libmemcached library to provide an API for communicating with Memcached servers. It likewise reduces database load in dynamic web applications, and is the newer API.
Q156. Explain Blue/Green Deployment Pattern

Answer: The blue/green pattern addresses one of the hardest challenges in automatic deployment. In the blue/green deployment approach, you maintain two identical production environments; only one of them is live at any given point in time, and it is called the blue environment.

When the team has fully prepared a release, it conducts the final testing in the other environment, called the green environment. Once verification is complete, the traffic is routed to the green environment.

Q157. What are the containers?

Answer: Containers are a form of lightweight virtualization that create isolation among processes.

Q158. What is post mortem meeting with reference to DevOps?

Answer: In DevOps, a post-mortem meeting takes place to discuss what went wrong during the process and how to fix those mistakes.

Q159. What is the easiest method to build a small cloud?

Answer: VMfest is one of the best options for building an IaaS cloud from VirtualBox VMs in little time. But if you want a lightweight PaaS, then Dokku is a better option, because Dokku is basically a bash script that provides a PaaS out of containers.

Q160. Name two tools you can use for docker networking.

Answer: You can use Kubernetes and Docker swarm tools for docker networking.

Q161. Name some of DevOps Implementation area

Answer: DevOps is used across production, production feedback, IT operations, and software development.

Q162. What is CBD?

Answer: CBD or Component-Based Development is a unique way to approach product development. In this method, Developers don’t develop a product from scratch, they look for existing well defined, tested, and verified components to compose and assemble them to a product.

Q163. Explain Pair Programming with reference to DevOps

Answer: Pair programming is an engineering practice from the Extreme Programming rules. In this process, two programmers work on the same system, on the same design/algorithm/code.

They play two different roles: one as the “driver”, the other as the “observer”. The observer continuously reviews the progress of the work to identify problems. The two can swap roles at any step of the program.
