MVC Architecture in Web Applications Development


A leading provider of publicly recorded tax, deed, mortgage, and foreclosure data, along with proprietary neighborhood and parcel-level risk data, for more than 150 million U.S. properties. They also provide all types of foreclosure listings (pre-foreclosure, auction, bank-owned), as well as current for-sale and recently sold properties, in 2,200 counties across the nation.


1. Our application needs asynchronous communication on the backend.
2. Manipulation of data is mostly on the client side (browser).
3. Same type of data is being delivered in different ways on a single page (navigation).


We implemented each module separately as a smaller ASP.NET MVC project. To provide more flexibility in module development, we rewrote the MVC render engine methods, moving from the page view rendering flow to the system page controller processing flow. We also used many jQuery plugins (such as jQuery UI, jqGrid, and an Ajax file uploader) to implement the system UI, and used jQuery's JSON support to improve the interaction between client and service.

Technology and Tools:

Web Technologies: .NET MVC Framework, HTML, CSS, jQuery, Ajax, MSSQL


a. MVC architecture helps us to control the complexity of application by dividing it into three components i.e. model, view and controller.
b. Test driven development approach is supported by MVC architecture.
c. The front controller provides rich routing support for designing our web application.


Our applications are unlike a normal web page: they feature more user interaction and need to communicate with a backend server in real time. Handling this behavior with an MVC framework let us produce structured, maintainable, and testable code more efficiently, and it also makes our applications scalable, transparent, and portable.
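The separation into model, view, and controller can be illustrated with a minimal, framework-agnostic sketch (the class and method names below are illustrative, not taken from the actual project):

```python
# Minimal MVC sketch: the model owns the data, the view renders it,
# and the controller mediates between the two.

class PropertyModel:
    """Model: holds the data and nothing else."""
    def __init__(self):
        self._records = []

    def add(self, address, price):
        self._records.append({"address": address, "price": price})

    def all(self):
        return list(self._records)


class PropertyView:
    """View: turns model data into a presentation string."""
    def render(self, records):
        return "\n".join(f"{r['address']}: ${r['price']:,}" for r in records)


class PropertyController:
    """Controller: receives 'requests' and coordinates model and view."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def create(self, address, price):
        self.model.add(address, price)

    def index(self):
        return self.view.render(self.model.all())


controller = PropertyController(PropertyModel(), PropertyView())
controller.create("12 Oak St", 250000)
page = controller.index()  # "12 Oak St: $250,000"
```

Because each concern lives in its own class, the model can be unit-tested without a browser and the view swapped out without touching business logic, which is what makes the architecture test-driven-development friendly.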

Benefits by implementing ISO 27001:2013


ISO 27001:2013 helped us understand our internal risks and build a framework around them that enables our security policies and controls, directs people and processes to identify and protect information, and ensures IT governance.

Information Security Challenges:

• Due to advances in technology and communication, it is increasingly difficult to ensure that information is delivered in a way that preserves its integrity.
• In an information-sharing environment, awareness of information is paramount; information that is sensitive or confidential in nature must be kept private.
• ISO/IEC 27001:2013 is the only auditable standard which defines the requirements for an Information Security Management System (ISMS). The standard is designed to ensure the selection of adequate and proportionate security controls.

Customer Needs:

The majority of our customers are in the finance, mortgage and real-estate, satellite and telecom, healthcare, and government segments, which are very sensitive to information and cyber security, data privacy, and data-sharing norms. Trusted sites and security will play an important role in the digital age, where data protection and privacy carry compliance obligations to meet, making excellence a habit.

Our Solutions:

• We systematically evaluate our information security risks, taking into account the impact of threats and vulnerabilities.
• We have an overarching management process to ensure that the information security controls meet our needs on an ongoing basis.
• Information security audit is a systematic, measurable technical assessment of how the organization's security policy is employed and is performed on quarterly basis.
• Security audits provide a fair and measurable way to examine how secure an organization really is.

Customer Benefits derived from ISMS:

• Customers can feel confident in our commitment to keeping their information safe.
• Sets them apart from their competitors in the marketplace.
• Ensured their compliance with the ISO/IEC 27001:2013 standard.
• Reduced the risk and cost of single or multiple security breaches.
• Ensured more dependable availability of both hardware and data.
• Improved employee awareness of security issues and their responsibilities within the organization.

Making Responsive Website


The Client is a multi-sourced national property data warehouse that contains tax, deed, mortgage, foreclosure, environmental risk, natural hazard, health hazard, neighborhood characteristics, and property characteristic data for over 155 million U.S. properties, delivering actionable data to customers.


Converting the existing non-responsive website into a responsive website for various resolutions (be it on a desktop, mobile, or tablet) without changing the existing behaviour, navigation, content, or images, and without changing the existing CSS files.


Web Technologies: HTML, Bootstrap 4, CSS 3


Using Bootstrap 4, we converted the non-responsive website into a responsive website for resolutions ranging from 320px to 2580px.


1. Cost effectiveness:
You will only need to invest in a single site design to appeal to all visitors and all devices.

2. Save time on site management:
Clients will also find it much easier and less time-consuming to manage and maintain a single site.

3. Flexibility:
When you have a website with responsive design, you can make changes quickly and easily. You do not need to worry about making changes on two websites. This flexibility is a huge advantage when you just want to make a quick design tweak or fix a typo on your site—you only have to do it once.

4. Improved user experience:
User experience is crucial to website owners. You want people to like your site, and you want it to be easy to use to convince them to come back. If someone visits your website on a mobile device, and it takes forever to load or your pictures do not have the proper resolution, it can make your company appear unprofessional. No one wants to do business with a place that is unprofessional. But responsive design, which offers a much better user experience, can help convince people to give your company a chance. Because zooming and scrolling will be eliminated, content can be viewed quicker, and the overall impression that visitors have will be much more positive.



About Swift:

Swift is a fantastic way to write software, whether it’s for phones, desktops, servers, or anything else that runs code. It’s a safe, fast, and interactive programming language that combines the best in modern language thinking with wisdom from the wider Apple engineering culture and the diverse contributions from its open-source community. The compiler is optimized for performance and the language is optimized for development, without compromising on either.


Swift defines away large classes of common programming errors by adopting modern programming patterns:

• Variables are always initialized before use.
• Array indices are checked for out-of-bounds errors.
• Integers are checked for overflow.
• Optionals ensure that nil values are handled explicitly.
• Memory is managed automatically.
• Error handling allows controlled recovery from unexpected failures.


On June 2nd, 2014, Apple unveiled a brand-new object-oriented programming language: Swift, a replacement for Objective-C, which had been up to that point the standard programming language for OS X and iOS application development. But Apple hasn't abandoned Objective-C; in fact, they have given developers the option to choose the language in which they want to develop their software. Choosing between the two has been one of the most challenging decisions many iOS developers face.

Overcome the Challenge:

If we are not thoroughly following the latest tech trends and beta releases, adapting to a new programming language all of a sudden is very difficult. Apple made beta releases available to developers so that they could get good hands-on experience by the time the actual version was released.

There is a famous saying by Steve Jobs related to technology growth

"You can't just ask customers what they want and then try to give that to them. By the time you get it built, they'll want something new."


Swift was built with performance in mind. Its simple syntax and developer guidance not only help you develop faster; better performance also means better apps.


Following and using the latest technologies and resources is key to success for any Business/Organization.

Failover management using NC clustering


Client Situation:

A Network Controller comes into the picture when a huge number of users are trying to access the internet. The Network Controller is a node that controls the traffic of users connecting to the internet and allocates appropriate bandwidth. A problem arises when the Network Controller fails to do so, so for failover management we use Network Controller clustering as a high-availability feature.
Our QA consultants undertook an assessment initiative to identify the current challenge and design the optimal solution for the client’s needs.

Challenges faced by us:

At the beginning of the engagement, we were required to address these issues:

• Configuration limitation of bringing multiple Network Controllers together in a cluster form.
• Identifying the right automation tool and configuration method to handle the failover of the Network Controller.

Our Solution:

To overcome the above challenges, we designed and implemented a feasible test strategy:

• We used the concept of Network Controller clustering: three Network Controllers instead of one, with the same hosts configured on all three.
• We used the leader-follower method: of the three Network Controllers, one is assigned as the leader and the other two as followers. If a failover happens, control moves from the leader Network Controller to one of the follower Network Controllers, so the functionality of the setup is not affected and the traffic continues to be handled efficiently.
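The leader-follower failover can be sketched as a simple model (names and health flags here are illustrative; in the real setup the orchestration is driven by Ansible playbooks and Jenkins):

```python
class NetworkController:
    """One controller node in the cluster."""
    def __init__(self, name):
        self.name = name
        self.healthy = True


class Cluster:
    """Three controllers sharing the same host configuration:
    one leader handles traffic, two followers stand by."""
    def __init__(self, names):
        self.controllers = [NetworkController(n) for n in names]
        self.leader = self.controllers[0]

    def handle_failover(self):
        """If the leader is down, promote the first healthy follower,
        so traffic handling continues without interruption."""
        if not self.leader.healthy:
            for nc in self.controllers:
                if nc.healthy:
                    self.leader = nc
                    break
        return self.leader


cluster = Cluster(["nc1", "nc2", "nc3"])
cluster.leader.healthy = False           # simulate the leader failing
new_leader = cluster.handle_failover()   # control moves to a follower
```

Because all three controllers are configured with the same hosts, the promoted follower can take over immediately with no reconfiguration step.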

How DT delivered business value:

Failover handling helped avoid roughly 90% of interruptions in network availability:

• Reduced the chances of business impact to the customers.
• Made the systems more fault-tolerant.
• Highly reduced the manual efforts on failover handling situations.

How clients benefitted from it:

• Avoided network availability issues for their end customers, which improved their business.

Various Technologies/Tools we use:

• Ansible playbooks and Jenkins.

Application build and deployment through CI/CD


• CI stands for Continuous Integration, a software development practice in which all developers merge code changes into a central repository multiple times a day.
• CD stands for Continuous Delivery, which adds, on top of Continuous Integration, the practice of automating the entire software release process.
• A CI/CD pipeline helps you automate steps in your software delivery process:
- Initiating code builds.
- Running automated tests.
- Deploying to a staging or production environment.
• Automated pipelines remove manual errors, provide standardized development feedback loops and enable fast product iterations.


• Automating the source code build when the developer releases a new version of the source code.
• Making the source code checkout details from Job 1 available in Job 2 when sending the consolidated build report.

How did we overcome these challenges?

• We added a new version.txt file alongside the source code; with each release of a new source code version, the Dev team updates version.txt before committing to GitHub. We used the "Polling ignores commits in certain paths" feature in Jenkins to perform the build only when a commit touches version.txt.
• The Parameterized Build option in Jenkins solved the second challenge: we passed the GIT_COMMIT environment variable from Job 1 as a parameter to the second job, which satisfied the requirement.
• We analyzed the existing project workflow to understand how the build and deployment process was being done manually.
• We researched suitable DevOps tools for automating the project workflow.
• We implemented the tools in the project environment to achieve the required functionality, and performed multiple levels of testing to make sure everything works as expected.
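The version.txt gate can be illustrated with a small sketch (a hypothetical helper, not the actual Jenkins configuration): a build is triggered only when the committed file set includes version.txt, so ordinary source commits do not spawn builds.

```python
def should_trigger_build(changed_files):
    """Mimic Jenkins' path-based polling filter: run the build only
    when version.txt is among the files changed by the commit."""
    return "version.txt" in changed_files


# A commit that only touches source files is ignored...
skip = should_trigger_build(["src/App.java", "src/Util.java"])

# ...while a release commit that bumps version.txt triggers the build.
run = should_trigger_build(["src/App.java", "version.txt"])
```

This keeps the pipeline quiet during day-to-day development and fires exactly once per release, when the Dev team bumps the version file.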

Benefits we provided to the development team:

After implementing the current CI/CD pipeline for the deployment process:

• The development team has been happy with the automated build process, and application deployments are now easy to handle without a lot of effort.
• The application is stable, and application downtime has been minimized since the pipeline deployment process was implemented.

Various technologies and tools we use for implementation:

Scripting languages: shell script
Tools:
1. Jenkins (continuous integration tool)
2. Maven (application code build tool)
3. Docker (containerization tool)
4. Docker Hub (image registry)
5. GitHub (code repository)

Network Speed Test Tool Using Selenium-Python


Getting various network speed parameters for streaming online videos and loading websites for desktop and mobile for a leading US based telecommunications company

Client Situation:

As part of the client's initiative to enhance the efficiency of its internet service for customers, a good number of enhancements were made to the application. To verify whether the end-user experience had improved after these changes, the client needed us to develop a tool that could capture various network parameters while streaming a video or loading a website over the internet. The toolkit would take as input the URL and the network parameters to be captured, and was expected to run on either a laptop or a mobile device across platforms such as Windows, Linux, Mac, Android, and iOS.

Based on the expected parameter values set, the tool kit was supposed to indicate if the expected results are achieved.

Our Solution:

• We used the Selenium framework with Python as the scripting language to develop functions that return the various network parameters. We created libraries for different websites that stream videos and load pages, reporting the network-speed-related parameters for the desktop environment.
• We built the same libraries for mobile network speed testing using "Appium", a tool similar to Selenium that allowed us to simulate a mobile device and run all the tests. We also added libraries for initiating Skype calls and checking network stability.
• This tool not only gives upload and download speeds like other speed-test tools, but also reports other network parameters such as streaming-video buffer health, frame drop count, DNS resolution time, website connection time, and more.
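A sketch of how parameters like DNS resolution time and connection time can be derived: the browser exposes Navigation Timing marks, which Selenium can read via JavaScript (in a live run they would come from something like `driver.execute_script("return window.performance.timing")`; the values below are hard-coded for illustration):

```python
def network_metrics(timing):
    """Compute network parameters (in ms) from Navigation Timing marks."""
    return {
        "dns_resolution_ms": timing["domainLookupEnd"] - timing["domainLookupStart"],
        "connection_ms": timing["connectEnd"] - timing["connectStart"],
        "page_load_ms": timing["loadEventEnd"] - timing["navigationStart"],
    }


# Example timing marks as a browser would report them (epoch milliseconds):
timing = {
    "navigationStart": 1000,
    "domainLookupStart": 1010,
    "domainLookupEnd": 1045,   # DNS lookup took 35 ms
    "connectStart": 1045,
    "connectEnd": 1120,        # TCP connect took 75 ms
    "loadEventEnd": 3400,      # full page load took 2400 ms
}
metrics = network_metrics(timing)
```

The same computation works unchanged for mobile runs, since Appium drives the same WebDriver protocol as Selenium.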

How DT delivered business value:

• We used open-source technology to develop the tool, which was cost effective.
• Identified network issues and the stability of the network more accurately.
• Because the tool gave precise values, our client could modify and configure their network according to accurate network speed measurements and provide a better internet experience to their customers.

Various Technologies and Tools we use for Automation:

Scripting languages: Python
Tools: Selenium WebDriver, Appium, Android SDK

Multi-Tenancy using Django


Implementation of Django framework using Python for SaaS(Software as a Service) Application

The term "software multi-tenancy" refers to a software architecture in which a single instance of the software runs on a server and serves multiple tenants. Systems designed in such a manner are often called shared. A tenant is a group of users who share common access, with specific privileges, to the software instance. With a multitenant architecture, the software application is designed to provide every tenant a dedicated share of the instance, including its data, configuration, user management, tenant-specific functionality, and non-functional properties. Multi-tenancy contrasts with multi-instance architectures, where separate software instances operate on behalf of different tenants.


One of the main challenges we faced was that we wanted to split the software among several clients in such a way that each client's data is accessed separately. We wanted to give each client separate access to the application, so that they feel they are using a website of their own. Implementing multi-tenancy with our previous technology was complex, so we opted for Django, which makes the multi-tenancy feature easy to implement.

Implementation Strategy used:

The approach used is: Single database and multiple schemas, where
• Each tenant (customer) will have their own schema within the database.
• Each schema will have a set of tables related to a single tenant (customer).
• If any data related to one particular tenant is updated then this update will be reflected in the schema of that tenant.
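The schema-per-tenant idea reduces to routing every request to its tenant's schema before any query runs. A simplified sketch (PostgreSQL-style `search_path` SQL shown for illustration; in the actual project this routing is handled inside the Django stack):

```python
def schema_for(tenant):
    """Each tenant (customer) gets its own schema inside the single
    shared database; the prefix keeps schema names unambiguous."""
    return f"tenant_{tenant}"


def activate_tenant_sql(tenant):
    """SQL issued before a tenant's queries run, so that all table
    lookups resolve inside that tenant's schema only."""
    return f"SET search_path TO {schema_for(tenant)}"


# Two tenants hit the same application instance but different schemas,
# so an update for one tenant is reflected only in that tenant's tables:
sql_a = activate_tenant_sql("acme")
sql_b = activate_tenant_sql("globex")
```

Because isolation happens at the schema level rather than with separate databases, adding or removing a tenant is a schema operation, which keeps server infrastructure and cost down.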

Technology:

The Django framework is used to implement the multi-tenancy feature using Python. The Django REST Framework is used for building web APIs because of its simplicity, flexibility, quality, and security.

Business Benefits:

• Easy to add, delete, or modify tenant details in the database.
• Provides data privacy between tenants.
• Reduces the complexity of server infrastructure, and thereby the cost.

Unlock the Power of Salesforce CRM


It’s exciting when organizations grow, however, growth can also cause an array of challenges.

• The higher operational cost and maintenance involved in the existing spreadsheet-based approach for managing customer support queries was proving increasingly problematic.
• Integrating multiple applications, on demand and on local servers, to align key business functions in support of sales.
• This resulted in a lack of timely responses to customers, failure to meet SLAs, and limited case visibility.
• Salespeople had a poor understanding of customer issues at an individual or company level.


We’ll analyze your specific needs and provide a customized implementation to overcome such challenges.

• Reviewing the business and developing a long-term plan for success by identifying the key problems.
• Moving from legacy software and systems.
• Staying on the cutting edge of Salesforce with our innovation.
• Providing a unified user experience by multi-app integration with a “single pane of glass” view.

Technologies used:

Sales Cloud empowered their sales team by leveraging best practices throughout the company.

Salesforce CPQ delivers fast and accurate proposals to boost sales and maximize company revenue.

Salesforce Billing seamlessly integrated the entire billing process with Salesforce.

Conga Composer delivers sophisticated documents, presentations, and reports from Salesforce.


Data Template's approach is to recognize the ultimate goal and to ensure reliable operation for mission-critical applications across teams. To enable hassle-free services, we study the business closely and provide a unique business solution that is simple, not complex.

Business Benefits:

Access to the entire historical data in a centralized system.

Ensured proposals are created accurately, approved quickly, and transmitted to the customer easily - all within the Salesforce CRM.

Automated the billing process to speed up payment collection and improve the cash flow.

Streamlined automated document generation and reporting save time and cost.

Enhanced Customer Satisfaction Using ISO 9001:2015

ISO 9001:2015 has enabled Data Template to achieve international recognition as a company that consistently delivers high-quality services using emerging technologies.

Data Template has become more efficient and effective in numerous ways. Quality auditing is performed quarterly and continues to spotlight opportunities for process improvements. After the audits, non-conformities, i.e. deviations from the quality process, are tracked down via the traditional root-cause approach, enabling the company to learn from its errors and perform the process in a standard way.

Quality Management Audits

Quality Management audits are structured reviews of standard management activities, such as the ISO 9001:2015 implementation process, that facilitate and establish lessons to be learned, which may improve performance on current or future project activities. These quality audits are carried out by the company on itself to confirm to management that the documented quality management system is working effectively.

This Quality Management System (QMS) method also ensures that customer feedback is captured. Feedback from customers is collected at a defined frequency, generally on a yearly basis. Based on the feedback received, the company implements new strategies that help uphold the quality of service and the customer relationship.

The intention of quality audits is to examine how the project is using its internal processes like ISO implementation to produce the service it will release to the clients.

A major goal is to discover ways to improve the techniques and processes that create the products and services. If issues are detected during the standard audits, corrective action on the tools, processes, and procedures will be necessary to guarantee that quality is re-established.

API Test Automation with Python


The Client is a multi-sourced national property data warehouse that contains tax, deed, mortgage, foreclosure, environmental risk, natural hazard, health hazard, neighborhood characteristic, and property characteristic data for over 155 million U.S. properties, delivering actionable data to customers.


With ever-growing APIs, our client constantly has to run regression tests on a massive set of APIs and test the new ones. Their QA staff was overstretched and did not have time to develop test cases for the new APIs.
Moreover, our client provides services to various customers, who often request changes to how an API behaves. The client is responsive to these requests, but incorporating them requires additional testing.

Technologies Used:

MS Excel, pytest Framework, Python.

The Process

• Get familiar with the APIs.
• Create a pytest framework and prepare the scripts that will make assertions against the database and the API responses.
• Transfer knowledge to the client.

Data Template Helped

We decided that automating API testing would help our client to run the regression tests efficiently and ease the QA workload.
In the beginning, we learned about their APIs, and we got an understanding of how to use their APIs in their processes.
Then we started automating their APIs by creating Python scripts that make assertions against the database and the API responses. Manually comparing the API response fields with the database would be time-consuming, since a large number of fields must be compared.
In the end, we handed over the framework and part of the automated regression test to our client.
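The database-vs-API assertion can be sketched along these lines (field names are hypothetical; in the real scripts the two dictionaries come from a live API call and a database query):

```python
# Fields whose API value must agree with the database record.
FIELDS_TO_COMPARE = ["apn", "owner_name", "assessed_value"]


def compare_fields(api_response, db_record):
    """Return the list of fields whose API value differs from the
    database; an empty list means the API response is consistent."""
    return [f for f in FIELDS_TO_COMPARE if api_response[f] != db_record[f]]


# Stubbed data for illustration; the real framework pulls these from
# the live API and from a SQL query against the property database.
api_response = {"apn": "123-45", "owner_name": "J. Smith", "assessed_value": 250000}
db_record    = {"apn": "123-45", "owner_name": "J. Smith", "assessed_value": 250000}


def test_api_matches_database():
    """pytest-style regression check: every compared field must match."""
    assert compare_fields(api_response, db_record) == []
```

Adding a new field to the regression suite is then a one-line change to `FIELDS_TO_COMPARE`, which is why new fields can be covered with only small script edits.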


• Automating the API response comparison let us check multiple API fields at once, which would be time-consuming manually.
• Helped us identify issues much faster than ever before.
• Any newly added fields can be tested easily by making only small changes to the scripts.

Selenium Python Automation


One of the main challenges our clients have faced during application production releases is certifying the application by completing new-feature testing along with manual regression testing within a short period of time. The number of manual test cases is large, and executing them is very time-consuming. To overcome this difficulty, for every production release, the client asked us to automate their testing process so they could manage production releases effectively.


We proposed a Selenium framework that uses Python as the programming language to generate test scripts.

The components we used are Robot Framework, a customized issue-log tracking report within the automation tool, and an automatic email setup for sending reports to track test results.

The main advantages of using Python are its faster execution time and its ability to run tests across different browsers, with support for parallel execution. Python is also open source.


Using this automation framework and the generated test scripts, the client can manage their releases effectively. We were also able to lower development costs for our client.

Achievement of WCAG-2.0


The Client is a multi-sourced national property data warehouse that contains tax, deed, mortgage, foreclosure, environmental risk, natural hazard, health hazard, neighborhood characteristic, and property characteristic data for over 155 million U.S. properties, delivering actionable data to customers.


Making websites easily accessible to people with physical disabilities, i.e. blindness (including low vision and photosensitivity), deafness and hearing loss, learning disabilities, and speech disabilities.

Our Solution:

To make the application accessible to people with physical disabilities, we adopted WCAG 2.0, which is referenced by the Americans with Disabilities Act (ADA). The Web Content Accessibility Guidelines (WCAG) 2.0 are a technical guideline explaining how to make web content more accessible to people with disabilities. Web content in this context generally refers to the information on a web page or web application, including text, images, sounds, and the code or markup that defines structure and presentation.

WCAG 2.0 has three levels of conformance, beginning with Level A (the minimum), Level AA, and finally Level AAA. WCAG 2.0 guidelines are organized under four principles: perceivable, operable, understandable and robust.

We used tools such as the NVDA screen reader and the WAVE tool. The WAVE tool pinpoints the exact areas of the application where the code needs to be updated per the WCAG 2.0 guidelines (Level A, Level AA, Level AAA). The NVDA screen reader is the tool blind users rely on to access the web application, so we used it while updating the code to discover any content, images, sounds, or links that were not announced by the screen reader.
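The kind of gap these tools catch can be illustrated with a small check of our own (a sketch built on Python's `html.parser`, not the WAVE tool itself): an image without alt text is invisible to a screen reader, which violates WCAG 2.0's perceivable principle.

```python
from html.parser import HTMLParser


class MissingAltChecker(HTMLParser):
    """Collect <img> tags that lack the alt attribute WCAG 2.0 expects."""

    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            # Record the image source so the offending markup is easy to find.
            self.missing.append(attributes.get("src", "?"))


html = '<img src="logo.png" alt="Company logo"><img src="chart.png">'
checker = MissingAltChecker()
checker.feed(html)
# chart.png carries no alt text, so NVDA would announce nothing for it.
```

Running checks like this over every page makes it easy to find the spots where the markup must be updated before verifying the result with NVDA.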

Technology and Tools:

Web technologies: .NET MVC framework, HTML, CSS, jQuery

Tools: NVDA screen reader and WAVE tool


A company that incorporates accessibility using the WCAG 2.0 guidelines ensures that their brand’s digital properties are globally accessible, and reduces the need for resources to meet geographic-specific standards. We were able to lower development costs for our client. They found the project provided a significant return on investment as well.

As responsiveness is part of the WCAG 2.0 guidelines, we made the existing application responsive and retired the separately developed mobile application. This benefited the client in terms of revenue and usage (the number of people accessing the application), and left a single codebase for the entire application.

Achievement of Data Accuracy


The Client is a US-based provider of clothing, footwear, and accessories for women, through stores and e-commerce in the retail sector. We migrate data from the source system to the target system, and from this data we generate reports. The customer team analyzes these reports for further business improvement.


Finding data mismatches, swapped data, incomplete data, or null values in the target system after loading data from the source system.

Our Solution:

We achieve data accuracy by comparing the source system data with the target system: we write a query to fetch data from the source and an equivalent query to fetch data from the target.

We executed both query sets against the source and target systems, fetched the data, and compared the results in a tool like Excel. After comparison, the data and row counts should be the same in both the source and target systems.
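The comparison step can be sketched as a simple in-memory reconciliation (illustrative row data; in practice the two row sets come from the equivalent SQL queries against source and target):

```python
def compare_datasets(source_rows, target_rows):
    """Reconcile source vs. target: report row counts plus any rows
    missing from, or unexpectedly present in, the target system."""
    missing = [r for r in source_rows if r not in target_rows]
    extra = [r for r in target_rows if r not in source_rows]
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": missing,
        "extra_in_target": extra,
    }


# Example: one row failed to load into the target system.
source = [("SKU1", 10), ("SKU2", 20), ("SKU3", 30)]
target = [("SKU1", 10), ("SKU2", 20)]
result = compare_datasets(source, target)
```

A clean load is one where both counts match and both mismatch lists are empty, which mirrors the count-and-compare check otherwise done by hand in Excel.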

Technology and Tools:

Integration Tool: SSIS

Database: Oracle, SAP Hana Studio, SQL server 2017 and HIVE

Reporting Tool: SAP BO, Power BI, SSRS and MS-Excel


Since business users can quickly access critical data from a number of sources they can rapidly make informed decisions on key initiatives. Data and analytics have become indispensable to businesses to stay competitive. Businesses use reports, dashboards, and analytics tools to extract insights from their data, monitor business performance, and support decision making.

Providing Business Solution for Calculation of Availability of Services for a Leading US Based Network Telecom Company


The client was in a situation where any number of users from their thousands of customer companies needed to see the availability of several types of services, both at a point in time and over a time range, as an availability percentage and as a graphical representation. As this implementation was a critical challenge for the client organization, we had detailed discussions with the client to understand the existing system and its present functionality before proposing an approach.

Our Solution:

We first analyzed possible implementation approaches by studying the client's present system and its data-bus stream details. We decided to implement a framework whose output is a set of APIs that services can use to register their metrics on the data bus; these metrics are then monitored to calculate daily, weekly, and monthly service availability numbers and to present them graphically.


• Created a framework containing a settings file for the environment setup and other scripts, with the framework infrastructure built using AWS CloudFormation.

• Implemented the register-service and availability-calculation functionality using AWS Lambda functions.

• Analyzed and chose DynamoDB as a suitable database for storing the continuous data in a well-defined table structure.

• Added extra security to the user APIs by integrating AWS IAM authentication.

• Provided modifiable settings and an infrastructure build file that can be deployed in different AWS regions to obtain different API URLs for the different availability zones.

• Added an MTTR (Mean Time to Repair) feature, which helps the user identify the failure metrics for a particular time period and is most helpful for assessing the health of a service.

• Implemented MTBF (Mean Time Between Failures) to predict the elapsed time between inherent failures of the service during normal operation.

• Integrated Grafana to display the availability graph for the registered services.
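The availability, MTBF, and MTTR numbers reduce to simple formulas over uptime and failure data. A sketch with illustrative inputs (in the real framework the raw metrics are read from DynamoDB inside Lambda functions):

```python
def availability_pct(uptime_s, total_s):
    """Availability = observed uptime / total observed time, as a %."""
    return round(100.0 * uptime_s / total_s, 2)


def mtbf(total_uptime_s, failure_count):
    """Mean Time Between Failures: working time per failure."""
    return total_uptime_s / failure_count


def mttr(total_repair_s, failure_count):
    """Mean Time To Repair: average downtime per failure."""
    return total_repair_s / failure_count


# One day of observation: 2 failures, 30 minutes of downtime in total.
day = 24 * 3600          # 86400 seconds observed
downtime = 30 * 60       # 1800 seconds down
avail = availability_pct(day - downtime, day)  # 97.92 (%)
service_mtbf = mtbf(day - downtime, 2)         # 42300.0 seconds up per failure
service_mttr = mttr(downtime, 2)               # 900.0 seconds to repair
```

Aggregating the same formulas over hourly, daily, and monthly windows yields the numbers the APIs return and Grafana plots.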

Technology and Tools:

•   Different AWS features like DynamoDB, Lambda, CloudWatch, S3 buckets.

•   Python Scripting.

•   Operating System: Linux


•  Provided user-friendly APIs to the client for registering a service and calculating the availability percentage on an hourly, daily, and monthly basis.

•  Provided a very rich set of graphing options for metrics using Grafana.

•  High accuracy in the availability calculation of service health.

•  Database backup options for up to one month.

•  Highly secured APIs with AWS IAM authentication.

•  MTTR and MTBF features to check service health frequently.
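The MTTR and MTBF figures mentioned above follow the standard reliability definitions: mean downtime per failure, and mean operating time between failures. A minimal sketch (function names are illustrative):

```python
def mttr(repair_durations_hours):
    """Mean Time To Repair: average downtime per failure."""
    if not repair_durations_hours:
        raise ValueError("no failures recorded")
    return sum(repair_durations_hours) / len(repair_durations_hours)

def mtbf(total_uptime_hours, failure_count):
    """Mean Time Between Failures: average operating time between
    inherent failures of the service during normal operation."""
    if failure_count == 0:
        raise ValueError("no failures recorded")
    return total_uptime_hours / failure_count
```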

Bug Fixing for 9100 Tellabs Device of a Leading US Based Telecommunications Company



As network traffic increases day by day, the following problems occur in the ASN: network congestion, inability to register with the network, inability to manage traffic through the base station, inability to log in to the device, DHCP failing to assign IPs to subscribers, and so on.

Our Solution:

• Using the R6 simulator, we reproduce the same scenario to confirm the issue; once the issue is confirmed, we immediately start debugging the code using GDB.

• Initially, our goal is to find the root cause of the issue; in all cases there are two possibilities:

i) a logical issue in the ASN source code;

ii) an ASN configuration issue.

• Once we have a clear picture of the root cause, we start fixing the issue accordingly.

• We enhanced the R6 simulator from single network entry to multiple network entries.

• We fixed DHCP pool load balancing and ensured that a renewing subscriber gets its old IP if the lease time has not expired and the service type is the same.
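The DHCP renewal rule described above (a renewing subscriber keeps its old IP when the lease has not expired and the service type is unchanged) can be sketched as follows; the lease store, allocation scheme, and lease time are simplified assumptions, not the ASN's actual C implementation:

```python
import time

# Hypothetical lease record: subscriber -> (ip, lease_expiry_epoch, service_type)
leases = {}

def renew(subscriber_id, service_type, pool, now=None):
    """Return the subscriber's old IP if its lease is still valid and the
    service type matches; otherwise allocate a fresh IP from the pool."""
    now = time.time() if now is None else now
    rec = leases.get(subscriber_id)
    if rec:
        ip, expiry, old_service = rec
        if now < expiry and old_service == service_type:
            return ip  # lease not over, same service type: keep the old IP
        pool.append(ip)  # reclaim the stale lease
    ip = pool.pop(0)  # simplistic allocation; real DHCP balances across pools
    leases[subscriber_id] = (ip, now + 3600, service_type)
    return ip
```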

How DT delivered business value:

•  Identified defects as early as possible and deployed the code to the ASN.

•  Fixed issues consistently, making the ASN device more stable.

•  Effectively improved ASN performance.

•  Identified 60 percent of real-time defects and fixed them before they affected the client's execution.

•  Optimized the code and performed frequent memory clearance for the ASN.

Technology and Tools:

Scripting languages: Python, and Shell.

Programming language: C.

Operating System: Linux.

Tools: R6-simulator.

End-to-End Test Automation for a Cloud Computing Application of a Leading US Based Telecommunications Company



As new feature development and enhancement activity on the application grew, the client wanted hundreds of test cases covered as part of regression testing to make sure the new implementations had not affected existing features. They then realized that manually verifying end-to-end features on a daily basis was impractical, and the client could not provide the feature deliverables to their end users. This curtailed the client's ability to meet their expected delivery of verified features as the new implementations started affecting the existing ones.

Our QA consultants undertook an assessment initiative to identify the current automation challenges and design the optimal solution for the client’s needs.

Our Solution:

At the beginning of the engagement, we were required to address these issues:

• Overcoming the high execution cycle time, which was causing timeout issues and execution failures.

• Fixing the inaccurate result reporting in the existing scripts, which had led to low confidence in the automation.

• Covering API testing as well as several different databases, which posed additional challenges to achieving efficient automation coverage.

To overcome the above challenges, we designed and implemented a feasible test strategy:

• Implemented a 'TestNG' framework-based test automation approach using Selenium, combined with scripting languages such as Java, Python, and Perl.

• Introduced cleaning up the test environment/application, installing a fresh environment with all the latest feature commits, and executing the test cases through Jenkins by creating multiple jobs.

•  Performed risk-based test case analysis to optimize the number of test cases executed.

•  Implemented multilayer expert test reports with server log attachments and issue root causes.

•  Built email summaries and web dashboards to visualize test suite executions graphically, increasing confidence in the automated scripts and easing troubleshooting.
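The risk-based test case analysis in the strategy above can be sketched as a simple scoring pass over the regression suite; the field names and the likelihood-times-impact score are illustrative assumptions, not the project's actual model:

```python
def select_by_risk(test_cases, budget):
    """Rank test cases by a risk score (failure likelihood x impact)
    and keep only the top `budget` cases for the regression run."""
    ranked = sorted(
        test_cases,
        key=lambda tc: tc["failure_likelihood"] * tc["impact"],
        reverse=True,
    )
    return [tc["name"] for tc in ranked[:budget]]
```

Trimming the suite this way is one lever for the execution-time reductions described below.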

How DT delivered business value:

•  Identified defects early by implementing automation early in the cycle.

•   Automation became highly scalable and easy to maintain, and considerably reduced execution time and cost.

•  Identified 80 percent more defects before go-live thanks to the early automation strategy.

•  Reduced test cycle time by 50 percent through code optimization and frequent memory clearance in the automation scripts.

Technology and Tools:

Scripting languages: Java, JS, Python, Perl, Ruby, C#, Shell.

Tools: Selenium Webdriver, Appium, Jenkins, CircleCI, Ansible Playbook.

Achieving Security and Localization



The client needed a local search engine website for companies and classifieds. This search engine should provide the product details of the companies and classifieds registered there.

Challenging Situation:

As the project scaled up, it became increasingly difficult not only to gather all the needed information but also to execute quickly. We knew the project needed a better framework to make it secure enough for people to genuinely trust, and it needed localization for every international user. In addition, the team was not well versed in best practices and did not have the skills either to kick the project off or to bring it to the next phase.

Our Solution:

We hosted a planning session and decided to kick off the project with the open-source PHP framework Laravel. Learning a new framework is daunting, but it is also exciting. To make the website localizable, we decided to develop the project with keywords in place of every piece of text.

We created a file with:

• keywords of the text

• the language required

We linked the file in the environment section to fetch the words related to the keyword being called. By changing the language alone, we obtained the localized website. So instead of creating the project separately in different languages, we created one project, used keywords for text, and fetched the translated text as required.
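The keyword-based localization idea is language-agnostic; the project used Laravel's lang files, but the lookup can be sketched in Python (the keywords and translations below are made up for illustration):

```python
# Hypothetical keyword files: one dict per language, keyed by text keyword.
TRANSLATIONS = {
    "en": {"welcome": "Welcome", "search": "Search companies"},
    "de": {"welcome": "Willkommen", "search": "Firmen suchen"},
}

def trans(keyword, lang="en"):
    """Fetch the text for a keyword in the requested language,
    falling back to English when a translation is missing."""
    return TRANSLATIONS.get(lang, {}).get(keyword) or TRANSLATIONS["en"][keyword]
```

Switching the active language then localizes every page without duplicating the project per language.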

Technology Used:

Open-source PHP framework Laravel, MySQL


The decision to use this framework took the pain out of development by easing common tasks used in the project, such as authentication, routing, sessions, and caching. Localization helped the project go international with less effort.

Infrastructure Application Setup Through DevOps and AWS



The Client is a New York City-based technology company that provides a SaaS learning platform to improve and advance employee and organizational performance. The company is the leader in Microlearning, a modern, effective approach to workplace learning that offers employees single-concept, mixed-media lessons within the flow of their work.

Challenging Situation:

The customer reported that one of their large applications was deployed on Oracle Virtual Machine (OVM) Windows and Linux machines, and that those instances were very difficult to manage: if one of the machines crashed, they had to rebuild the server from scratch, which took a lot of effort just to understand how to build the complete environment. Restarting the OVM instances also required a lot of downtime, while the customer wanted minimal application downtime during such restarts. They asked how to expose their applications on cloud infrastructure in an automated fashion and make the applications stable.

Our Solution:

Our Cloud Application Support/DevOps team proposed, implemented, and deployed their applications onto the Amazon Web Services infrastructure, using a three-tier architecture and a configuration management tool (Chef) to automate application software installation on the AWS instances. We also containerized the applications as microservices (Docker) to stabilize them. On the deployment side, we implemented continuous integration and continuous deployment for the application releases.

At the AWS infrastructure level, we created the instances/servers through an AWS Auto Scaling group. When an instance is in the Auto Scaling group, the instance/server count increases automatically whenever the application load is high and is reduced automatically once the load drops, so no manual effort is required to scale the server count, giving the applications high scalability. We also added the instances hosting the customer's application to an Elastic Load Balancer (ELB) to distribute the application load across multiple instances/servers.

We also used an Infrastructure as Code (IaC) tool, Terraform, to build the Amazon Web Services infrastructure from Terraform code.

Technology Used:

Amazon Web Services (EC2, ASG, ELB, Route53, S3, VPC, IAM), DevOps (Chef, Docker, Git, Jenkins/CircleCI, Marathon/Mesos, Datadog)


After the customer's applications moved to the cloud infrastructure, they were very happy: the applications are very stable with minimal downtime, infrastructure deployments happen automatically through the DevOps tools whenever code is committed to the repository, and very little manual effort is required for each deployment.

Charlotte Russe (CR) Data Migration & Reporting


Charlotte Russe is a women's clothing retail company that engaged Data Template Info Tech to upgrade and move their existing data warehouse and reporting systems (such as Oracle and SAP HANA) to cloud systems (Azure Data Warehouse, HDInsight Blob storage, Power BI, and SQL Server Reporting Services). The new system should overcome the problems of the existing system listed below.

Our Solution:

Data Migration

Designed an ETL process for the data migration, loading data from different source systems such as Oracle, SQL Server, text and Excel files, and XML into Azure Data Warehouse. Each step in the ETL process is grouped (source data population, dimension table population, and fact table population) to avoid duplicating intermediate tables and hitting the source systems multiple times.
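The grouping matters because fact rows reference dimension keys, so dimensions must load before facts. A minimal sketch of the three grouped phases (the callables are placeholders standing in for the actual ETL steps):

```python
def run_etl(extract_sources, load_dimensions, load_facts):
    """Run the migration in the three grouped phases described above:
    1. populate staging from each source exactly once,
    2. populate dimension tables,
    3. populate fact tables (which look up dimension keys).
    Grouping avoids re-reading sources and duplicating staging tables."""
    staging = {}
    for name, extract in extract_sources.items():
        staging[name] = extract()          # hit each source system once
    dims = load_dimensions(staging)        # dimensions before facts
    facts = load_facts(staging, dims)      # facts reference dimension keys
    return dims, facts
```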


Designed the reports in the new system using Microsoft reporting technologies such as Power BI and SSRS, with updated report functionality and layout, since some additional features are available in the new environment; SAP BO features that were not supported were redesigned with alternate features. The technology was chosen based on user expectations for the report layout: dashboards with graphical and table data were built in Power BI, while tabular and matrix reports were mainly designed in SSRS.


Data Migration

•   Multiple existing SAP BO reports needed to be combined into a single Power BI/SSRS report, depending on the report data.

•   The two data warehouse systems had different table schema definitions and data; we had to analyze and merge the existing systems effectively into a single data warehouse system.

•   Duplicate tables were maintained in both warehouses; we needed to identify them and design a single table based on usage and reporting needs.

•   No requirement documents were shared by the team; instead we had to analyze the existing SAP BODS jobs to understand the migration logic. Too many intermediate stage tables were in use, and these should be avoided where possible in the proposed system for an efficient and better design.

•   The new system's performance should be faster than the existing system's.


•   Multiple existing SAP BO reports needed to be combined into a single Power BI/SSRS report, depending on the report data.

•   There were new, complex report requirements that CR business users had previously handled manually with tech-team support; these needed to be automated and designed into the new system.

•   Per the requirement, we had to follow the SAP BO data model to derive the report logic. We noticed that many measures were created with the same logic in different models; this had to be resolved in the new system by creating a global model for reporting.

•   Reports from the new system should perform better than those from the existing system.

•   Each report requirement involved complex business logic.

•   There was a requirement to restrict report display based on the logged-in user. This was a new requirement; earlier it had been handled manually.

•   Most reports were multiple-level, drill-down reports.

Technology and Tools:

•   Oracle, SQL, SQL Azure DW, HDInsight Blob Storage; files: XML, XLS, XLSX, CSV, TXT

•   Deployment Service: Power BI Service

•   Development Tools: SSDT, Power BI Desktop


The new data warehouse system was designed and delivered to the client within a low budget and the defined timeline. All the issues the client faced were resolved, and the client can shut down their existing data warehouses (Oracle and SAP HANA). Some of the features of the new system are below:

•   Tables are well organized for easy access.

•   Data accuracy is good compared to the old systems.

•   The daily data refresh is more effective than in the old system.

•   Duplicate tables were removed.

•   All the designed reports are organized into different report folders/workspaces for easy access and delivered to the client.

mPower Heart e-Suit for Patient–Centered Chronic Diseases Care through Clinical Decision Support, Big-data Analytics and Machine Learning



Chronic diseases such as hypertension, diabetes, dyslipidemia, and chronic obstructive pulmonary disease/asthma are major challenges to health systems worldwide. Despite recognition of the benefits associated with early detection and treatment of chronic diseases and their risk factors, control rates remain suboptimal. Personalized medicine, the treatment process that is tailored to the individual needs of each patient, has recently been gaining attention for its prospects in the development of effective chronic disease treatment regimens.

We have developed a software application, the mPower Heart e-Suit, to facilitate evidence-based, personalized medicine practice by doctors. It is designed to help doctors compute a clinical management plan in accordance with the latest clinical management guidelines/protocols. In addition, the mPower Heart e-Suit helps in standardizing care and setting benchmarks across health facilities.

The e-Suit has the following features:

• Computing clinical risk scores;
• Electronic storage of patient data for life-long follow-up;
• Clinical Decision Support – generating a personalized management plan for each patient by computing complex clinical management algorithms to suggest optimal drugs and their dosage and to warn of contra-indications, etc. The Clinical Decision Support feature makes the e-Suit unique; it is missing from the electronic health record (EHR) systems available in the market.
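As an illustration only (this is not the published mPower Heart protocol), a clinical decision support rule of this kind reduces to guideline thresholds plus a suggested next step; the cut-offs below follow common hypertension-guideline values:

```python
def bp_management_hint(systolic, diastolic, on_treatment):
    """Illustrative decision-support rule, NOT the published mPower Heart
    algorithm: classify a blood-pressure reading (mmHg) and suggest a next
    step, using common hypertension-guideline cut-offs."""
    if systolic >= 180 or diastolic >= 110:
        return "urgent referral"
    if systolic >= 140 or diastolic >= 90:
        return "titrate medication" if on_treatment else "start medication"
    return "continue current plan"
```

The real e-Suit chains many such rules, covering drug choice, dosage, and contra-indication warnings.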

The scientific development of the mPower Heart e-Suit has been published in the Journal of the American Heart Association. Currently two state governments – Tripura and Mizoram – have adopted the mPower Heart e-Suit, covering 56 government primary care facilities. The system is also helping in healthcare planning by assessing the cost of chronic disease programs, workforce management, logistics planning, etc.

The mPower Heart e-Suit has won several accolades. It was presented at the Royal Academy of Engineering event held at the Oxford Union, UK during 18-20 Sep 2017. Furthermore, the World Health Organization has selected us to develop an app for promoting the use of the WHO PEN Package in the clinical management of non-communicable diseases using the mPower Heart e-Suit platform. The new variant, the mPEN App, is currently being piloted in the Maldives and is likely to be promoted by the WHO in the South-East Asian countries.

mPower Heart Project:

Results from Himachal Pradesh

Impact on Process of Care

•   Access to evidence-based guidelines

•   Task shifting

•   Less paperwork

•   Self-confidence & knowledge

•   Access to patient records

•   Standardized care

•   Better Follow-ups

•   Lifestyle advice

•   Assurance of providing best health care at Primary Health Care facilities

•   Real-time, Reliable, Accurate, and Secure data

mPower Heart Project: Tripura

•  Funded by the NHM and State-wide implementation

•   40 hospitals: 2 PHCs, 19 CHCs, 12 sub-district hospitals, 6 district hospitals & 1 State Hospital

•  Technical Coordination Cum Support Team:

•  State Health Officials and Project team
•  Software updates
•  Operational updates

•   Dedicated website for project:

•  Server Access for state government

•  Patients enrolled/benefited: ~70,000

mPower Heart Project: Mizoram

•  Adopted for state-wide implementation

•   16 hospitals: 5 CHCs, 3 sub-district hospitals & 8 district hospitals

•  Server Access for state government

•  Patients enrolled/benefited: ~8000

Recognition and Awareness

•  ICMR selected mPower Heart System as a showpiece for the ‘Exhibition on Innovations in Medical Science & Biotechnology’ at the Rashtrapati Bhavan in 2015

•   “mPower Heart Model for NCD” selected for oral presentation at Fourth National Summit on Good, Replicable Practices & Innovations in Public Health Care Systems in India, Indore, Madhya Pradesh 6th July to 8th July 2017

•  Presented at the ‘Frontiers of Engineering for Development’, organized by the Royal Academy of Engineering, at the Oxford Union during 18-20 Sept 2017


•  Clinical Decision Support Software for management of NCDs as per WHO PEN Package

•   Powered by mPower Heart e-Suit

•  Being piloted in the Maldives

•  With support from WHO-SEARO

Adopted by Government of India

•  Ayushman Bharat Program (National Health Protection Program) – Largest ever Health Insurance Program covering 500 million population

•  Dell-EMC developing a Software Application for the Program with funding from TATA Trust

•  The Government of India is integrating mPower Heart e-Suit with the Software Application as the clinical decision support module

Courtesy: Public Health Foundation of India & Centre For Chronic Disease Control