Legacy Web Application Re-engineering Using NodeJS for a Large Credit Bureau Agency

Hemesh Thakkar

One of the largest credit bureau agencies in the US, generating close to a million confidential credit summary reports every day, approached us with issues in product performance, code maintainability, design, and user experience, all rooted in an outdated legacy framework whose coding complexities had turned the product into a maintenance nightmare.

As a custodian of critical customer data, the financial services company had to adhere to stringent security measures that limited its scope for introducing modern technologies and tools into its web application - a perfect situation for AccionLabs to step in and determine the best fit for the company's product re-engineering efforts.

All our efforts - in-depth discovery sessions, proofs of concept, and solution re-engineering - were carried out with the existing legacy framework in mind. This case study shows how Node.js, with its proven flexibility, can be used innovatively as an API gateway.

Generating more than a million credit reports every day

The company works with many partners and associates that provide customer data points used to create customized credit summaries and credit scores for more than a million customers every day. It provides credit reports, flood zone reports, and mortgage reports to its customers, and also aggregates reports from multiple vendors. Developed by the company more than a decade ago, the B2B web application has credit reporting services as its core business model and the credit reporting platform as its main product.

The unique feature of the application is that it is easily customizable based on requests from partners and customers, which led to numerous change requests being accommodated over the years. As a custodian of sensitive financial data on individuals, the company regarded data security as its paramount concern: the reports contain credit card details, Social Security numbers, and transaction details that could not be compromised at any cost. This was another reason it had stayed away from any re-engineering effort.

Monolithic tightly coupled architecture

Their technology had a monolithic architecture: the data and presentation layers were tightly bundled into a single application built on Java servlets, a dated technology that made it difficult for the company to find engineers who could work on and improve the existing framework. In addition, the features and customizations delivered for each and every client over more than a decade had made the codebase extremely complex. The application also had to support old browsers: the company had no visibility into the browser types and versions its partners used, and many partners deliberately ran older browsers, locked down by network administrators for security reasons. Hence, legacy browsers such as IE8 had to be supported as well.

Another major challenge was that the application fell short of current security standards and lacked defenses against modern-day attacks, so we had to help the client move to a newer technology base that supported modern defense mechanisms. A further issue was the project execution methodology: they followed the waterfall model of software development rather than the agile methodology we use in our projects (we prefer it for its action-oriented nature). AccionLabs therefore had to completely overhaul the way they worked on software projects.

The absence of any documentation for the web application made things trickier. Only a few business analysts knew the product well, and the information from that source was limited; ultimately, we had to go back to the code to figure out how the platform behaved.

Microservices in action

Once we stepped in, we introduced a REST-based service-oriented architecture. The existing framework was broken down into two layers - a business layer and a presentation layer - as separate entities instead of a single bundled application. The business layer exposes REST APIs that return JSON, while the presentation layer deals only with rendering that data as HTML. On top of this, we introduced Node.js as an API gateway and server-side rendering engine - a novel use of Node.js. It works like this: the front end calls the API gateway, which aggregates the JSON responses from granular backend REST services; the gateway also detects the browser's capabilities and responds with either JSON or server-side rendered HTML accordingly.
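A minimal sketch of this gateway pattern, assuming an Express-style Node.js setup. The backend names and the legacy-browser cutoff here are illustrative, not the client's actual services:

```javascript
// Detect legacy browsers (IE6-IE9) from the User-Agent header.
function isLegacyBrowser(userAgent) {
  return /MSIE [6-9]\./.test(userAgent || '');
}

// Fan one front-end request out to several granular backend REST calls
// and merge the JSON responses into a single object.
async function aggregate(backends) {
  const names = Object.keys(backends);
  const results = await Promise.all(names.map((name) => backends[name]()));
  return names.reduce((merged, name, i) => {
    merged[name] = results[i];
    return merged;
  }, {});
}

// Gateway handler: server-side rendered HTML for legacy browsers,
// raw JSON for everything else.
async function handlePage(userAgent, backends, renderHtml) {
  const data = await aggregate(backends);
  return isLegacyBrowser(userAgent)
    ? { contentType: 'text/html', body: renderHtml(data) }
    : { contentType: 'application/json', body: JSON.stringify(data) };
}
```

In the real system each backend entry would be an HTTP call to a Spring MVC REST endpoint; here they are plain async functions so the aggregation flow is easy to follow.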


We enhanced the user experience with jQuery and jQuery UI, along with other plugins; we could not use modern frameworks because of the browser-compatibility requirements. We also hardened security using modern frameworks, which helped us plug the risk of major attacks such as cross-site scripting (XSS), SQL injection, CSRF, and session hijacking, among others. The application thus became much more secure than the older version.
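The case study does not name the specific security frameworks used, so the following is only a hedged sketch of two typical mitigations from that era: defensive HTTP response headers and HTML-escaping of untrusted values (the core XSS defense, which template engines such as Handlebars apply by default). The function names are illustrative:

```javascript
// Attach defensive headers to a response; setHeader is assumed to be a
// res.setHeader-style function supplied by the web framework.
function applySecurityHeaders(setHeader) {
  // Mitigate clickjacking by disallowing framing from other origins.
  setHeader('X-Frame-Options', 'SAMEORIGIN');
  // Enable the reflected-XSS filter in older browsers (e.g. IE8+).
  setHeader('X-XSS-Protection', '1; mode=block');
  // Stop browsers from MIME-sniffing responses into executable types.
  setHeader('X-Content-Type-Options', 'nosniff');
}

// Escape untrusted values before they are interpolated into HTML.
function escapeHtml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}
```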

Modern technology stack

On the UI side of the tech stack, we used HTML5, CSS3, and Handlebars.js as the template rendering engine; the JavaScript libraries were jQuery and jQuery UI.

On the business side, we used Java, the Spring MVC framework, and Spring Security; on a separate server we ran Node.js with Express.js as its REST API framework, along with the associated process managers and build tools. The client's database was left unchanged apart from a few added fields. Their infrastructure remained on-premise, as data security has always been a high priority for their business - one of the key reasons they had been avoiding a move to the cloud. On the DevOps side, we used Maven as the build tool, SVN for version management, Jasmine and JUnit for unit testing, Jenkins and Rundeck for the CI/CD pipeline, an F5 load balancer, and Splunk for log management. For project and defect management, we used JIRA.


Using a Node.js server as an API gateway

Node.js is efficient at managing services that require high throughput and bandwidth, so it can be used for core, mission-critical backend services. Whenever the front end makes a request, the Node.js server makes the corresponding requests to the application server, aggregating APIs in the process: if the UI needs a page loaded, it calls a single REST endpoint on Node.js, and Node.js in turn calls multiple granular backend service APIs. It thus acts as an API gateway. At the same time, Node.js is cognizant of the browser features being supported: it determines the browser's capabilities - simply put, whether the browser can handle modern frameworks or not - and if IE8 or IE9 is in use, it switches to server-side rendering. For this, we created UI templates in Handlebars; rendering combines these Handlebars templates with JSON data to produce HTML output.
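The server-side rendering step above boils down to: template plus JSON data yields HTML. To keep this sketch dependency-free, a minimal `{{field}}` substitution stands in for the real Handlebars compiler (which in production would be `Handlebars.compile(template)(data)`); the template and data are illustrative, not the client's actual report layout:

```javascript
// Minimal stand-in for a Handlebars-style render: replace each {{field}}
// placeholder with the matching value from the JSON data object.
function renderTemplate(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    data[key] !== undefined ? String(data[key]) : ''
  );
}

// Illustrative example: a credit-summary fragment rendered server-side.
const template = '<h1>Credit Summary</h1><p>Score: {{score}}</p>';
const html = renderTemplate(template, { score: 712 });
```

Because this runs on the Node.js gateway, even an IE8 client receives plain, ready-made HTML with no client-side framework required.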

Keen on modernizing your web applications? Get in touch with our experts, and we can get you running on microservices with our Node.js and REST API frameworks quickly.

