The Collection Team at Newport relied primarily on their EMR to complete their daily tasks. Since the EMR could not meet every business need, our solution would fill the gaps and act as an accelerator.
When working with third-party EMRs, our first challenge is accessing data and determining how it impacts our custom solution.
Is there an API we can work with? Do we have direct access to the database? Can the EMR handle a measurable increase in volume? We needed to answer these questions and determine a low-maintenance solution. Here’s how we approached it.
We needed to own the data by establishing and maintaining a database separate from the EMR where we controlled the data model. Having a data model tailored to our specific needs allowed us to focus on which pieces of information are necessary for presentation and reporting.
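To make that concrete, a record in our own database might look like the sketch below. The field names are hypothetical, illustrating a model trimmed to what presentation and reporting actually need rather than mirroring the EMR's full schema.

```typescript
// Hypothetical shape of a claim record in our own database, reduced to
// the fields presentation and reporting need. Field names are assumptions.
interface Claim {
  claimId: string;      // identifier carried over from the EMR
  patientName: string;
  balance: number;      // outstanding balance
  serviceDate: string;  // ISO 8601 date
  assignedTo?: string;  // collector currently working the claim
  lastSyncedAt: string; // when the nightly ETL last refreshed this row
}

const example: Claim = {
  claimId: "CLM-1001",
  patientName: "Jane Doe",
  balance: 245.5,
  serviceDate: "2021-03-14",
  lastSyncedAt: new Date().toISOString(),
};
```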
When owning data from a third-party system, we also have to account for updates made within that system. Additionally, we need to be strategic around how frequently we update data and what triggers those updates.
With that in mind, we formulated a process leveraging Azure Functions and Azure Data Factories to facilitate our data needs.
We started by setting up an Azure Function on a Timer Trigger to make nightly requests to the EMR. The EMR would respond with a JSON payload containing records updated within the past 24 hours.
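The core of that nightly request is computing the "updated in the past 24 hours" window. A minimal sketch, assuming the EMR accepts an ISO 8601 from/to range (the parameter names are illustrative):

```typescript
// Sketch of how the timer-triggered function might build the query window
// for "records updated in the past 24 hours". The from/to parameter names
// are assumptions; the real EMR API may differ.
function updatedSinceWindow(now: Date): { from: string; to: string } {
  const from = new Date(now.getTime() - 24 * 60 * 60 * 1000);
  return { from: from.toISOString(), to: now.toISOString() };
}

const win = updatedSinceWindow(new Date("2021-06-02T02:00:00Z"));
// win.from → "2021-06-01T02:00:00.000Z"
```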
Within the Azure Function, we take that JSON payload and create one CSV file per logical entity before dropping the files into an Azure Storage container.
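The payload-to-CSV step can be sketched as follows. Entity and field names are illustrative; the real payload would carry whatever entities the EMR exposes.

```typescript
// Minimal sketch of splitting a JSON payload into one CSV string per
// logical entity. Entity and field names here are illustrative only.
type Payload = Record<string, Array<Record<string, string | number>>>;

function toCsvFiles(payload: Payload): Map<string, string> {
  const files = new Map<string, string>();
  for (const [entity, rows] of Object.entries(payload)) {
    if (rows.length === 0) continue;
    const headers = Object.keys(rows[0]);
    const lines = [
      headers.join(","),
      ...rows.map((r) => headers.map((h) => String(r[h] ?? "")).join(",")),
    ];
    files.set(`${entity}.csv`, lines.join("\n"));
  }
  return files;
}

const files = toCsvFiles({
  claims: [{ claimId: "CLM-1001", balance: 245.5 }],
  payers: [{ payerId: "P-01", name: "Acme Health" }],
});
// files.get("claims.csv") → "claimId,balance\nCLM-1001,245.5"
```

Each entry in the resulting map would then be written as a blob, which is what fires the downstream pipeline.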
At this point, a Storage Event Trigger defined in the Azure Data Factory fires to begin processing the CSV files, transforming records before saving them to our Azure SQL Server data warehouse.
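A Storage Event Trigger of this kind is defined in the Data Factory as JSON. The fragment below is a hedged illustration of the shape such a trigger takes; the container path, scope, and pipeline name are placeholders, not the project's actual values.

```json
{
  "name": "TriggerOnNewCsv",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/emr-extracts/blobs/",
      "blobPathEndsWith": ".csv",
      "ignoreEmptyBlobs": true,
      "events": ["Microsoft.Storage.BlobCreated"]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "TransformEmrExtracts",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```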
When using Azure services, we need to secure their configuration information; this is where Azure Key Vault comes into play. Azure Key Vault safely stores configuration information that services within the solutions architecture can access.
Lastly, our solution incorporates Azure Monitor, which helps the business and development teams see performance metrics. We provide these metrics to each team through Azure Monitor custom dashboards. These dashboards keep both teams proactive throughout their day.
Having designed the data management process, we could work towards providing the Collection Team with a companion web application to aid in their day-to-day processing. The new application would provide several new features.
– Forecasts of claim balances, expected revenue, and the total number of claims within a period.
– A work queue for processing claims within a period.
– Actions users can perform on a claim to prioritize when it gets worked.
– Rules for assigning claims to users.
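The forecasting feature above boils down to rolling up the claims that fall in a reporting period. A minimal sketch, assuming each claim carries an expected collection rate (that field is an assumption for illustration):

```typescript
// Rough sketch of the period forecast: total balance, expected revenue,
// and claim count for the claims in a period. The collectionRate field
// (expected fraction of the balance recovered) is an assumption.
interface ForecastClaim {
  balance: number;
  collectionRate: number;
}

function forecast(claims: ForecastClaim[]) {
  const totalBalance = claims.reduce((sum, c) => sum + c.balance, 0);
  const expectedRevenue = claims.reduce(
    (sum, c) => sum + c.balance * c.collectionRate,
    0
  );
  return { totalBalance, expectedRevenue, claimCount: claims.length };
}

const f = forecast([
  { balance: 100, collectionRate: 0.8 },
  { balance: 300, collectionRate: 0.5 },
]);
// f → { totalBalance: 400, expectedRevenue: 230, claimCount: 2 }
```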
Here’s our approach to providing this functionality.
Our setup starts with two Azure App Services: one hosts our Angular application and the other our .NET Core API. This is where Collector's Workbench is available to the Collection Team and where we use their Azure Active Directory to authenticate them.
To help make assignment and re-assignment processing easier, we used an Azure Function. This allows us to fire off an asynchronous process when a user in the system changes assignment rules or when there’s an update from the ETL process.
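The heart of that asynchronous process is matching claims against the current rule set. The sketch below assumes rules match on payer and a minimum balance and that the first matching rule wins; the real criteria are richer, and these names are illustrative.

```typescript
// Sketch of the re-assignment step: map each claim to the first matching
// rule's user. Rule shape and matching criteria are assumptions.
interface AssignableClaim {
  claimId: string;
  payerId: string;
  balance: number;
}

interface AssignmentRule {
  assignTo: string;
  payerId?: string;    // match only this payer, if set
  minBalance?: number; // match only balances at or above this, if set
}

function assignClaims(
  claims: AssignableClaim[],
  rules: AssignmentRule[]
): Map<string, string> {
  const assignments = new Map<string, string>();
  for (const claim of claims) {
    const rule = rules.find(
      (r) =>
        (r.payerId === undefined || r.payerId === claim.payerId) &&
        (r.minBalance === undefined || claim.balance >= r.minBalance)
    );
    if (rule) assignments.set(claim.claimId, rule.assignTo);
  }
  return assignments;
}

const result = assignClaims(
  [
    { claimId: "CLM-1", payerId: "P-01", balance: 500 },
    { claimId: "CLM-2", payerId: "P-02", balance: 50 },
  ],
  [
    { assignTo: "alice", payerId: "P-01" },
    { assignTo: "bob", minBalance: 0 }, // catch-all rule
  ]
);
// CLM-1 → alice (payer-specific rule), CLM-2 → bob (catch-all)
```

Running this off a queue-style trigger keeps rule changes and nightly ETL updates from blocking the web application.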
Just like the ETL process, our services request their configuration information from the Azure Key Vault and provide data that feeds into Azure Monitor to power the team’s custom dashboards.
As mentioned before, one of the goals was to create a low-maintenance solution. To that end, we used Azure Bicep to help define and manage the resources that need to be created in each of our environments. This means that as our architecture changes, deploying those changes across environments is a simple process with minimal manual effort.
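A Bicep template for this kind of setup declares each resource once and parameterizes what varies per environment. The fragment below is a hedged illustration; the resource names, SKUs, and the `environment` parameter are placeholders, not the project's actual values.

```bicep
// Illustrative Bicep fragment: names and SKUs are placeholders.
param location string = resourceGroup().location
param environment string

resource storage 'Microsoft.Storage/storageAccounts@2022-09-01' = {
  name: 'cwstorage${environment}'
  location: location
  sku: { name: 'Standard_LRS' }
  kind: 'StorageV2'
}

resource plan 'Microsoft.Web/serverfarms@2022-03-01' = {
  name: 'cw-plan-${environment}'
  location: location
  sku: { name: 'S1' }
}
```

Deploying the same template with a different `environment` value is what makes promoting architecture changes across environments largely hands-off.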