Table of Contents
- Product
  - Customer Education
  - Customer Success
  - Marketing
  - Professional Services
  - Data Science
- Finance
  - Accounting
  - Credit Risk
  - Revenue Operations
  - Sales Operations
- Healthcare
- Insurance
Product Analytics - Customer Education
Project | 01
Time-to-Value (TTV) First Study/Test Launched
The Customer Education team's objective was to prove to senior leadership that customers were enrolling in and completing UserTesting (UT) University courses prior to using our platform. The problem was that they didn't know how to analyze their data, and none of it was stored in our data warehouse because the data lived with a third-party vendor.
As their Product Owner, I addressed these business problems by providing deep-dive analyses and methodologies built on well-defined product metrics, discussing each metric's success, and formulating strategies to improve the customer experience for the Customer Education team.
1. Designed their product analytics roadmap and vision.
2. Followed the SDLC to build iterations of product requirement documents (PRDs) for measuring the ever-evolving product metrics' success.
a) Developed time-to-value (TTV) product metrics and evaluated each metric's success (i.e., from the subscription start date to when the customer launched a first test/study)
b) Proposed iterations of improved TTV product metrics to ask better questions
3. Built data pipelines via Fivetran and dbt, including refactoring code in the data warehouse (Snowflake), and continued to bring in more data and create more metrics for version upgrades using this framework.
4. Built versions (v2, v3, v3.1, v4) of customer metrics for three Tableau dashboards (First Study/Test Launched, First Results Consumed, and First Observations Created).
5. Presented solutions based on the analyses, with recommendations and strategies for each version.
6. For each upgrade, delivered enhanced product metrics (addressing better, more in-depth questions as data continued to flow in and improve), dashboard designs, storytelling, and reporting to Customer Education's senior leadership and other C-suite executives.
Throughout, I kept asking better, more in-depth questions and anticipated related business questions about UT University by leveraging well-defined product metric measurements.
Code: here
Tools: SQL, Tableau, Snowflake, dbt, GitHub
Program Management: Jira, Confluence
Product Requirements Document (PRD): Customer Metrics
Project | 02
Skilljar and Docebo QAU/MAU
After the success of customer metrics v2 for Customer Education, the team wanted to examine UT University courses' monthly and quarterly active user (MAU/QAU) trends, similar to subscription users at Netflix, as part of customer metrics v3. However, the university platform was changing from Skilljar to Docebo. The objective was to combine the two systems, Skilljar and Docebo, to look at MAU/QAU and see growth across both.
I then had to identify the two data sources, build ELT data pipelines to merge the two systems into one, and produce usage-rate insights. This dashboard complements First Study Launched, First Results Consumed, and First Observations Created: by checking whether university enrollments increased over time, we could explain spikes in first studies/tests launched, results consumed, and observations created.
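The combine-then-count step can be sketched as follows, assuming (hypothetically) that user email is the key shared across both systems, since Skilljar and Docebo assign their own internal user IDs:

```python
def monthly_active_users(skilljar_events, docebo_events):
    """
    Combine activity events from both LMS systems and count distinct active
    users per month. Each event is (user_email, "YYYY-MM"); email is used as
    the join key because internal user IDs differ between the two systems
    (an assumption for this sketch).
    """
    mau: dict[str, set[str]] = {}
    for email, month in list(skilljar_events) + list(docebo_events):
        mau.setdefault(month, set()).add(email.lower())
    return {month: len(users) for month, users in sorted(mau.items())}
```

QAU follows the same pattern with events bucketed by quarter instead of month; in production this lived in dbt models rather than application code.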
Tools: SQL, Tableau, Snowflake, dbt, GitHub
Program Management: Jira, Confluence
Product Analytics - Customer Success / Marketing
Project | 03
Salesforce Campaign
The Program Management and Customer Success teams hosted various Salesforce Marketing virtual campaign events for customers every month. The business problem was that they wanted to know the product usage rate for launching tests/studies and sessions before and after the marketing campaign events. Additionally, their data sat in Salesforce and weren't in our data warehouse.
To produce quantitative analyses of product usage rates, I built ELT data pipelines to massage the data and built product metrics covering total campaign participants and their respective accounts' usage rates (monthly totals and six-month averages) before and after marketing campaign events.
I then presented insights and recommendations to the Program Management and Customer Success teams on why we saw seasonality, outliers, and spikes in the data, and what we could do in the next phase to boost product usage by hosting virtual campaign events at different times in the month.
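The before/after comparison for one account can be sketched like this (the six-month window matches the write-up above; the data shape is illustrative):

```python
from statistics import mean

def pre_post_usage(monthly_usage: dict[str, int], event_month: str, window: int = 6):
    """
    Average monthly usage for an account in the `window` months before and
    after a campaign event month (the event month itself is excluded).
    monthly_usage maps "YYYY-MM" -> tests/studies launched that month.
    """
    months = sorted(monthly_usage)
    i = months.index(event_month)
    before = [monthly_usage[m] for m in months[max(0, i - window):i]]
    after = [monthly_usage[m] for m in months[i + 1:i + 1 + window]]
    return (mean(before) if before else 0.0, mean(after) if after else 0.0)
```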
Code: here
Tools: SQL, Tableau, Snowflake, dbt, GitHub
Program Management: Jira, Confluence
Product Analytics - Marketing
Project | 04
Marketing Top of the Funnel (ToFu) Data and Dashboard Migration to Snowflake
The Marketing team needed to migrate their data sources from Periscope (AWS Redshift) to Tableau (Snowflake) because we were retiring AWS Redshift and Periscope. This posed a few challenges. First, the data migration from Redshift to Snowflake itself. Second, building an understanding of multiple marketing data sources. Third, reducing the original code (600+ lines) and refactoring it into efficient code (~400 lines) that kept the same data structure.
To drive the solution, I built a deck and presented it to senior management covering the business problems, current state, migration steps, the advantages of refactoring, and a completion timeline.
This migration created a backbone of efficient code that sped up data ingestion and load times and reduced data storage.
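A key migration safeguard is verifying that the refactored Snowflake output matches the original Redshift output. A minimal sketch of that check, assuming rows have already been extracted from both systems as dicts (names here are illustrative):

```python
def validate_migration(source_rows, target_rows, key_cols):
    """
    Compare a source (e.g., Redshift) extract against the refactored target
    (e.g., Snowflake) extract: matching row counts and identical rows keyed
    by key_cols. Returns a summary dict for quick triage.
    """
    def keyed(rows):
        return {tuple(r[k] for k in key_cols): r for r in rows}

    src, tgt = keyed(source_rows), keyed(target_rows)
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "extra_in_target": sorted(tgt.keys() - src.keys()),
        "value_mismatches": sorted(
            k for k in src.keys() & tgt.keys() if src[k] != tgt[k]
        ),
    }
```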
Deck: here
Tools: SQL, Snowflake, dbt, GitHub
Program Management: Jira, Confluence, PowerPoint
Product Analytics - Professional Services
Project | 05
Time-to-Value (TTV) First Study Launched and Observation Created for Implementation Service (IS) Users and Non-IS Users' Behaviors
The Professional Services team saw the success of customer metrics v2, v3, and v4, for which I had built iterations of product metric enhancements and PRDs. Their team offers implementation services (add-on services) on top of customers' subscriptions. Their objective was to understand product usage before and after implementation for the first test/study launched and the first observations created (see the Customer Education section above). They wanted to compare usage between the two groups, expecting that customers who purchased implementation services would launch tests/studies sooner than subscription customers who didn't.
The challenges were that implementation services data weren't available, since the data lived standalone with other vendors; I had to work with the enterprise systems and data warehouse teams to bring the data into Snowflake. Second, they wanted additional filters whose underlying data were scattered across the organization, so I had to massage the data until it was clean enough for metric calculations and visualizations.
The results came out just as Professional Services had hoped: the majority of implementation services customers launched tests/studies and created observations within 30 days of their subscription after completing implementation services. They now firmly believe this behavior will continue as long as they keep improving their curriculum.
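The headline comparison reduces to a share-within-30-days metric per group. A sketch under an assumed input shape (each record is a flag for implementation services plus days to first launch, None if never launched):

```python
def share_launched_within(records, days=30):
    """
    records: list of (is_customer: bool, days_to_first_launch: int | None),
    where is_customer flags implementation services (IS) customers.
    Returns (IS share, non-IS share) of customers launching within `days`.
    """
    def share(group):
        if not group:
            return 0.0
        hits = sum(1 for d in group if d is not None and d <= days)
        return hits / len(group)

    is_group = [d for flag, d in records if flag]
    non_is_group = [d for flag, d in records if not flag]
    return share(is_group), share(non_is_group)
```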
Code: here
Tools: SQL, Snowflake, dbt, GitHub
Program Management: Jira, Confluence
Product Analytics - Product / Data Science
Project | 06
Customer Integration - Slack
The Product and Data Science teams wanted to build customer integration usage rates covering Jira, Outlook / Google Calendar, Adobe XD, Qualtrics, and Slack. Each integration tracked events differently: some, like Adobe XD, had two events (integration turned on and off), whereas the Slack integration had ten events for tracking usage. Therefore, we picked the Slack integration as our starting point, with plans to eventually move on to the other well-defined integrations.
To identify which events were meaningful to track, I documented how I installed the Slack integration on our platform to share videos in Slack. I wrote a step-by-step guide on how to install it, turn on notifications, and share videos in Slack. While building the guide, I also learned how Slack events were tracked on the backend. After that, I picked the events that were particularly useful for measuring the Slack integration usage rate per month.
From the analyses, we learned there were two ways to share videos on the UT platform: through the Slack plug-in within the UT platform, or by copying and pasting the video link directly into a Slack channel. Customers heavily favored the Slack plug-in over the copy-and-paste option. I recommended retiring the copy-and-paste option and having customers share videos through the Slack plug-in, so that we could focus on further enhancing and polishing the plug-in.
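The monthly usage-rate calculation can be sketched as follows; the event names here are hypothetical placeholders, not the actual backend event names:

```python
# Hypothetical names standing in for the "meaningful" Slack events we selected.
MEANINGFUL_EVENTS = {"video_shared_via_plugin", "notification_enabled"}

def integration_usage_rate(events, active_accounts_by_month):
    """
    events: list of (account_id, "YYYY-MM", event_name).
    active_accounts_by_month: "YYYY-MM" -> total active accounts that month.
    Usage rate = accounts firing a meaningful event / total active accounts.
    """
    users: dict[str, set] = {}
    for account, month, name in events:
        if name in MEANINGFUL_EVENTS:
            users.setdefault(month, set()).add(account)
    return {
        month: len(users.get(month, set())) / total
        for month, total in active_accounts_by_month.items()
        if total
    }
```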
Programming Language: Python
Tools: SQL, Snowflake, dbt, GitHub, Word
Program Management: Jira, Confluence, PowerPoint
Finance Analytics - Accounting
Project | 07
Accounts Receivable (AR) Billing Invoices and Memos
The Accounting team wanted to visualize monthly trends in accounts receivable (AR) billing invoices and memos. Additionally, the numbers had to match the Oracle NetSuite system and the Finance Excel spreadsheets. They wanted the process automated rather than going to the Oracle NetSuite front end to download reports every month.
Therefore, I worked with the Finance Systems team and the data warehouse team to bring the NetSuite schema into Snowflake. Then I made sure the ELT data pipelines flowed in correctly, with numbers matching the Accounting Excel files.
In addition to building an automated pipeline from NetSuite to Snowflake, I went beyond and provided more meaningful insights by producing an aging schedule in Tableau, which is vital for the Accounting team. The Tableau version mirrors the aging schedule the Accounting team used to download from Oracle NetSuite. The Accounting team gave great feedback on the value-added deliverables:
1. Automated data pipeline from Oracle NetSuite to Snowflake
2. Produced AR billing invoice and memo trends and pivot tables by account and parent account per month in Tableau
3. Bonus: Aging schedule on Tableau
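The aging-schedule logic reduces to bucketing open invoices by days past due. A sketch using a common bucket convention (the exact bucket edges in the NetSuite report may differ):

```python
from datetime import date

# Days-past-due bucket edges; a common convention, assumed for this sketch.
AGING_BUCKETS = [(0, 30), (31, 60), (61, 90), (91, None)]

def _label(lo, hi):
    return f"{lo}-{hi}" if hi is not None else f"{lo}+"

def aging_schedule(invoices, as_of):
    """
    invoices: list of (due_date: date, open_amount: float).
    Returns open AR totals by days-past-due bucket, the same shape as a
    NetSuite-style aging report rebuilt in Tableau.
    """
    totals = {_label(lo, hi): 0.0 for lo, hi in AGING_BUCKETS}
    for due, amount in invoices:
        past_due = max(0, (as_of - due).days)
        for lo, hi in AGING_BUCKETS:
            if lo <= past_due and (hi is None or past_due <= hi):
                totals[_label(lo, hi)] += amount
                break
    return totals
```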
Software: Oracle NetSuite
Tools: SQL, Snowflake, dbt, GitHub, Word
Program Management: Jira, Confluence, PowerPoint
Finance Analytics - Credit Risk
Project | 08
Credit Portfolio Performance and Risk Dashboards
The Credit department wanted automated dashboards/reports showing whether the bank was performing, underperforming, or taking on too much or too little risk.
To address those business problems, I helped build out the bank's overall portfolio view using statistical analyses, including credit portfolio utilization rate, expected loss on risky loans (risk ratings 7, 8, and 9), net charge-offs, delinquency (1-29 days past due, 30-59 days past due, etc.), and weighted risk ratings.
Using these analyses, I built trends, tables, and rates, broken down by customer sector, bank department, and credit risk-rated loans.
These credit metrics were included in the executive reports because senior management needed to know whether the bank had been performing, underperforming, or taking on too much or too little risk, and to determine what needed to happen going forward to keep generating profits for shareholders.
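Two of the metrics above can be sketched with the standard expected-loss formulation, EL = PD x LGD x EAD, summed over the risky ratings; the data shape here is illustrative:

```python
RISKY_RATINGS = {7, 8, 9}  # per the write-up above

def expected_loss(loans):
    """
    loans: list of (risk_rating, exposure_at_default, probability_of_default,
    loss_given_default). Expected loss via EL = PD * LGD * EAD, summed over
    loans in the risky rating bands.
    """
    return sum(
        ead * pd * lgd
        for rating, ead, pd, lgd in loans
        if rating in RISKY_RATINGS
    )

def weighted_average_risk_rating(loans):
    """Portfolio risk rating weighted by exposure at default."""
    total = sum(ead for _, ead, _, _ in loans)
    return sum(rating * ead for rating, ead, _, _ in loans) / total if total else 0.0
```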
Programming Language: VBA
Tools: Access, Word
Finance Analytics - Revenue Operations
Project | 09
NetSuite Subscription Revenue QoQ - Growth and Retention Rates for Existing, Large (>100k) and Churn Customers
The CFO demanded that our company move away from Excel financial reporting. He appointed the Finance department, including the Accounting and Revenue Operations teams, to migrate all Finance reporting to NetSuite and Tableau. The business problems: first, the two systems didn't align. Second, Finance was the source of truth, but the CFO wanted to move the source of truth to NetSuite. Third, the Finance department had been reporting strictly numbers in Excel, whereas the CFO wanted visualizations of our subscription revenue, including growth and retention rates for new, existing, and churned customers.
To tackle these problems, our methodology was:
1. We produced two dashboards - A Finance source of truth and a NetSuite-based source of truth.
a) The intention was to compare the Finance source of truth with the NetSuite-based source of truth and surface data discrepancies.
2. We started reconciling NetSuite-based revenue against the Finance source of truth, using Python for data cleaning and processing and Google Sheets (SUMIFS, INDEX/MATCH, etc.) to find discrepancies per account, contract, and revenue line.
3. Once we found the discrepancies in the NetSuite system, they mostly fell into three categories:
a) company name changes (i.e. Facebook to Meta)
b) multiple IDs with the same company (i.e. a123 and b234 are both Facebook)
c) company acquired by another company (i.e. Tableau acquired by Salesforce - two different IDs to one ID)
4. Documented all of the discrepancies and sent our recommendations to Accounting and Revenue Ops for approval. Then we sent the documentation to the Enterprise Systems team to restamp opportunities, remove duplicate IDs, and update acquired companies' IDs.
5. It took us two quarters of reconciliation work to make NetSuite match the Finance source of truth.
6. Therefore, we provided a vision and roadmap going forward to prevent future reconciliation, because it was all manual, time-consuming work.
We proposed a framework: whenever we got new opportunities or made changes to current opportunities, the deal desk team had to notify us before entering new or updated info, because our automated NetSuite system needed to know what the changes were so we could account for them and update NetSuite accordingly. Otherwise, if the deal desk team didn't notify us in advance, we would have to reconcile every single quarter, and the system wouldn't truly be automated.
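The discrepancy detection described in steps 2-3 can be sketched as a keyed comparison of the two sources; field names and row shapes here are illustrative, not the actual schemas:

```python
def reconcile_revenue(finance_rows, netsuite_rows):
    """
    Each row: (account_id, company_name, revenue). Flags the discrepancy
    classes found during reconciliation: name changes on the same ID,
    revenue mismatches, and IDs present on only one side (duplicate IDs,
    acquisitions, etc.).
    """
    fin = {r[0]: r for r in finance_rows}
    ns = {r[0]: r for r in netsuite_rows}
    shared = fin.keys() & ns.keys()
    return {
        "name_changes": sorted(i for i in shared if fin[i][1] != ns[i][1]),
        "revenue_mismatches": sorted(i for i in shared if fin[i][2] != ns[i][2]),
        "only_in_finance": sorted(fin.keys() - ns.keys()),
        "only_in_netsuite": sorted(ns.keys() - fin.keys()),
    }
```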
Programming Language: Python
Software: Oracle NetSuite
Tools: Google Sheets, Excel, SQL, Snowflake, dbt, GitHub, Word, PowerPoint
Program Management: Jira, Confluence, PowerPoint
Finance Analytics - Sales Operations
Project | 10
Clari Sales Forecast Readout
The Sales Operations team had to create a monthly and quarterly Sales Forecast Readout in Excel. The task was repetitive, so the team asked us to eliminate the repetitive work entirely. To do that, I worked with Clari and the data warehouse team to use Clari's API to bring the correct schema and data into Snowflake.
I was tasked with completely transforming their Excel workflow by giving them a Tableau report with automated monthly and quarterly sales forecast numbers, so they wouldn't have to recreate the report over and over again.
The results were striking: the Sales Operations team now simply downloads this report every month and quarter, with no need to rebuild the same report from scratch.
Tools: SQL, Snowflake, dbt, GitHub, Word, PowerPoint
Healthcare Analytics
Project | 11
Automation - Batch Update Providers Info in MEDITECH Expanse with VBA and Access Database
The Healthcare industry is undergoing a major transformation. One of the business problems is how the industry stores providers' information (physicians, nurses, etc.) and maintains that data when providers change their info, such as credentials, facilities, addresses, roles, and phone numbers. Currently, Fraser Health is undergoing an eHR transformation from one system to another: MEDITECH Client Server to MEDITECH Expanse. In migrations like this, some data are lost, such as updates to providers' info.
1. I wrote:
a) a business requirement document (BRD) documenting all of the deliverables
b) a step-by-step guide
c) a workflow for user account provisioning, from e-connect to provisioning to MEDITECH to iPeople
2. I helped Fraser Health build a production VBA script and an Access database to streamline batch updates of providers' info, using iPeople Script Director to connect with MEDITECH Expanse and the Access database to navigate MEDITECH Expanse menus. The script also logged success and error statuses, along with the reasons for any failures during batch updates.
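The batch-update-with-status-logging pattern at the heart of the script can be sketched in Python (the production version was VBA driving MEDITECH Expanse menus; `update_provider` here is a hypothetical callback standing in for that interaction):

```python
def batch_update(providers, update_provider):
    """
    providers: list of dicts describing provider info updates.
    update_provider: callable applying one update (in production, VBA driving
    MEDITECH Expanse via iPeople Script Director); raises on failure.
    Logs a success/error status and a reason per record, then continues,
    mirroring the logging behavior described above.
    """
    log = []
    for p in providers:
        try:
            update_provider(p)
            log.append({"provider_id": p["provider_id"], "status": "success", "reason": ""})
        except Exception as exc:  # record the failure, keep processing the batch
            log.append({"provider_id": p["provider_id"], "status": "error", "reason": str(exc)})
    return log
```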
The results were amazing: senior leadership enjoyed my demo, which showed what was possible for automating eHR information at scale. The production scripts helped two hospitals, Eagle Ridge Hospital and Royal Columbian Hospital, batch-update providers' info. I also proposed reusing the script, with minor tweaks, for other hospitals upgrading from MEDITECH Client Server to MEDITECH Expanse.
BRD: here
Programming Language: VBA
Software: MEDITECH Client Server, MEDITECH Expanse
Tools: Access, Word
Insurance Analytics
Project | 12
Claims Text Search Dashboard - Specialty Markets
For years, Specialty Markets had claims coming in and out, yet they couldn't answer general questions about those claims, such as how much we paid customers and which categories we paid the most. The objectives were to answer questions like these and to have clear visuals of claim types and amounts.
Therefore, there was a dire need for a searchable dashboard that let claims experts run Google-like searches for specific claims (slip-and-fall, hurricane, etc.) across summary memos, FNOLs, loss adjusters' reports, and more. Before building the claims text search, I had to build a shared understanding across teams so they were on board with what we could deliver and the steps to get there.
1. I built a BRD laying out the possibilities.
2. With cross-team help and knowledge, I built an ETL data pipeline, using VBA, from unstructured data such as emails, PDFs, and summary memos into the Hive database.
3. After that, I built multiple SQL queries to pull data from Hive into Power BI and built a searchable claims text dashboard (with a search bar).
4. Before giving the demo to a broad audience, I asked claims experts to test the dashboard's functionality, views, and analyses.
5. Once I incorporated their feedback, I presented a PowerPoint, including a live demo, to 50 technical and non-technical stakeholders, covering the current state, solutions, and recommendations for the next phase.
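The search-bar behavior reduces to an all-terms match over each claim's free-text fields. A minimal sketch; the field names (`summary_memo`, `fnol`) are hypothetical, and production ran as SQL against Hive rather than in-memory Python:

```python
def search_claims(claims, query):
    """
    claims: list of dicts with 'claim_id' plus free-text fields.
    Returns the IDs of claims whose combined text contains every query term,
    case-insensitively, which is the behavior the dashboard's search bar
    exposed over the Hive-backed data.
    """
    terms = [t.lower() for t in query.split()]
    hits = []
    for claim in claims:
        text = " ".join(str(v) for k, v in claim.items() if k != "claim_id").lower()
        if all(t in text for t in terms):
            hits.append(claim["claim_id"])
    return hits
```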
Programming Language: Python
Code: here
Tools: VBA, Excel, SQL, PowerPoint
To see more or discuss possible work, let's talk >>
© 2024 by James Kong
Call
C: 604-313-6221