Resume Flashcards
(12 cards)
Developed REST APIs for the Anti-Money Laundering product (C#, .NET Core, MongoDB), increasing data
exposure by ~20% and enhancing the compliance team’s ability to perform detailed checks.
- I noticed that certain third-party endpoints we used provided more data than we were actually using.
- I was tasked with updating our APIs to ingest the previously unconsumed data and integrate it into our system.
- Did this by updating the C# APIs and data models to hold the additional data (see the sketch after this card).
“Identified and leveraged additional data from third-party endpoints by updating REST APIs to ingest previously unutilized information. Configured C# APIs and data models to integrate this new data into the system, increasing data availability by ~20% and enabling the compliance team to perform more comprehensive and accurate anti-money laundering checks.”
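A hypothetical sketch of that kind of change. Every type, field, and value here is illustrative, not the real AML product's; the point is extending the C# data model so fields the third-party response already contained stop being dropped on ingestion:

```csharp
// Illustrative names only; not the real product's model.
using System;

public class ScreeningResult
{
    // Fields we already consumed:
    public string EntityId { get; set; } = "";
    public decimal RiskScore { get; set; }

    // Previously unconsumed fields from the third-party response, now mapped,
    // persisted (e.g., to MongoDB), and exposed through the REST API:
    public string? SanctionsListSource { get; set; }
    public DateTime? LastReviewedUtc { get; set; }
}
```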
Optimized Security Master system (C#, .NET Core, SQL Server) processing with concurrent SQL techniques,
reducing data redundancy by over 50%.
- There was a SQL stored procedure that imported security data into one of our biggest, constantly growing tables in the database.
- I altered the stored procedure to delete redundant/unnecessary data while still allowing concurrent access to the table itself.
- Used the LEAD function, partitioning by client ID and security ID, ordered by active date.
- For each group of duplicate securities (same client ID and security ID), keeps the most recent row by active date and deletes the rest (see the sketch after this card).
- Added UPDLOCK and READPAST hints for concurrency.
- READPAST: other sessions skip the rows locked via UPDLOCK instead of blocking on them.
- Capped deletion to 1,000 rows at a time.
- It is one of our most used tables, so I didn’t want to lock it for too long at a time.
“I optimized the Security Master system (C#, .NET Core, SQL Server) by improving the processing of a high-traffic database table. I modified a SQL stored procedure to remove redundant data while ensuring concurrency, reducing data redundancy by over 50%. I used the LEAD function to identify and delete duplicate securities based on client ID, security ID, and active date. To maintain performance, I implemented UPDLOCK and READPAST hints, letting other sessions skip locked rows instead of blocking on them, and capped deletions to 1,000 rows at a time to avoid long table locks.”
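A minimal sketch of that batched cleanup, written as C# driving embedded T-SQL. The table and column names (dbo.SecurityMaster, ClientId, SecurityId, ActiveDate) are assumptions, and the real logic lived in a stored procedure rather than application code:

```csharp
// Assumed schema; the real stored procedure differs.
using Microsoft.Data.SqlClient;

const string DedupBatchSql = @"
WITH Ranked AS (
    SELECT ClientId, SecurityId, ActiveDate,
           LEAD(ActiveDate) OVER (PARTITION BY ClientId, SecurityId
                                  ORDER BY ActiveDate) AS NextActiveDate
    FROM dbo.SecurityMaster WITH (UPDLOCK, READPAST) -- lock rows we touch; skip rows others hold
)
DELETE TOP (1000) FROM Ranked       -- capped batch: never hold locks on this hot table for long
WHERE NextActiveDate IS NOT NULL;   -- a newer row exists for this client/security, so delete";

using var conn = new SqlConnection("<connection string>");
conn.Open();
int deleted;
do
{
    using var cmd = new SqlCommand(DedupBatchSql, conn);
    deleted = cmd.ExecuteNonQuery();   // rows affected; repeat until no redundant rows remain
} while (deleted > 0);
```

Looping until zero rows are affected keeps each transaction short, so readers of this heavily used table are never blocked for long.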
Integrated third-party services (such as FactSet and ICE) to improve Security Master data accuracy by 10%.
- The existing batch processing system relied on ICE to provide data so we could populate our own Security Master system.
- Batch processing ran nightly.
- Provided new datasets for numerous securities.
- We started using FactSet, which provided its own dataset.
- Had to help integrate FactSet, which provided more up-to-date securities data, while still using ICE wherever FactSet didn’t cover a particular security.
- Integrated by processing their raw data from staging tables via stored procedures and merging the data into our master security table.
- We used both providers in case one didn’t have data on a particular security.
- Important caveat: both third-party services provided different data for the same security.
- Solved this by assigning a weight to each provider depending on the security’s asset class: for example, FactSet for equities, ICE for bonds (see the sketch after this card).
“I integrated FactSet and ICE third-party services into our Security Master system, improving data accuracy by ~10%. I enhanced the nightly batch processing system to merge datasets from both providers, using FactSet for equities and ICE for bonds. I developed stored procedures to process and merge raw data into the master security table and implemented an asset-class-based weighting system to address discrepancies between providers, ensuring up-to-date and reliable data for numerous securities.”
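The weighting itself ran in stored procedures against staging tables; as a language-agnostic illustration, here is a hedged C# sketch of the asset-class preference rule, with all names made up:

```csharp
// Illustrative only: shows the preference-plus-fallback rule, not the real merge.
public enum AssetClass { Equity, Bond, Other }
public enum Provider { FactSet, Ice }

public static class ProviderWeighting
{
    // Preferred provider per asset class, with fallback to whichever has data.
    public static Provider Resolve(AssetClass assetClass, bool factSetHasData, bool iceHasData)
    {
        var preferred = assetClass switch
        {
            AssetClass.Equity => Provider.FactSet, // FactSet preferred for equities
            AssetClass.Bond   => Provider.Ice,     // ICE preferred for bonds
            _                 => Provider.FactSet,
        };

        if (preferred == Provider.FactSet && factSetHasData) return Provider.FactSet;
        if (preferred == Provider.Ice && iceHasData) return Provider.Ice;
        return factSetHasData ? Provider.FactSet : Provider.Ice; // fall back to the other provider
    }
}
```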
Developed tailored PowerShell scripts for application deployment and updates by accessing client environments.
- In order to install/update our applications, I needed to provide custom PowerShell scripts to clients to deploy or update the databases on their machines.
- Made sure the SQL dacpac versions aligned with the PowerShell script versions.
- Made sure the connection string was correct.
- Made sure the scripts targeted the correct databases to update.
“I developed tailored PowerShell scripts for application deployment and updates, ensuring seamless integration with client environments. I customized scripts to deploy or update databases on client machines, keeping SQL dacpac versions aligned with the PowerShell scripts. I also verified that the connection strings were accurate and that the correct databases were included in each update.”
Led development and training for the PFRD system (WPF, C#, .NET Framework, SQL Server) – significantly
reducing client processing times.
Served as the primary point of contact for support and training, becoming the go-to resource for
guidance, troubleshooting, and instruction.
- Typically, clients had to go through our application section by section, inputting, aggregating, and validating data.
- I created an ETL system that allowed clients to simply drop their raw data files.
- The ETL automated the entirety of the app’s data aggregation and validation at the push of a button (see the sketch after this card).
“I led the development and training for the PFRD system (WPF, C#, .NET Framework, SQL Server), significantly reducing client processing times. Instead of manually inputting data section by section, I built an ETL system that allowed clients to drop raw data files, automating data aggregation and validation with a single button press. I also served as the primary point of contact for support and training, becoming the go-to resource for troubleshooting and instruction.”
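A minimal, illustrative C# sketch of the "drop a file, press a button" flow, assuming simple CSV inputs; the real PFRD file formats, aggregation, and validation rules are not shown:

```csharp
// Illustrative only: assumes CSV drops with a numeric second column.
using System;
using System.IO;
using System.Linq;

public static class PfrdEtl
{
    // One button press: process every file the client dropped into the folder.
    public static void RunAll(string dropFolder)
    {
        foreach (var file in Directory.EnumerateFiles(dropFolder, "*.csv"))
        {
            var rows = File.ReadAllLines(file)
                           .Skip(1)                        // skip header row
                           .Select(line => line.Split(','))
                           .ToList();

            // Aggregate and validate in one pass instead of walking the app
            // section by section by hand.
            var total = rows.Sum(r => decimal.Parse(r[1]));
            if (total < 0)
                throw new InvalidDataException($"Negative aggregate in {file}");

            Console.WriteLine($"{Path.GetFileName(file)}: aggregated total {total}");
        }
    }
}
```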
Created automated parsing tool which corrected recurring mistakes throughout the CTM software within 1 second.
- There were these really long, really badly coded HTML files that essentially had typos.
- Files thousands of lines long, with incorrect nouns scattered throughout.
- Coded up a console application that consumed the HTML files, parsed the text for the incorrect nouns, and used a hash table to correct them (see the sketch after this card).
“I created an automated parsing tool to address recurring mistakes in the CTM software. The tool parsed long, poorly coded HTML files containing thousands of lines with incorrect nouns. I developed a console application that consumed the HTML files, identified the typos, and used a hash table to automatically correct them. This process corrected the issues within one second, significantly improving the accuracy and efficiency of the software.”
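A minimal sketch of such a tool as a C# console app, using a Dictionary as the hash table; the correction entries below are made up, not the real product's:

```csharp
// Load each HTML file, fix known-bad nouns via a hash table lookup.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;

var corrections = new Dictionary<string, string>
{
    ["Comittee"] = "Committee",   // illustrative entries; the real map held the
    ["recieved"] = "received",    // product-specific incorrect nouns
};

var sw = Stopwatch.StartNew();
foreach (var path in args)                       // HTML files passed on the command line
{
    var text = File.ReadAllText(path);
    foreach (var (wrong, right) in corrections)  // one pass per known typo
        text = text.Replace(wrong, right);
    File.WriteAllText(path, text);
}
Console.WriteLine($"Corrected {args.Length} file(s) in {sw.ElapsedMilliseconds} ms");
```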
Managed CI/CD pipelines via TeamCity, ensuring efficient code releases and reliable deployment for
Regulatory Reporting clients.
- Whenever we had new versions of our software to give to our clients, I was in charge of deployment
- Involved merging new code into the master branch (via GitHub)
- Having TeamCity build and validate the code (points to master branch in GitHub)
- Updating or creating new TeamCity builds for each client that point to the new version
- Grabbing that newly created deployment package and installing it on our clients’ machines.
“I utilized CI/CD pipelines via TeamCity to manage code releases and deployment packages for the majority of Regulatory Reporting’s clients. I was responsible for merging new code into the master branch via GitHub, ensuring TeamCity built and validated the code. I created or updated TeamCity builds for each client to point to the new version, then grabbed the deployment packages and installed them on clients’ machines.”
Rapidly developed Excel VBA Macros to resolve legacy client system issues within 1 day, showcasing
adaptability and problem-solving prowess.
- A legacy client had an issue with his Excel macros.
- Needed to fix it in order to complete his filing before the deadline.
- I had to go into his Excel macro and fix the VB code without any background knowledge of VB.
- Had to learn VB syntax on the fly: how to debug, set breakpoints, and add items to watch lists to see how the data was being manipulated by the VB code.
- Within the same day, I managed to fix a couple of macros written in VB, repairing the data aggregation within the Excel spreadsheet, which resulted in a successful filing for the client.
Led in-house support and training of our PFRD system, becoming a go-to resource for instruction and questions
- Led hour-long screen-share sessions with people from our managed services team who used our application to complete filings on behalf of some clients.
- Main point of contact, along with co-developers, for the managed services team for any questions they had about the system.
Migrated client databases to Azure Managed Instances, ensuring consistency and reliability
- Utilized SQL Server tools to back up databases to .bacpac files.
- Utilized Azure tools to restore the .bacpac files onto the Azure instance.
- Pointed the application to the new Azure address.
- Smoke tested to ensure critical functionality was still working (see the sketch after this card).
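A hedged sketch of that smoke test, checking one critical query against the new address; the server, database, and table names are placeholders:

```csharp
// Point at the new Azure SQL Managed Instance and confirm a critical query answers.
using System;
using Microsoft.Data.SqlClient;

var azureConnStr = "Server=<instance>.<dns-zone>.database.windows.net;" +
                   "Database=<client-db>;User ID=<user>;Password=<password>;Encrypt=True;";

using var conn = new SqlConnection(azureConnStr);
conn.Open();
using var cmd = new SqlCommand("SELECT COUNT(*) FROM dbo.Filings;", conn); // hypothetical critical table
Console.WriteLine($"Smoke test passed; Filings rows: {cmd.ExecuteScalar()}");
```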
Developed products utilizing AWS S3 – storing and retrieving sensitive data
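A minimal sketch of that kind of S3 usage, assuming the AWSSDK.S3 NuGet package. Bucket and key names are placeholders, and SSE-S3 encryption here stands in for whatever controls the real products applied to sensitive data:

```csharp
// Store and retrieve an object via the AWS SDK for .NET.
using System;
using System.IO;
using Amazon.S3;
using Amazon.S3.Model;

var s3 = new AmazonS3Client();   // region/credentials resolved from environment or profile

// Store an object, encrypted at rest.
await s3.PutObjectAsync(new PutObjectRequest
{
    BucketName = "sensitive-data-bucket",           // placeholder
    Key = "clients/12345/report.json",              // placeholder
    ContentBody = "{ \"example\": true }",
    ServerSideEncryptionMethod = ServerSideEncryptionMethod.AES256,
});

// Retrieve it back.
using var response = await s3.GetObjectAsync("sensitive-data-bucket", "clients/12345/report.json");
using var reader = new StreamReader(response.ResponseStream);
Console.WriteLine(await reader.ReadToEndAsync());
```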