NRC Flashcards

(6 cards)

1
Q
  • Conducted research on metal binder jetting and powder metallurgy technology, leveraging data analysis tools such as Pandas and Power BI to analyze complex datasets, visualize trends, and generate comprehensive reports to support ongoing research and development initiatives.
A

In that role, my primary focus was on the data acquisition side of our metal binder jetting and powder metallurgy research.

A lot of the raw data was spread across different sources — lab equipment outputs, CSVs, and manually recorded logs — so I helped build a lightweight pipeline to collect, standardize, and consolidate that data for further analysis.

I used Python and Pandas to automate the extraction and transformation process — cleaning up inconsistencies, merging datasets, and structuring them in a way that made them easier for our engineers to work with.
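A minimal sketch of that kind of consolidation step. The file contents, column names, and values here are invented for illustration; the real pipeline read from lab equipment exports and logs rather than inline strings:

```python
import io
import pandas as pd

# Hypothetical raw exports: a furnace log and a manually recorded sample sheet.
furnace_csv = io.StringIO(
    "timestamp,temp_c\n"
    "2023-05-01 10:00,851.2\n"
    "2023-05-01 10:05,849.7\n"
)
samples_csv = io.StringIO(
    "Timestamp,sample_id\n"
    "2023-05-01 10:00,S-101\n"
)

furnace = pd.read_csv(furnace_csv, parse_dates=["timestamp"])
samples = pd.read_csv(samples_csv, parse_dates=["Timestamp"])

# Standardize: consistent column names and numeric types.
samples = samples.rename(columns={"Timestamp": "timestamp"})
furnace["temp_c"] = pd.to_numeric(furnace["temp_c"], errors="coerce")

# Consolidate the sources into one table for the engineers.
merged = furnace.merge(samples, on="timestamp", how="left")
print(merged)
```

The `how="left"` merge keeps every instrument reading even when no sample record exists for that timestamp, which is what you want when the manual logs are sparser than the machine output.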

In terms of analysis and presentation, I collaborated closely with the R&D engineers to understand what metrics were important to them, and helped shape the data accordingly.

I also assisted with creating visual dashboards in Power BI that helped surface trends, like how process parameters were affecting material properties over time. My part was mostly configuring gateway connections and making sure Power BI could pull from the data sources I had built.

This work was really about bridging the gap between raw research data and usable insights. It gave the team better visibility into what was happening across experiments, and helped them make more data-driven decisions during development.

2
Q
  • Designed and implemented data pipelines and developed APIs to support research initiatives, including preprocessing, integration, and analysis of large scientific datasets using Python, Java and PostgreSQL.
A

During my time with the National Research Council in the Advanced Manufacturing Division, I was working in a SCADA environment where we were collecting and analyzing large volumes of manufacturing sensor data. My role focused on enabling data-driven research by developing robust data pipelines and APIs that would make that sensor data more accessible and usable for researchers and engineers.

I primarily used Python for building ETL workflows. These pipelines would ingest raw data from our SCADA systems—then preprocess it to remove noise, align timestamps, and standardize formats. For the integration layer, I used PostgreSQL to design schemas that were optimized for querying time-series and metadata.
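A small sketch of the preprocessing step described above: drop out-of-range readings, then align jittered timestamps to a fixed grid. The sensor names, thresholds, and values are made up for illustration:

```python
import pandas as pd

# Hypothetical raw sensor readings with jittered timestamps and one glitch.
raw = pd.DataFrame(
    {
        "timestamp": pd.to_datetime(
            ["2023-05-01 10:00:01", "2023-05-01 10:00:59", "2023-05-01 10:02:02"]
        ),
        "pressure_kpa": [101.3, 9999.0, 101.5],  # 9999.0 is a sensor glitch
    }
)

# Remove physically impossible readings, then align everything to a
# 1-minute grid so series from different machines can be joined on timestamp.
clean = raw[raw["pressure_kpa"].between(0, 500)]
aligned = (
    clean.set_index("timestamp")
    .resample("1min")
    .mean()
    .interpolate()  # fill the gap left by the dropped glitch
)
print(aligned)
```

Resampling to a common grid is what makes later joins across machines cheap: every table shares the same timestamp keys.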

To support data access, I developed internal APIs that served as an abstraction layer for researchers and engineers. These APIs allowed them to query historical and real-time data without needing to interact directly with the PLCs or raw datasets.
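The shape of that abstraction layer can be sketched like this. The class, table, and column names are hypothetical, and sqlite3 stands in for PostgreSQL so the sketch is self-contained:

```python
import sqlite3

class SensorDataAPI:
    """Thin access layer so researchers query by machine and time range
    instead of writing SQL against the raw schema directly."""

    def __init__(self, conn):
        self.conn = conn

    def readings(self, machine_id, start, end):
        cur = self.conn.execute(
            "SELECT ts, value FROM readings "
            "WHERE machine_id = ? AND ts BETWEEN ? AND ? ORDER BY ts",
            (machine_id, start, end),
        )
        return cur.fetchall()

# Demo with an in-memory database and a few fake rows.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (machine_id TEXT, ts TEXT, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [
        ("press-01", "2023-05-01T10:00", 42.0),
        ("press-01", "2023-05-01T10:05", 43.5),
        ("press-02", "2023-05-01T10:00", 7.1),
    ],
)

api = SensorDataAPI(conn)
print(api.readings("press-01", "2023-05-01T00:00", "2023-05-01T23:59"))
```

The value of the layer is that researchers never see the underlying schema; if the storage changes, only the access class needs updating.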

In our setup, while Python handled most of the data ingestion and preprocessing, I used Java to build a lightweight middleware application that ran on an edge device located near the manufacturing floor.

This application provided a GUI that displayed live machine data and system status. It allowed on-site engineers to verify data integrity, see alerts, or restart polling jobs—all without needing access to the main system.

3
Q
  • Designed and implemented asynchronous event handlers to process control system alerts and sensor triggers, enabling real-time routing of critical events to downstream analytics and research dashboards.
A

With the NRC, a big part of our challenge was dealing with real-time events coming from industrial equipment—things like fault alerts, threshold breaches, and sensor anomalies.

To address that, I designed and implemented a system of asynchronous event handlers in Python. These handlers listened for events from the control system, typically triggered by signals from PLCs, and then routed those events to appropriate downstream systems.

We used libraries like asyncio and aiohttp for event processing.

Depending on the type of event, it would either:

  • Trigger a notification through Microsoft Teams,
  • Get logged in a time-series database, or
  • Be visualized in near real-time on Power BI.
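The routing logic above can be sketched with asyncio. The event shapes, type names, and sink targets are invented; in production the "sinks" were a Teams webhook, a time-series database write, and a dashboard feed rather than in-memory lists:

```python
import asyncio

# Hypothetical event routing: each event is a dict with a "type" field,
# and the handler fans events out to the right sink.
async def route_event(event, sinks):
    if event["type"] == "fault":
        sinks["notify"].append(event)      # e.g. a Teams notification
    elif event["type"] == "threshold":
        sinks["log"].append(event)         # e.g. a time-series DB write
    else:
        sinks["dashboard"].append(event)   # e.g. near-real-time visualization

async def main():
    sinks = {"notify": [], "log": [], "dashboard": []}
    events = [
        {"type": "fault", "source": "plc-3"},
        {"type": "threshold", "source": "plc-1"},
        {"type": "telemetry", "source": "plc-1"},
    ]
    # Handlers run concurrently, as they would against a live event stream.
    await asyncio.gather(*(route_event(e, sinks) for e in events))
    return sinks

sinks = asyncio.run(main())
print({k: len(v) for k, v in sinks.items()})
```

Because each handler is a coroutine, a slow sink (like an HTTP notification) doesn't block the others from processing their events.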

The Power BI dashboard was particularly useful for identifying patterns or anomalies during experimental runs.

Overall, this part of the project helped bridge the gap between raw machine-level signals and actionable insights for the research teams, and it was a great opportunity to apply both software engineering principles and domain-specific understanding of industrial systems.

4
Q
  • Contributed to the development of an automated thermal test bench, integrating a robotic arm to evaluate the structural integrity of 3D-printed metal specimens through controlled high-temperature exposure and rapid cooling.
A

That was one of the more unique and interdisciplinary projects I worked on during my time with the NRC’s Advanced Manufacturing Division. The goal was to develop an automated thermal test bench to evaluate how 3D-printed metal specimens responded to thermal stress—specifically, repeated high-temperature exposure followed by rapid cooling.

The team had mechanical engineers focused on the furnace and cooling systems, and robotic specialists who were programming the motion control for the robotic arm.

I contributed on several fronts:

  • I implemented data acquisition and logging systems that tracked temperature profiles, timing accuracy, and system status during each test cycle. This data was critical for both validating the mechanical process and feeding into later material analysis.
  • From a system architecture perspective, I helped design a modular software framework that allowed the mechanical and controls engineers to tweak hardware parameters (like temperature targets or motion sequences) through a simple command line interface, without touching code.
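A minimal sketch of that kind of command-line interface, using argparse. The parameter names and defaults here are hypothetical, not the test bench's actual configuration:

```python
import argparse

# Hypothetical CLI so engineers adjust test-bench parameters without editing code.
def build_parser():
    parser = argparse.ArgumentParser(description="Thermal test bench runner")
    parser.add_argument("--target-temp", type=float, default=900.0,
                        help="Furnace target temperature in Celsius")
    parser.add_argument("--hold-seconds", type=int, default=120,
                        help="Dwell time at target temperature")
    parser.add_argument("--cycles", type=int, default=10,
                        help="Number of heat/cool cycles to run")
    return parser

# Demo: parse an explicit argument list instead of sys.argv.
args = build_parser().parse_args(["--target-temp", "850", "--cycles", "5"])
print(args.target_temp, args.hold_seconds, args.cycles)
```

Keeping every tunable behind a flag with a sensible default meant the mechanical engineers could run variations of a test cycle without ever opening the source.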

This project really showcased how software can bridge the gap between mechanical systems and research objectives. I was the one ensuring that the whole system not only worked reliably, but also produced clean, well-structured data for downstream analysis.

5
Q
  • Worked closely with engineering teams and other stakeholders to understand data requirements, support ongoing projects, and provide technical expertise in our SCADA operations, data handling and analysis.
A

Definitely. In my role at the NRC’s Advanced Manufacturing Division, I worked very closely with engineering teams. A big part of my contribution was acting as a bridge between the data systems and the operational or research goals.

Many of the ongoing projects involved complex equipment and SCADA systems that generated a lot of sensor and control data. My job was to help stakeholders articulate what data they needed, figure out whether that data was available or could be captured, and then build the pipelines or tooling to make it accessible and meaningful.

From a technical side, I was responsible for accessing SCADA data, typically from PLCs over Modbus TCP or OPC, and building systems in Python to extract, clean, and structure that data—often using PostgreSQL as our storage backend. I’d also write scripts or small applications to automate repetitive tasks, like scheduled data pulls or transformations, depending on what each team needed.
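A sketch of the structuring step after a PLC read. The register map and scale factors below are made up; in production the raw values came from a Modbus TCP read (for example via a Modbus client library), which is stubbed out here so the sketch stands alone:

```python
# Hypothetical register map: address -> (signal name, scale factor).
REGISTER_MAP = {
    0: ("furnace_temp_c", 0.1),          # raw value is tenths of a degree
    1: ("chamber_pressure_kpa", 0.01),   # raw value is hundredths of a kPa
}

def decode_registers(raw, timestamp):
    """Turn a list of raw register values into structured rows
    ready for insertion into the storage backend."""
    rows = []
    for address, value in enumerate(raw):
        name, scale = REGISTER_MAP[address]
        rows.append({"ts": timestamp, "signal": name, "value": value * scale})
    return rows

# Stubbed read: pretend the PLC returned these two holding registers.
rows = decode_registers([8512, 10130], "2023-05-01T10:00:00")
print(rows)
```

Keeping the decode step separate from the transport means the same structuring code works whether the values arrive over Modbus TCP, OPC, or a replayed log file.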

For example, if a researcher wanted to analyze thermal cycling data from a test rig, I’d help them define the key parameters, and set up data collection routines.

What I really enjoyed was being in a position to combine my software skills with domain understanding to make sure the data wasn’t just being collected—but was actually useful. Whether it was helping someone debug a signal loss issue, or ensuring timestamp precision for later analysis, I was always working to make the system more transparent and responsive to the team’s needs.

6
Q
  • Delivered a detailed report summarizing research findings and security recommendations, supporting discussions on secure software engineering practices.
A

Sure. As part of a broader initiative to support secure software engineering practices, I delivered a report that focused on identifying some of the security risks I observed during my time working with our internal systems and codebase, laying out a clear picture of where we stood and what could be done to improve things moving forward.

One of the first issues I noticed was the use of hardcoded credentials in several scripts and configuration files. These were often embedded for convenience during testing, but they posed a serious risk if accidentally committed to version control or left in production environments. In the report, I recommended centralizing secrets management—using environment variables at the very least, and ideally implementing a secure, dedicated secrets vault that supports access control and auditing.
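The environment-variable approach can be as simple as this sketch. The variable name is hypothetical, and the `setdefault` line exists only to make the demo self-contained; in a real deployment the variable is set by the environment, never in code:

```python
import os

def get_secret(name: str) -> str:
    """Read a credential from the environment instead of hardcoding it,
    failing loudly if the deployment forgot to provide it."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Missing required secret: {name}")
    return value

# Demo only: simulate the deployment environment providing the secret.
os.environ.setdefault("NRC_DEMO_DB_PASSWORD", "example-only")
print(get_secret("NRC_DEMO_DB_PASSWORD"))
```

Failing at startup with a clear error is deliberate: a missing credential should stop the service immediately rather than surface later as a confusing connection failure.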

Another issue was that a number of internal tools and third-party libraries hadn’t been updated in quite some time. Some of these dependencies had known vulnerabilities, and even though they weren’t actively exploited, they created unnecessary risk. I highlighted this in the report and suggested setting up a routine process for reviewing dependencies—essentially building updates into the development cycle, so it doesn’t become an afterthought.

Finally, I stressed the importance of penetration testing, especially given that we operated in a zoned network environment with segmented access between different layers like the control network, corporate network, and internet-facing services. While the segmentation was good in theory, I pointed out that it hadn’t been stress-tested. So, I recommended that future planning include internal and external penetration tests—not just to check for vulnerabilities, but to simulate how an attacker might move laterally if they gained access to a less secure zone.

Overall, the report helped kick-start conversations around not just fixing individual problems, but also building more secure habits and processes into our workflows.
