
Liz Fong has worked at MacDonald-Miller Facility Solutions, a full-service, design-build mechanical contractor out of the Pacific Northwest, for more than a decade. She spent about six years as a sheet metal worker before becoming operations integration general foreman, a position she’s held for nearly eight years. Recently, she was named the Sheet Metal and Air Conditioning Contractors’ National Association’s (SMACNA’s) 2025 Innovator of the Year.
The annual award recognizes a SMACNA member who pioneers and champions innovation and improvement both across their company and the industry at large. In her role, Fong has a wide range of data- and process-related responsibilities, from managing MacDonald-Miller’s fabrication database and software utilization for its detailing department, fabrication shop and field remote entry users, to overseeing how data is generated and leveraged. Her nomination describes her commitment to excellence and progress: “Her expertise in deploying technology in real-world environments has not only established best practices but also paved the way for future advancements.”
We were fortunate enough to get a chance to sit down with Fong to learn a little bit more about what makes her such an effective innovator.
First, tell us a bit about MacDonald-Miller and your role there.
Yeah, I’ve had a couple of different titles over the years. The consistent part of it is that I’ve been working on our construction side of the business, and I’m really heavily involved in integration. A lot of that surrounds the integration of technology and how we utilize it across the company, but also the integration of the company as a whole. I look at how departments are integrated with each other; and because there’s some overlap with technology, data, etc., I really try to focus on making sure that we’re all on the same page. Depending on how we set something up, it might work better for one department and less well for another, or create different kinds of pain for different people. So, we look at that holistically to make sure we’re getting a net benefit out of it, even if the benefits and burdens aren’t quite evenly distributed.
As Innovator of the Year, can you tell us a little bit about the technologies you’ve deployed?
A lot of the tech I’m involved with is our VDC products, so a lot of Autodesk products. We recently implemented Stratus, for example. I’ve also been involved with leveraging a lot of organizational productivity products, things that facilitate better general business operations, like standard Microsoft SharePoint, Power BI and Smartsheet. I also do a fair amount of custom solutions and automations.
Some of that custom work happens via third-party platforms, like Microsoft’s Power Apps or Power Automate, and some of it is built in-house. MacDonald-Miller in general does a lot of house-built solutions. I don’t manage all of them, of course, but I do a lot on the construction side—those being used directly by our people in the field, in detailing VDC or at our shop.
For those custom solutions, are you building them personally or does MacDonald-Miller have a team of developers?
It’s a mixture, but I have built a lot of them personally. We do a lot of custom Revit add-ins. I built a lot of those. Also, the apps that support our paperless workflows at the sheet metal shop. Those are apps I built on the Power Apps platform.
Have you always had that skillset?
It’s interesting, I didn’t code at all before MacDonald-Miller. Those are skills I’ve learned on the job. It started after I began managing VDC software for our detailing department, when we already had a lot of custom AutoCAD solutions. We moved over to Revit about four years ago.
When I took over that role, I started learning code. As a detailer, I already knew what the code was supposed to do. So, when things broke, I was able to kind of parse the coding language and figure out the breakdown. It just kind of grew over time.

That’s a great segue into my next question: When you’re identifying new tech to implement, whether it’s via a custom built application or something off the shelf, how are you first identifying the problems that need to be solved?
It’s a mix of things. There’s a balance between being connected with leadership and organizational objectives. While trying to understand our goals, whether for this year or five years down the road, I really look to the end users. I’m working directly with the people in the field, at the shop or in VDC. I hear a lot of their struggles and get to understand their priorities.
Is it a struggle to reconcile potential conflicts between what management and end users want?
I have to do my best to liaise between what the end users want and need and what leadership and the organization want and need. I have to make sure that if the organization doesn’t see something as a priority but it might be really beneficial to the end user or a specific department, I elevate that issue as best I can. And vice versa. If the end user doesn’t understand why the organization or management wants to go in a specific direction, I have to do my best to explain how the decision will ultimately benefit them by benefiting the organization.
What is your vetting process for a particular solution, to figure out whether it’s a good fit?
It first depends on the size of the solution that we’re looking at. If I’m going to be putting my own stamp of approval on a solution, or I know that I’ll be heavily involved in it were it to be adopted, my first step will be to give it my own initial look. Then we do a sales demo, where other stakeholders can ask questions and get to know it better. If interest persists beyond that, we will obviously want to do more vetting, because everything works in a demo. For bigger solutions, we’ll want to do an actual trial. For smaller solutions, we can test it in a controlled environment.
But even when we do that, it still doesn’t mean the solution’s perfect. You’ve got to look at it from all angles. There’s the obvious problem-solving aspect of it, but you also have to look at how user-friendly it is. It’s great that a tool can solve a problem, but if it’s so complicated and complex that nobody can actually use it, then nobody will, and it’s not going to solve the problem. Then you have the management perspective: What does deployment look like? Do you need licenses? There are a lot of different gateways a solution has to go through.
Unlike many on the tech side of construction, you have a lot of experience in the field, having been a sheet metal worker before your current role.
I’m actually still a sheet metal worker, technically.
How does that experience and your understanding of what it is to do the actual work influence your thinking about innovation and implementation?
It’s given me a uniquely diverse perspective, I think; and that’s really contributed to the, I guess, success I’ve had in my role. I’ve worked a lot with data and modeling data, from facilitating processes to actual database management. I’ve managed our Autodesk fabrication database. So, from managing the content to facilitating how the content gets modeled and ultimately influences workflows as well as how that content gets disseminated to the end users, I’ve kind of been on all sides. I’ve benefited from the data in the shop, in detailing, and I’ve been on the database management side of things. Then also the developer side of things.
I’m able to facilitate conversations between decision makers and the end user. I’m a little bit of a translator, which has been very beneficial to our successful implementation of technology.
To that point, once you’re past the demo or controlled test phase, what are some best practices you’ve discovered in implementing tech at scale, whether large or small?
I tend to start small. I identify the priorities of a particular software solution. I ask, “Why did we get this to begin with?” A lot of times, software solutions might have a hundred different features, but we really only got one because of certain features. And even of those, maybe, 20 features, there could be only five that account for our biggest return. So, I try to focus the initial efforts and implementation there, rather than try to attack everything at once.
On top of that, we also start with a beta group for deployment, rather than have everyone hop on at once. I’ll kick a solution out to a handful of users for a project deployment, where we can actually do a little bit of testing. From an end user perspective, we want to make sure we get that feedback to ensure that it’s working. Then we can make any small tweaks we need to before then rolling it out to the larger group.
Do you have set timelines for how long those beta periods last?
My timelines have always been loose, to a degree. It’s not often something is like, “we need this right now” or “we’re going to shut down if we don’t have this rolled out.” Time does influence it to a degree, though, in terms of what I prioritize, because there might be a lot of initiatives going on at once. But, if the cost benefit of this or that particular piece of tech or workflow is such that we want to prioritize it to meet a particular deliverable deadline then we might shift priorities and focus.

Can you give us an example of when you were, maybe, sure of a technology, or at least betting on it, and it just didn’t work out?
We did have something back when we were transitioning from AutoCAD to Revit. There are just some things for which Revit doesn’t have great solutions. So, we were looking specifically for a spooling tool that would allow us to create spools, assemblies, the sheets and all the drawings. We were comparing a couple of add-ins and ended up choosing one that wasn’t necessarily the best spooling solution; but it was the best Revit add-in solution because of the other tools it had. Since we were just transitioning to Revit at the time—a time when I wasn’t yet coding in C#—we didn’t have Revit customizations. So, whereas in AutoCAD we could lean on ourselves to fill a lot of those gaps, in Revit we couldn’t.
Anyway, all of those things that we had identified as the added benefits, which led us to choose this particular add-in, they just systematically broke over the first year. We had done a lot of vetting. We tested it. Things were working for a few months, but then, after one upgrade, things started breaking or tools became so slow that nobody wanted to use them. It didn’t seem like the provider was interested at all in making the necessary improvements, whether to speed or to fixing tools that had previously worked.
We were having to basically pivot and redesign entire workflows or recreate families to work with these updates. And there were some pretty significant breaks.
And you were talking with the provider throughout this process?
So much communication was going on: technical support, tickets, etc. And there was no improvement.
What was the solution? How did you phase out the failed tech?
That was the first year of the transition. The second year, I resolved that we were just going to create our own solutions. And so over the course of that next year, we systematically built our own in-house solutions for most of the functions that we were looking to this other add-in for.
When we were ready to basically phase out the failed add-in, we went with one of the solutions we were looking at the first time around, Stratus, to fill in the remaining gaps. So, we went through another round of vetting. And really it was re-vetting, as that was our third look at Stratus—which didn’t have the VDC tools we were originally looking for; but by then we’d built our own.
When you were going through this process, what was communication with the end user like? Was there backlash from the headaches?
I mean, there was certainly a level of frustration, but they had been involved in the process. They had been involved in the testing and vetting. They knew at one point it was a viable tool, and it did work for a time. I think everybody kind of recognized that we didn’t make the decision in a vacuum. It wasn’t something thrust on them. That helped squash any blaming. We were all collectively frustrated.
As we came up with dependable solutions, everyone was really happy when we were able to just kick it to the curb completely.
We’re bumping up on time, so I have just one final question: What new or emerging innovations have you excited?
I’m excited to see, I guess, where AI goes. I think it’s a bit of a ways out, but I’ve been reading a lot about AI, trying to see where it’s going to go. Most of us are currently using tools like ChatGPT to help us with little tasks and functions here and there. It’ll be interesting to see what kinds of enterprise-level organizational solutions come out. But I’m also a little concerned.
Well, not concerned, but I’ve been thinking a lot about the long-term implications of becoming more efficient at processing copious amounts of bad data. Some of the startups I’ve talked with are focused on digesting construction documents and spitting out key points and summaries for the various stakeholders. But it has me thinking about why we have these giant, possibly overly complicated construction documents to begin with. It makes me wonder if some of these AI solutions are a band-aid for what are already bad processes.
I’d be interested to see AI tech startups work a little bit more upstream, focusing on solving the problem of how we create and disseminate quality data, rather than everyone just pushing out any and all data, even unnecessary data. If you put bad data in, you get bad results out.