5 Key Takeaways From the 2026 AI/ML Conference

Phil Bernstein delivers the 2026 AI/ML Conference’s keynote address.

Artificial intelligence has been described as the “most transformational technology in human history.” It’s also been called a “societal-scale risk,” one that potentially endangers humanity’s very existence. Whether revolutionary or existentially threatening, there is no debate about one thing: AI has awesome potential.

At the 2026 AI/ML Conference, we challenged experts from throughout the AEC industry to explore that potential (and its risks), to determine how it can best be harnessed to improve not only business decisions but the built environment as a whole. While we can’t recount every lesson and insight shared or meaningful conversation had, we have put together a list of some of our top takeaways from the event.

1. AI Can Be Dangerous In Three Dimensions 

When a raccoon showed a determination to mount the roof of keynote speaker Phil Bernstein, deputy dean and professor in practice at the Yale School of Architecture, he turned to AI to help him build a slightly electrified cattle fence. While the AI platforms ChatGPT and Claude admittedly gave Bernstein some good pointers on building materials and power supplies (and a warning to keep people with pacemakers away from the system), the actual plans fell short of feasible.

“Three of (the designs) would probably have set my house on fire,” said Bernstein of the options ChatGPT churned out.

AI performance drops off sharply when thinking shifts from the conceptual to physical implementation, he pointed out.

“It’s very difficult for these tools to reason in systems, to reason in three dimensions, and reason about the relationship between parts.”

What we’re missing, Bernstein went on, is a “theory of correctness.” While the images AI produces may be pretty or provocative, when it comes to three-dimensional accuracy the technology is simply not yet dependable. And according to the professor, “I don’t think we’re anywhere near that theory of correctness.”

2. AI Scaling Requires Clear Organizational Structure and Cross-Functional Alignment

This year’s conference was kicked off with a meeting of BuiltWorlds’ AI/ML Preparedness Working Group, a closed-door, members-only session led by two guest speakers from Mortenson: Brian Nahas, director of artificial intelligence, and David Grosshuesch, senior manager of data analytics. During the discussion, there was a consensus that successful AI scaling requires clear cross-functional ownership and stronger enterprise data strategies. The group agreed it was necessary for businesses in the built environment to move beyond siloed, project-level data and advance adoption through structured governance and incremental, practical implementation.


Brian Smith (center) and Eric Cylwik (left) lead the Equipment & Robotics Specialty Research Track Working Group in a conversation about effective implementation.

3. Structured and Unstructured Data Are Both Important

It’s popular to talk about the importance of structured data in the AEC industry, where data has for so long been neglected. And as the panelists in our “Quick Wins, Lasting Change: Founders Showcasing AI’s Practical Power in Construction” discussion agreed, it is important. But unstructured data matters too, at least as it relates to the use of LLMs.

“For you to make decisions effectively, you need context from both unstructured to structured data,” said guest speaker Rohan Jawali, founder and CEO of Joist AI, who previously worked in operations, VDC and innovation for Hensel Phelps.

Jawali’s solution, he shared, was to build data ingestion pipelines that construct a knowledge graph around core object types, while also connecting the structured dataset and reconciling it with the unstructured data to “reinforce the knowledge graph.”

“If you can figure out a way to get both of these datasets completely at the right runtimes to the LLMs,” he said, “you can (achieve) solid outcomes (as it relates to) accuracy.”
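Jawali didn’t detail his implementation, but the pattern he describes can be sketched in a few lines: build graph nodes and edges from structured records, then scan unstructured documents for mentions of known objects to reinforce the graph before handing context to an LLM. The object types, IDs, and field names below are hypothetical, invented purely for illustration:

```python
# A minimal sketch (not Joist AI's actual implementation): structured
# records become graph nodes/edges, and unstructured documents are
# reconciled against them to reinforce the knowledge graph.
from collections import defaultdict

def build_graph(structured_rows):
    """Create nodes and edges from structured data (e.g., an RFI log)."""
    graph = {"nodes": {}, "edges": defaultdict(set)}
    for row in structured_rows:
        graph["nodes"][row["id"]] = {"type": row["type"], "mentions": 0}
        for linked_id in row.get("links", []):
            graph["edges"][row["id"]].add(linked_id)
    return graph

def reconcile(graph, documents):
    """Scan unstructured text for known object IDs; each match
    reinforces the corresponding node with a mention count."""
    for doc in documents:
        for node_id, node in graph["nodes"].items():
            if node_id in doc:
                node["mentions"] += 1
    return graph

def llm_context(graph, node_id):
    """Assemble both datasets' view of one object as LLM context."""
    node = graph["nodes"][node_id]
    links = sorted(graph["edges"][node_id])
    return (f"{node_id} ({node['type']}): linked to {links}, "
            f"mentioned in {node['mentions']} document(s)")

rows = [
    {"id": "RFI-101", "type": "rfi", "links": ["SPEC-03"]},
    {"id": "SPEC-03", "type": "spec_section"},
]
docs = ["Per the meeting notes, RFI-101 is blocked pending SPEC-03 review."]
graph = reconcile(build_graph(rows), docs)
print(llm_context(graph, "RFI-101"))
```

In a real pipeline the reconciliation step would use entity extraction rather than exact string matching, but the shape is the same: the structured side defines the objects, and the unstructured side supplies corroborating context delivered to the model at runtime.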

4. Physical AI and Autonomy Adoption Must Align With Core Values and Operational Reality

The closed-door Equipment & Robotics Working Group session on day one of the conference, featuring guest speakers Eric Cylwik, director of innovation for Sundt Construction, and Brian Smith, business development partner for Bedrock Robotics, tackled one of our industry’s (and really every industry’s) most persistent challenges: how do you successfully introduce a new technology to your team? In this case, the “new technology” was AI-powered and AI-enhanced robotics and autonomous equipment.

The group highlighted the importance of ensuring human oversight, prioritizing team buy-in, and running phased pilots. The latter was particularly relevant to the session’s guest speakers, who described a significant ramp-up process before Bedrock deployed autonomous machines on Sundt jobsites. The roughly two-month process included collecting data on equipment functions and getting a Bedrock operator on-site before the equipment began operating autonomously.


Stefanie Guerra (right), Tonya Custis (center), Pierce Reynoldson (left) and Jean-Pierre Trou (not pictured) share the stage during the “Implementing AI in Planning & Design: Managing Risk and Quality” panel on day two of the 2026 AI/ML Conference.

5. Embrace AI Personally To Implement It Professionally

Several years back, when AI tools were first entering the AEC space, the only question that seemed to matter was: how can we fit this into our workflows? It was solely a question of business use cases. That thinking is now somewhat shifting—or rather evolving—as we saw at this year’s AI/ML Conference. Instead of thinking of AI as something uniquely for business, a new emphasis is being placed on personal use and the power that can have to inspire business implementation.

From 2021 to 2025, the share of Americans more concerned than excited about the use of AI has grown. At least by observation, there appears to be a widespread fear of, or reluctance to embrace, the technology. Some believe that can be overcome with personal use.

During the “Cutting Through the Noise: How Will AI Influence Existing Workflows?” panel, Andrew Stanger, senior corporate counsel for Acciona, pointedly asked his guests from Procore, Document Crunch and Orion Construction whether “personal use guides your professional use?”

“Absolutely,” answered Orion Construction Senior Business Development Manager Christopher Velguth. “I think it starts with the acceptance of it.”

Velguth was initially hesitant to use AI, “a fear of the unknown,” as he described it. But it was that fear that inspired him to use it more in his personal life.

“I realized, okay, this isn’t the scary monster anymore,” explained Velguth, saying he quickly found good use for it in data analysis. “Just doing that in my personal world has translated to business.”

Over time, AI became part of Velguth’s norm, which resulted in a sort of “cross-pollination” from personal to professional use. That mindset was a common thread throughout the 2026 AI/ML Conference.

“Everything about AI is change and innovation and newness,” said The Falcon Group CTO Stefanie Guerra during the “Implementing AI in Planning & Design: Managing Risk and Quality” panel. “If you’re not … excited by those things, I think that becomes such a barrier to having actual adoption.”